
danah boyd, “moral crumple zones”

What danah boyd told us about AI and ethics by Alex Howard, reporting on a talk danah boyd gave at Georgetown University’s Center For Digital Ethics, “Bracing for Impact: AI in the Wild.” A quote from Alex’s notes:

danah gave us an important concept to consider for the rapidly expanding, emergent uses of AI, using the ways the USA regulates air travel as a useful prompt. When the ability to make decisions has shifted to a private industry without accountability, beware.

In airplanes, humans must remain in the loop, but sometimes the automated systems can get in their way, as with Boeing and the Max 8 disaster – long before the current rash of problems with the aerospace giant. danah noted that when it comes to Captain “Sully” and the Miracle on the Hudson, he had to make objections to save people’s lives.

She talked about “moral crumple zones,” citing Madeleine Elish’s evocative concept that describes “how responsibility for an action may be misattributed to a human actor who had limited control over the behavior of an automated or autonomous system.”

When there’s no ability for pilots to override systems, it results in crashes. danah says the pattern is consistent: people would have been saved if pilots could override a technical system that wasn’t up to snuff. She’s concerned that we currently don’t have structures in place to hold tech companies accountable over time for similar issues.
The question of who gets to define acceptable outcomes in policy, programs, and services is always about power and whose values are reflected in them.

It sounds like it was a great talk. I hope Georgetown is able to put it up somewhere.

The abstract for Madeleine Clare Elish’s paper, “Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction,” further articulates the concept:

Analyzing several high-profile accidents involving complex and automated socio-technical systems and the media coverage that surrounded them, I introduce the concept of a moral crumple zone to describe how responsibility for an action may be misattributed to a human actor who had limited control over the behavior of an automated or autonomous system. Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a highly complex and automated system may become simply a component–accidentally or intentionally–that bears the brunt of the moral and legal responsibilities when the overall system malfunctions.
