Kurt Gray (What to think about machines that think)

Kurt Gray explores the moral implications of thinking machines in the context of destruction and blame. He starts by acknowledging that machines have long been used as tools of destruction, with humans ultimately held morally accountable for their operation. The responsibility falls on the person who wields the machine, whether it’s a gun or a missile.

However, Gray raises a thought-provoking scenario: What if machines possessed enough autonomy and decision-making capability to choose to kill on their own? In such a situation, a thinking machine could assume the blame for its actions, leaving the humans who benefit from its destructive work morally untainted. This could potentially allow individuals to distance themselves from acts of violence and escape blame.

Gray delves into moral psychology, highlighting humanity's innate inclination to assign blame when suffering occurs. Humans seek a thinking being as the cause of suffering, typically another human, and once a clear culprit is identified, the motivation to blame anyone else diminishes. Thus, if a thinking machine were blamed for causing harm or death, there might be reduced pressure to punish the humans who benefited from its actions.

For this to work, thinking machines would need to make their own decisions and act in unpredictable ways, just as humans develop novel behavior, and with it moral responsibility, through learning. Gray notes that algorithms have already demonstrated the ability to discover things beyond their creators' intentions.

Gray acknowledges that the idea of machines taking on blame for destructive actions is unsettling, but suggests it could align with the interests of policymakers. For example, if collateral damage in military operations could be attributed to machine decisions, the political consequences for human leaders might be reduced. Additionally, the ability to "punish" or modify intelligent machines could diminish the perceived need to punish the humans in charge when autonomous machines cause accidents or make mistakes.

In conclusion, Gray highlights the complex interplay between advancing technology and human psychology, particularly our desire to assign blame. While the notion of thinking machines shouldering blame for destruction is a troubling one, it raises important ethical questions about our evolving relationship with technology.

"A gilded No is more satisfactory than a dry Yes" - Baltasar Gracián