According to Ross Anderson, the coming shock is machines using AI to augment human perception. For millions of years, humans and their rivals perceived one another with much the same sensory machinery; now, as computers become ubiquitous, the digital trails people leave behind can be analyzed by AI systems that may infer race, intelligence, and sexual orientation from online behavior alone.
Anderson compares this to animals that evolved in a black-and-white world suddenly facing a predator that can see in color: AI's ability to analyze digital footprints undermines traditional forms of deception and camouflage.
While such analysis is currently most valuable to advertisers, Anderson raises concerns about its implications for privacy and power dynamics in society. Will authorities have exclusive access to enhanced cognitive systems, or will they be available to everyone? He envisions a future in which augmented-reality goggles are widespread and people rely on AI systems as perceptual prostheses, but he also warns of the potential for manipulation and control as governments and advertisers shape what people see and perceive.
Anderson’s main point is that integrating AI into human perception will transform how individuals and societies interact, raising important questions about trust, privacy, and power.