Passengers on trains across the UK were unknowingly monitored by AI cameras that recorded their emotions, such as happiness or sadness, using Amazon Rekognition software. This surveillance was part of a trial conducted by Network Rail, which began in 2022 at major stations like London Euston, Glasgow, Leeds, and Reading. The system also collected demographic information, including gender and age range, ostensibly to measure passenger satisfaction and optimize advertising and retail strategies.
The trial, which used AI to address issues such as trespassing and overcrowding, has drawn criticism from civil liberties groups. Big Brother Watch obtained documents revealing that the system relied on widely discredited emotion recognition technology, raising concerns about invasion of privacy. The group has lodged a formal complaint with the Information Commissioner, criticizing Network Rail for deploying the technology without public consent and for potentially exposing personal data to advertisers.
Network Rail defended its use of advanced technologies, emphasizing its commitment to passenger safety and compliance with legal standards. A spokesperson said that all technology deployments are coordinated with law enforcement and adhere to relevant legislation. The emotional and demographic data collection has nonetheless been discontinued, reflecting the controversy and privacy concerns the trial provoked.
This incident underscores the broader debate on the use of AI surveillance in public spaces. While technology can enhance safety, the lack of transparency and public discussion regarding its application highlights the need for careful consideration of privacy implications and ethical practices in surveillance.