UK train stations tested CCTV cameras with emotion recognition tech

Commuters at eight train stations across the UK were surveilled with AI-powered CCTV cameras, a report from WIRED has revealed. Based on documents obtained through a Freedom of Information (FOI) request, the report stated that for the past two years, major UK train stations have been testing AI surveillance technology with CCTV cameras to alert staff to safety incidents.

According to the report, the cameras utilised object recognition, a type of machine learning that can identify items in video feeds. The documents described several possible use cases, such as detecting trespassing, counting people for crowd management, and flagging unusual behaviour like running, shouting, skateboarding and smoking.
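The documents do not say which software performed this object recognition, but as a purely illustrative sketch, this is roughly what running a CCTV frame through a generic cloud object-detection API looks like, here using Amazon Rekognition's detect_labels call (Rekognition is the service named later in the report). The file name, AWS region and thresholds are hypothetical, not drawn from the documents.

```python
import boto3

# Hypothetical sketch: submit a single CCTV frame to Amazon Rekognition's
# object-detection API (detect_labels). The region, file name and
# thresholds are illustrative, not from the WIRED report.
rekognition = boto3.client("rekognition", region_name="eu-west-2")

with open("cctv_frame.jpg", "rb") as f:
    frame_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": frame_bytes},
    MaxLabels=10,        # cap the number of labels returned
    MinConfidence=70.0,  # ignore low-confidence detections
)

# Each label names an object or activity detected in the frame,
# e.g. "Person" or "Skateboard", with a confidence score.
for label in response["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')
```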

Perhaps most worryingly, one of the use cases described is “Demographics,” under which “potentially the customer emotion metric could be used to measure satisfaction.” The documents also suggested that this data could be used to increase advertising and retail revenue. The facility was provided by Amazon’s Rekognition image-analysis system.

According to WIRED, the images were captured when people crossed a “virtual tripwire” near ticket barriers and were sent to be analysed by Rekognition, which offers both face and object analysis. Notably, the stations’ camera setup does not utilise Facial Recognition Technology (FRT).
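WIRED does not publish any of the integration code, but Rekognition’s standard face-analysis call, detect_faces with Attributes=["ALL"], returns the kind of per-face emotion and demographic estimates the documents describe. The sketch below is an illustration under those assumptions, not the stations’ actual pipeline; the file name and region are invented.

```python
import boto3

# Illustrative sketch only: Rekognition's DetectFaces API returns per-face
# attribute estimates (emotions, age range, gender) without identifying
# who the person is.
rekognition = boto3.client("rekognition", region_name="eu-west-2")

with open("barrier_frame.jpg", "rb") as f:  # hypothetical tripwire frame
    frame_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": frame_bytes},
    Attributes=["ALL"],  # request the full attribute set, incl. Emotions
)

for face in response["FaceDetails"]:
    # Emotions is a list of {"Type": ..., "Confidence": ...} entries;
    # take the highest-confidence one as a rough "customer emotion metric".
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    age = face["AgeRange"]  # {"Low": ..., "High": ...}
    print(
        f'emotion={top_emotion["Type"]} '
        f'({top_emotion["Confidence"]:.1f}%), '
        f'age={age["Low"]}-{age["High"]}'
    )
```

Note that detect_faces only describes the faces it finds; matching a face against a gallery of known identities would require a separate call such as SearchFacesByImage, which is the FRT capability the stations say they did not use.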

Gregory Butler, the CEO of data analytics and computer vision company Purple Transform, which has been working on the trials, told WIRED that the emotion recognition capability was discontinued during the tests and that no images were stored when it was active.

According to the Internet Freedom Foundation, “Human emotions do not have simple mappings to their facial expressions across individuals and especially cross-culturally. Despite it being baseless and racist, technologies like emotion detection are popular because the spread of FRT makes the acquisition of large datasets of face images possible which is what emotion detection algorithms work on.”

Similarly, researcher Vidushi Marda argues, “Emotion recognition technology is based on a legacy of problematic and discredited science and exacerbates power differentials in multiple ways. No amount of careful data protection practices can legitimise its use.”
