The Facial Recognition System Amazon Sells to Cops Can Now Detect ‘Fear’

August 13, 2019 | By Janus Rose

Amazon has faced public outrage for providing cloud services to the U.S. government, including law enforcement agencies that conduct mass raids and separate families at the southern border. Now, Amazon Web Services (AWS) has rolled out more terrifying features for its cloud-based facial recognition system—including the ability to detect fear.

“Amazon Rekognition provides a comprehensive set of face detection, analysis, and recognition features for image and video analysis,” a blog post announcing the new features reads. “Face analysis generates metadata about detected faces in the form of gender, age range, emotions,” and other attributes such as whether the subject is smiling.

Emotion recognition is a facial analysis technique that has been marketed by private companies like Affectiva, Kairos, and Amazon. It works by training a machine learning system to look for certain features on a detected face which indicate emotional content. For example, a raised brow could indicate concern or bewilderment, while a downturned mouth could show feelings of repulsion.

The AWS post reveals that Amazon has updated the range of detectable emotions for Rekognition’s face analysis to include “Fear,” adding to a list of seven other emotional states: “Happy,” “Sad,” “Angry,” “Surprised,” “Disgusted,” “Calm,” and “Confused.” While emotion recognition systems are not new, activists say they are especially harmful in the hands of government agencies like Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP).
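For developers, these emotion labels surface as ordinary fields in an API response. Below is a minimal sketch, assuming the boto3 SDK, configured AWS credentials, and a hypothetical local image file named photo.jpg, of how a client might request face analysis from Rekognition and read back the per-emotion confidence scores; it is illustrative only, not a depiction of any particular deployment.

```python
import boto3

# Minimal sketch: call Rekognition's DetectFaces API with full attributes
# and print the emotion labels ("FEAR", "HAPPY", etc.) returned for each face.
# Assumes AWS credentials are configured and "photo.jpg" exists locally.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request emotions, age range, etc., not just bounding boxes
    )

for face in response["FaceDetails"]:
    # Each detected face carries a list of emotion labels with confidence scores.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"]}: {emotion["Confidence"]:.1f}%')
```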

Last year, Amazon pitched its Rekognition system to ICE, triggering widespread backlash from human rights advocates and its own employees. In July, researchers discovered that ICE used a different facial recognition system to search through driver’s license databases in more than a dozen U.S. states.

“Amazon provides the technological backbone for the brutal deportation and detention machine that is already terrorizing immigrant communities,” said Audrey Sasson, the Executive Director of Jews For Racial and Economic Justice, in an email to Motherboard. “[A]nd now Amazon is giving ICE tools to use the terror the agency already inflicts to help agents round people up and put them in concentration camps.”

The harmful nature of facial recognition and analysis in the hands of law enforcement has caused some cities to rethink whether the technology can be deployed ethically. San Francisco, CA, and Somerville, MA, have banned municipal use of face recognition, and a similar measure is being considered in Cambridge, MA, this fall.