Chinese Students Invent Coat That Makes People Invisible to AI Security Cameras
December 7, 2022

To the naked eye, it looks like any other camouflage pattern coat. But to artificial intelligence security cameras, it’s an invisibility cloak that effectively conceals the person wearing it.
By day, the coat’s customized camouflage print, designed by an algorithm, escapes detection by visible-light cameras. By night, when security cameras typically identify humans through infrared thermal imaging, thermal devices embedded in the coat emit heat at varying temperatures, presenting an unusual heat pattern that lets the coat fly under the radar.
Developed by a group of four graduate students from China’s Wuhan University, the InvisDefense coat won first prize at the “Huawei Cup,” an inaugural cybersecurity innovation contest sponsored by Chinese tech giant Huawei.
“We spent a lot of energy preparing for this, including this product’s design and development,” Wei Hui, the computer science graduate student who designed the coat’s core algorithm, told VICE World News. He said that the InvisDefense coat presents a “novel” way of circumventing AI human detection technology used by existing security cameras.
When the students tested the coat against campus security cameras, the accuracy of pedestrian detection fell by 57 per cent. The researchers said one of the main difficulties in developing the coat was striking a balance between fooling the camera and fooling the human eye.
“We had to use an algorithm to design a least conspicuous image that could render camera vision ineffective,” Wei said.
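The article does not disclose the team’s algorithm, but the description matches the broad family of adversarial-patch attacks: a pattern is optimized so a pedestrian detector’s confidence drops, while an extra penalty keeps the pattern smooth enough to pass as ordinary camouflage. The sketch below illustrates that general idea under stated assumptions, not the InvisDefense method itself: it uses PyTorch with torchvision’s pretrained Faster R-CNN standing in for a security camera’s person detector, and a total-variation penalty as a crude proxy for “least conspicuous.”

```python
# Minimal sketch of an adversarial-patch attack on a pedestrian detector.
# This is NOT the InvisDefense algorithm (which is not published in the
# article); it only illustrates the general technique: optimize a pattern so
# the detector's "person" confidence drops, while a smoothness penalty keeps
# the pattern looking like camouflage rather than noise.
import torch
import torchvision

# Pretrained COCO detector stands in for the camera's person detector.
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()
for p in detector.parameters():
    p.requires_grad_(False)

PERSON = 1  # COCO class id for "person"


def total_variation(patch):
    # Penalize abrupt pixel changes so the optimized pattern stays smooth
    # and plausibly printable on fabric (proxy for "least conspicuous").
    dh = (patch[:, 1:, :] - patch[:, :-1, :]).abs().mean()
    dw = (patch[:, :, 1:] - patch[:, :, :-1]).abs().mean()
    return dh + dw


def apply_patch(image, patch, top=200, left=150):
    # Paste the patch onto a fixed torso-sized region; a real attack would
    # randomize position, scale, rotation, and lighting for transferability.
    out = image.clone()
    _, ph, pw = patch.shape
    out[:, top:top + ph, left:left + pw] = patch
    return out


# A set of photos of pedestrians would go here; random tensors keep the
# sketch self-contained and runnable.
images = [torch.rand(3, 480, 640) for _ in range(4)]

patch = torch.rand(3, 120, 120, requires_grad=True)  # the "camouflage print"
optimizer = torch.optim.Adam([patch], lr=0.02)

for step in range(100):
    optimizer.zero_grad()
    patched = [apply_patch(img, patch.clamp(0, 1)) for img in images]
    detections = detector(patched)
    # Total confidence the detector still assigns to "person": drive it down.
    person_conf = sum(
        d["scores"][d["labels"] == PERSON].sum() for d in detections
    )
    loss = person_conf + 2.5 * total_variation(patch)
    loss.backward()
    optimizer.step()
```

In practice, attacks of this kind also randomize how the patch is rendered onto training images so the printed pattern keeps working under real-world camera angles and lighting; the fixed placement and random images here are only to keep the example short.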
China operates a notoriously sophisticated state surveillance system that is known to infringe on citizens’ privacy and target the regime’s political opponents. In 2019, the country was home to eight of the ten most surveilled cities in the world. Today, AI identification technologies are used by the government and companies alike, from identifying “suspicious” Muslims in Xinjiang to discouraging children from late-night gaming.
There has been limited pushback; in 2020, a law professor won a lawsuit against a zoo in Hangzhou for collecting visitors’ facial biometric data without their consent, in the country’s first-ever case challenging the use of facial recognition technology.
Similar privacy concerns were on the Wuhan University team’s mind when they designed the InvisDefense coat, which will be sold for about 500 yuan ($71).
“Security cameras using AI technology are everywhere. They pervade our lives,” said Wei. “Our privacy is exposed under machine vision.”
“We designed this product to counter malicious detection, to protect people’s privacy and safety in certain circumstances.”
According to Wei, the team’s future research plans include making other objects “invisible” to AI cameras—such as inanimate items and moving cars. They are also looking into circumventing other types of cameras, such as remote-sensing cameras mounted on satellites or aircraft.
But it appears that the researchers, who live in China, are not out to subvert the state’s sweeping surveillance system. In fact, according to the team, they are hoping to strengthen it.
“The fact that security cameras cannot detect the InvisDefense coat means that they are flawed,” said Wei. “We are also working on this project to stimulate the development of existing machine vision technology, because we’re basically finding loopholes.”