The camera looks at you more than once, and it can read your heart. It sounds creepy, but it is an increasingly common reality in China. A new report by Article 19, a London-based international human rights organization, found that China has widely deployed emotion recognition technology to maintain stability. Yet in the absence of international standards and norms, what happens to the data once the authorities collect it? Emotion recognition, hailed by Chinese media as a “black technology” (cutting-edge tech), has once again raised concerns about the deterioration of human rights in China.
A camera hooked up to the “Alpha Eagle Eye” system can take one look at you and, within three seconds, conclude whether you are a suspect intent on causing trouble or committing an illegal act.
In Chinese media reports, Alpha Eagle Eye really is that powerful. It makes judgments from small movements such as blinking the eyes, shrugging the shoulders or touching the nose. Between 2014 and 2015, the Yiwu train station in Zhejiang province alone reportedly made 153 arrests of so-called “criminals” through this “AI 3.0” technology.
Shazeda Ahmed, a doctoral student at the University of California, Berkeley, who specializes in Chinese cybersecurity and policy, combed through official Chinese public documents and reports and found that local public security agencies in several provinces and municipalities have partnered with different emotion recognition AI companies or the security industry. In China, this cold AI technology has become a powerful tool in the authorities’ professed fight against crime.
“For public security departments in particular, emotion recognition technology has been used in three areas: early warning, which is the detection of possible crimes; the monitoring of key populations; and the post-arrest interrogation process. The so-called ‘key populations’ include people who have been released from prison and protesters expressing opposition to the government. Once officially identified as destabilizing, they are monitored more intensively by this emotion recognition system, to ensure that they do not come out and resist,” Ahmed said in an exclusive interview with this station.
Surveillance plus mind reading: official stability-maintenance early-warning enforcement stages “1984”?
China’s ubiquitous cameras collect information about the faces of people in public places, and Big Brother doesn’t just look at you, it reads you. Imagine smiling, energetic people shouting “Love the country, love the party” in Tiananmen Square while emotion recognition technology analyzes whether those shouting the slogans are sincere, as if reenacting the plot of George Orwell’s novel 1984.
Ahmed, who has lived in Beijing, said she was shocked that in China the data is available not only to the public security authorities.
In addition to the Xinjiang Uighurs, who have long been a key population monitored by Chinese officials, the report notes that in 2019 China’s Sichuan Police Academy Journal discussed studying the application of “micro-expression recognition technology” to strengthen security on roads into Tibet. These practices, carried out under the banner of technological law enforcement, have raised human rights concerns.
“It is not only face recognition: Chinese companies are also developing multimodal emotion recognition technology that integrates signals such as voice and heartbeat. But human emotions are complex, and all these variables may lower the accuracy. Does this really help law enforcement in a positive way?” she said with concern.
The top 10 Chinese companies in emotion recognition have “integrated multimodal emotion recognition technology” combining voice and body signals. The report estimates the potential market at more than US$14.6 billion. Ahmed believes Chinese technology may be exported through the Belt and Road Initiative, with Southeast Asian countries as potential markets.
Informing through technology: a digital “Maple Bridge Experience”
The report, titled “Emotional Entanglement,” focuses on market and application research in China. Besides suggesting that Alpha Eagle Eye of Ningbo, Zhejiang plays an important role in China’s grid-style public security surveillance system, the “Sharp Eyes” (Snow Bright) Project, it also names companies such as Shenzhen’s Kestronics, Beijing Yikai Technology, Hangzhou Zhongwei Electronics, Second Emotion Recognition Technology, RuiDu Technology, Shenzhen AnShiBao, Taikoo Computing, YunShiChuangZhi and LiWei, all of which take on stability-maintenance duties in public places such as airports, customs checkpoints and high-speed railway stations across different provinces and cities, and have become leaders of China’s security industry.
Ahmed described it as a recreation of the “Maple Bridge Experience” of the Mao era. In the digital age, however, it is not the neighborhood mothers who inform, but cameras and software equipped with emotion recognition technology.
Both Ahmed and the report’s other author, Article 19 senior researcher Vidushi Marda, emphasized that when exploring the impact of AI applications on social development, not all the problems stem from China: all of human society faces the challenges of rapidly changing new technologies and missing international norms. But a huge market like China is growing at breakneck speed, and the larger the databases companies hold, the more refined their emotion recognition algorithms become. This is a warning to all. They believe some basic ethical principles should not change.
Emotion recognition technology in China is on the verge of overdrive
“The key is still transparency, including the right to know how the data is obtained and how it is applied. Existing emotion recognition technology may still end up being applied in everyday life, but what kind of international rules do we need? In the spirit of existing international law, including freedom of expression, the right to privacy and anti-racism, I think these basic principles already apply to emotion recognition technology right now,” Marda said.
In fact, the concept of emotion recognition did not originate in China. Marda told this station that in the 1960s, when digital technology was not yet advanced, American psychologist Paul Ekman studied subtle facial expressions to identify whether a person was lying, and his theories were indeed later applied in FBI criminal investigations.
“The truth is written on the face.” This classic line from Dr. Lightman, the protagonist of the hugely popular 2009 American TV series Lie to Me, a character based on Ekman, made the show a hit on the Chinese internet that year. Around the same time, the world also began to develop applications of artificial intelligence technology. Facial recognition on cell phones, originally intended to enhance security, can be considered a milestone in biometric technology.
Over the past five years, however, China has been accelerating past everyone else, turning cold cameras into one Dr. Lightman after another. But does technology do more to protect humanity or to destroy it? That depends on the government, and India, China’s neighbor, offers an example.
Marda gave an example: “A Dutch company exports emotion recognition technology to China, and it is not at all hard to imagine that China, as a market, is eager for this kind of technological development. But I would say that in India, where I live, the police have also recently moved to apply emotion recognition technology in public places to prevent violence against women.”
She said international rule-making involves two major standards organizations: the International Telecommunication Union (ITU), of which China is a very active member, and the Institute of Electrical and Electronics Engineers (IEEE). Marda, who specializes in law governing the application of artificial intelligence, is also a member of the United Nations expert group on artificial intelligence development. In the report, she recommends that the international community ban the continued development, sale, import and export of emotion recognition technology, as it is inconsistent with fundamental international human rights principles, and that the Chinese government likewise ban its continued development, sale and transfer and enact legislation as soon as possible to provide remedies for individuals affected by the technology.
The truth is not always written on the face. Marda believes that if this development continues without proper regulatory safeguards, people who cannot change the environment of surveillance may instead change the way they express their emotions, which would be a frightening development.