Researchers in Europe and the United States say the heavy-handed surveillance system pioneered in Xinjiang has been deployed in other parts of China, and that big-data “predictive warning” tactics have become a buzzword in the investigative and law-enforcement propaganda of China’s public security departments, raising contentious questions about their utility and ethics.
Surveillance spending across China rises 19-fold in a decade
The Chinese government’s web of communications surveillance and monitoring equipment in Xinjiang, aimed primarily at ethnic minorities such as the Uighurs, has drawn global attention. Researchers in the United States and Europe have found that similar surveillance programs are in wide and varied use in many other parts of China.
In “Minority Report,” a film based on the 1956 Philip K. Dick short story of the same name, police work with humans gifted with precognition to catch “criminals” before they commit their crimes, eliminating felonies from society altogether. Researchers say this practice, known as “predictive policing,” is being implemented, with the help of surveillance technology and big-data analytics, by public security and government agencies across China.
A recent study co-authored by Jessica Batke, a former U.S. State Department analyst and senior editor of the online magazine ChinaFile, and Mareike Ohlberg, a senior fellow in the Asia Program of the German Marshall Fund, examines the growth of China’s surveillance sector from 2004 to 2020. The study combed through more than 76,000 procurement announcements for surveillance equipment and services issued by government agencies across China during that period, and found that the number of such announcements increased nearly 19-fold from 2010 to 2019.
The study said 65.8 percent of surveillance purchases were made by the public security system, local government departments procured 15.8 percent of surveillance equipment and services, and the rest included local political and legal departments, urban management, stability maintenance units, local party branches and others.
The study also said that in 2019, one-third of China’s county and municipal government agencies (998) purchased additional surveillance equipment of some type. In the past five years alone, the surveillance campaign known as Operation Xueliang has cost more than 14 billion yuan ($2.1 billion), a figure that does not include spending on other surveillance programs.
Ohlberg told VOA that while it is impossible to say whether public security officials across China are actively learning from Xinjiang’s heavy-handed surveillance and law enforcement practices, their research shows that “surveillance programs in Xinjiang and many other parts of China are very similar in terms of their wording and the logic used to explain them.”
Outside of Xinjiang, local governments across China deploy an eclectic mix of surveillance strategies. Among them, the practice of predictive policing stands out. The goal of this law-enforcement philosophy is to let investigators anticipate and stop crimes before they occur.
Big Data Predictive Early Warning, Minority Report to Become a Reality in China?
As early as 2016, Chinese President Xi Jinping gave instructions to “strengthen and innovate social governance,” with a special emphasis on making social governance more intelligent and a call to “improve the ability to predict and prevent various risks.”
In 2018, Chinese Minister of Public Security Zhao Kezhi laid out plans for building “smart policing” and proposed a “big data strategy for public security,” saying that big data is “a new growth point for generating combat power” in public security work. He called on all regions to “focus on building smart policing” and to “focus on improving capabilities for prediction and early warning, precision strikes, and dynamic management.”
Ohlberg, who uses the Chinese name Ma Xiaoyue, presented the research she conducted with Batke in a webinar at the University of California, San Diego this week. She said a tender document from the public security department in Harbin, Heilongjiang province, appears to reveal a plan for China’s public security organs to deploy big-data predictive warning strategies.
The tender, published in 2017 by the Harbin Public Security Bureau’s Xiangfang Branch with a budget of RMB 14.65 million, showed that the local public security department had entered people “involved in terrorism and violence” into a “key personnel database.” According to the document, a “supervised learning” method then “uses these known key persons as training samples and trains a classification or prediction model to classify and predict all persons and find potential key persons involved in terrorism, explosions, etc.”
The document refers to this big-data analysis function as a “prediction module for persons involved in terrorism and riots.”
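The procedure the tender describes is a standard supervised-learning pipeline: labeled examples train a classifier, which then scores everyone else. As a minimal sketch of that general idea (all features, data and function names below are invented for illustration; a real system would use far richer features and models):

```python
# Toy nearest-centroid classifier illustrating "known key persons as training
# samples -> classification model -> score the rest". Everything here is a
# hypothetical stand-in; no real feature or dataset is implied.

def centroid(rows):
    """Mean feature vector of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train(positives, negatives):
    """'Train' by storing the centroid of each labeled class."""
    return centroid(positives), centroid(negatives)

def predict(model, x):
    """Flag x if it sits closer to the positive-class centroid."""
    pos_c, neg_c = model
    return distance(x, pos_c) < distance(x, neg_c)

# Invented toy features (e.g. normalized counts of flagged behaviors).
positives = [[0.9, 0.8], [0.8, 0.9]]   # labeled "key persons"
negatives = [[0.1, 0.2], [0.2, 0.1]]   # everyone else in the sample
model = train(positives, negatives)
print(predict(model, [0.85, 0.9]))     # scoring an unlabeled person
```

The sketch also shows why such a model can only ever reproduce its labels: whoever was put into the “key personnel database” defines what the classifier learns to flag.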
Limited role in punishing crime, mainly as a tool to maintain stability
Predictive policing is part of the “smart security” and “smart public security” social-control systems China promotes through big-data analysis, cloud computing and artificial intelligence tools. The Chinese government says the goal of improving public security’s “predictive warning and prevention” capabilities is to increase the public’s sense of security. But many analysts question whether predictive policing actually improves public safety.
A 2020 article in NAVEIÑ REET: Nordic Journal of Law & Social Research (NNJLSR), titled “Predictive Policing in China: An Authoritarian Dream of Public Security,” concludes that China’s predictive policing still faces the challenge of genuinely reducing crime rates, while the authorities’ real targets of “predictive policing” are dissidents, petitioners and some vulnerable groups.
Daniel Sprick, a researcher and lecturer at the Department of Chinese Law and Culture at the University of Cologne in Germany and author of the paper, told the Voice of America that crime statistics in China are often “skewed and biased,” making it difficult to determine whether predictive policing is helping society to improve security.
Sprick’s research focuses on China’s criminal law and justice system reform. He does not completely dismiss the role of China’s big data predictive warning strategies in crime prevention. For example, public security authorities can use “big data” to prevent telecom fraud.
Trained on big data, AI models for anti-fraud prediction and early warning can map in real time the relationships between people in a web of events, identify likely victims, and proactively push warning messages to them. Chinese media campaigns often carry reports of police intercepting scam victims just before they go to the bank to transfer money to fraudsters.
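The warning flow described above can be pictured as: score an event with a pre-trained model, and push an alert to the suspected victim when the score crosses a threshold. A minimal sketch, in which the model, features, threshold and names are all invented placeholders:

```python
# Hypothetical anti-fraud early-warning flow. The "model" is a toy weighted
# sum; a deployed system would use a trained classifier over call and
# transaction data.

ALERT_THRESHOLD = 0.7

def fraud_score(event):
    """Toy stand-in for a trained model: weighted sum of invented features."""
    weights = {"unknown_caller": 0.4, "asks_for_transfer": 0.5, "urgency": 0.2}
    return sum(w for key, w in weights.items() if event.get(key))

def maybe_warn(event, send):
    """Push a warning to the suspected victim if the score is high enough."""
    score = fraud_score(event)
    if score >= ALERT_THRESHOLD:
        send(event["victim"], f"Possible scam in progress (score {score:.1f})")
        return True
    return False

alerts = []
maybe_warn(
    {"victim": "user_42", "unknown_caller": True, "asks_for_transfer": True},
    lambda who, msg: alerts.append((who, msg)),  # stand-in for an SMS push
)
```

Unlike street-crime prediction, every input here is already digital, which is why analysts see this as the setting where the approach is most plausible.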
The Security Research Institute of the China Academy of Information and Communications Technology released a compilation of cases on preventing and managing telecom network fraud during the epidemic in April 2020, citing 51 successful anti-fraud cases across the country. According to the report, a series of crackdowns by public security officials led to a “significant drop in telecom fraud for many consecutive weeks” during the epidemic.
However, Sprick of the University of Cologne said predictive policing may have some use in telecom fraud, a crime that primarily involves large amounts of online communication data, but once policing moves from online to offline, artificial intelligence predictive warnings are heavily biased and cannot weave an accurate picture of the crime.
“It’s a completely different story when it comes to street crime and crimes that simply don’t exist in the cyber world. You have to feed clumps of data collected on street corners into the machine. That’s hard, because the generation of that data is always a product of bias; even those who program the algorithms are biased … because they’re using pre-existing data and are more or less influenced by it.”
Sprick cites an expression from the data analytics industry that describes policing judgments made from inferior data as “garbage in, garbage out.” “That’s definitely a problem in China,” he said.
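The “garbage in, garbage out” point can be made concrete with a toy example (the districts, numbers and rates below are invented): if past policing recorded one area’s incidents at a higher rate, a frequency-based model “learns” to flag that area even when the true underlying rates are identical.

```python
# Toy illustration of garbage in, garbage out: biased data collection, not
# reality, determines what the model predicts. All figures are invented.

from collections import Counter

# Two districts with EQUAL true incident counts but unequal recording rates
# (e.g. one district is patrolled far more heavily).
true_incidents = {"district_a": 100, "district_b": 100}
recording_rate = {"district_a": 0.9, "district_b": 0.3}

records = Counter()
for district, n in true_incidents.items():
    records[district] = int(n * recording_rate[district])

def predicted_hotspot(records):
    """The 'model': rank districts by recorded (not true) incidents."""
    return records.most_common(1)[0][0]

# district_a is flagged purely because its incidents were recorded more often.
print(predicted_hotspot(records))
```

Any classifier trained on such records inherits the collection bias, and feeding its own predictions back into patrol decisions only amplifies it.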
Sprick stressed that the Chinese government’s main concern appears to be not how to use big-data technology to solve real problems, but how to burnish the public security sector’s image, and with it the regime’s legitimacy, by heavily publicizing its investment in security technology.
“The use of predictive policing demonstrates that they [the government] are doing their best to produce results, to give the public a sense of security and a feeling that policing is effective, using the most modern means available on the market. So this predictive policing is very much the show in front of the curtain,” Sprick said. “What’s really going on backstage? We don’t know.”
Experts criticize predictive policing practices as unethical
Predictive policing has also been widely criticized on moral and ethical grounds. Ann Cavoukian, a data privacy expert and executive director of the Global Privacy & Security by Design Centre in Canada, told Voice of America that predictive policing is “intolerable.”
“Predictive policing is full of false alarms,” she said. “It’s not a way to accurately detect anything.”
“It’s completely unethical because it’s not based on any reality; it’s just making assumptions about certain behaviors and then applying them to people,” Cavoukian said. “But you can’t do that. You need evidence, you need a rational basis … When you have a rational basis, you take it to a judge. If you can convince the judge, you can get a search warrant that allows you to gather information. Those are steps that must be taken. Predictive policing doesn’t do that.”
The use of artificial intelligence in policing and investigative work is ethically controversial in the West as well. Civil rights activists argue that AI algorithms readily absorb bias during training, leading machines to treat minority individuals differently and making them easier targets for law enforcement.
Sprick said he did not want to frame the ethics of predictive policing as a Western-Chinese dichotomy. He did, however, point to what China’s justice system lacks: police are held far less accountable than in Western democracies, whether for machine error or human error.
“China lacks a civil society that can put enough pressure on the government to force it to hold the police accountable,” he said. “Of course, the absence of a civil society that can stand up to the regime is an institutional problem in China.”
“That’s why China’s predictive policing more or less produces more false alarms that don’t lead to the kinds of scandals that would occur here [in the West].”
Sprick said, “There have been a lot of false alarms in predictive policing experiments in the U.S. as well, reigniting the whole issue of racial bias. But the U.S. has a strong civil society, with groups like the National Association for the Advancement of Colored People (NAACP) front and center providing support. China has no such association, certainly none strong enough to support those most likely to be victims of such false alarms, those who are already marginalized … such as migrant workers, drug users and street vendors. It’s not that the Chinese don’t care; it’s that China lacks this built-in system of accountability.”
Such reports have become commonplace in recent years as Chinese police have expanded communications surveillance and intensified crackdowns on petitioners and dissidents around important domestic meetings and international events such as the G20 summit. Sprick said that from the perspective of the authorities’ “stability maintenance” approach, predictive policing may indeed be an “effective” tool.