
Chinese Authorities Deploy Emotion Detection Technology in Xinjiang: BBC

Jonathan Walker
Jonathan loves talking politics, economics and philosophy. He carries unique perspectives on everything, making him a rather odd mix of liberal-conservative with a streak of independent Austrian thought.
Published: June 9, 2021
People view a photo exhibition featuring images taken in China's northwestern Xinjiang region, organised by the China Photographers Association, in Beijing on March 24, 2021. (Image: NICOLAS ASFOURI/AFP via Getty Images)

Communist China has tested a camera system on Uyghurs that combines AI with facial recognition to identify an individual's emotions, the BBC reported.

A software engineer who installed such systems in police stations in the Xinjiang region revealed the information to the media outlet, saying the system was intended to make a "pre-judgment without any credible evidence."

The Xinjiang Uyghur Autonomous Region in far west China, home to the Uyghurs and other ethnicities, has been the subject of intense surveillance and mass incarceration under the Chinese Communist Party (CCP) in recent years. 

The engineer, whose identity the BBC withheld, showed photographs of five detainees on whom the government had tested the AI emotion detection system. The system is said to indicate a person's state of mind through various colors on a pie chart, with red denoting an "anxious" or "negative" state of mind.

The AI is trained to detect even the smallest changes in a person's facial expressions. Beijing uses Uyghurs as test subjects for various experiments "just like rats are used in laboratories," the man told the BBC.

‘Like a lie detector’

The software engineer explained his role in installing the systems at police stations: "We placed the emotion detection camera 3m from the subject. It is similar to a lie detector but far more advanced technology… Your wrists are locked in place by metal restraints, and [the] same applies to your ankles."

When the BBC questioned the Chinese embassy in London on the issue, the embassy stated that people in Xinjiang live "in harmony" and "enjoy a stable and peaceful life," with their personal freedoms not constrained in any way.

At a high-level virtual event at the UN on May 12, Kenneth Roth, the Executive Director of Human Rights Watch (HRW), said that "more than one million Uyghur and other Turkic Muslims" have been detained by the communist regime, which has forced them to renounce their religion and culture.

"A highly intrusive surveillance state has been established in Xinjiang to determine whom to detain," he said, adding that there has been a "shocking reduction in Muslim birth rate in the region." Compared to a slight increase in birth rates in Han Chinese areas, the Muslim birth rate has fallen by "a reported 48.74%."

An AP report from June 2020 said that the Chinese Communist Party was taking "draconian measures" to cut down the birth rates of Uyghurs and other minorities, including subjecting women to pregnancy checks, forced sterilizations, and abortions.

“Birth rates in the mostly Uighur regions of Hotan and Kashgar plunged by more than 60% from 2015 to 2018… Across the Xinjiang region, birth rates continue to plummet, falling nearly 24% last year (2019) alone — compared to just 4.2% nationwide,” Associated Press stated.

Darren Byler, from the University of Colorado, told the BBC that Uyghurs are forced to regularly provide DNA samples to local officials. They are also made to download a government app onto their smartphones that collects data from their devices, including text messages and contact lists.

High-tech persecution

This photo taken on April 22, 2021 shows police officers and a staff member riding horses as they prepare to publicise laws and government policy to nomad herders in a remote area in Altay in China's northwestern Xinjiang region. (Image: STR/AFP via Getty Images)

“Uyghur life is now about generating data… Everyone knows that the smartphone is something you have to carry with you, and if you don’t carry it you can be detained, they know that you’re being tracked by it. And they feel like there’s no escape,” he said.

On Nov. 24, 2019, the International Consortium of Investigative Journalists (ICIJ) published classified documents that exposed the inner workings of the detention camps housing Uyghurs and other minority communities.

The classified intelligence briefings revealed Beijing’s use of AI-powered policing platforms that claimed to “predict crimes” based solely on computer-generated findings.

“The system is able to amass vast amounts of intimate personal data through warrantless manual searches, facial recognition cameras, and other means to identify candidates for detention, flagging for investigation hundreds of thousands merely for using certain popular mobile phone apps,” the ICIJ said.

In December 2020, The Washington Post published an article accusing Chinese tech giant Huawei of testing software that identified Uyghurs via camera systems and sent alerts to the authorities.

Huawei worked with the facial recognition startup Megvii on the project. The AI camera system could scan faces in a crowd and estimate each individual's ethnicity, sex, and age.

John Honovich, the founder of IPVM, a Pennsylvania-based company that reviews and investigates video surveillance equipment, said that the issue was an example of how such discriminatory technology has become “totally normalized.”

“This is not one isolated company. This is systematic… A lot of thought went into making sure this ‘Uighur alarm’ works,” Honovich said.