Chinese tech giant Huawei worked with another technology company called Megvii on a facial recognition system to track and monitor Uyghur minorities, said a new report from IPVM, a leading video surveillance information source.
The report, citing an internal company document, elaborated that the Artificial Intelligence (AI) software could send automated "Uighur alarms" to government authorities as soon as its camera systems identified members of the minority group, which the Chinese government has repeatedly been accused of persecuting.
The findings were revealed on Tuesday by IPVM, a US-based research firm focusing on video surveillance analysis. IPVM's report is based on a "confidential" document that had been hosted publicly on Huawei's own European website.
The platform later shared the document with The Washington Post, which on Tuesday became the first media organisation to report on its contents and make the findings public.
Critics of China's treatment of Uyghurs have accused the Chinese government of propagating a policy of sinicization in Xinjiang in the 21st century, calling this policy an ethnocide or a cultural genocide of Uyghurs, with some activists and human rights experts calling it a genocide.
AI to determine "ethnicity"
The report said that Huawei would provide the hardware for the tests, including cameras, servers, and cloud infrastructure, while the software would be provided by Megvii.
It detailed how Huawei tested the Megvii face recognition software on the company's video cloud infrastructure. According to the IPVM report, a feature called "Uyghur Alert" was run on Huawei's hardware as part of the trial to check whether it was compatible.
Additionally, the software was tested for its ability to determine "ethnicity" as part of its "face attribute analysis".
The system, designed to recognise people belonging to the Uyghur Muslim minority community and alert police, apparently "passed" in the test.
Huawei's response
In response to the IPVM report, however, Huawei reportedly said that the system "has not seen real-world application."
IPVM also noted that the internal Huawei report did not say where the software was deployed.
Megvii, in its response to IPVM, said that its "solutions are not designed or customised to target or label ethnic groups."
The company said that its "business is focused on the well-being and safety of individuals, not about monitoring any particular demographic groups."
However, tools that detect whether a face is Uyghur are popular with police in China, IPVM said in its report, adding that it had identified a dozen police departments deploying the technology.
The report comes at a time when China has been facing criticism from human rights activists around the world for its treatment of the Uyghur Muslim minority.
(With inputs from agencies)