Australians Protest Face Recognition Technology Applied In Pandemic Control, Here's Why
JAKARTA - Australia's two most populous states are piloting facial recognition software that will allow police to check on people at home during COVID-19 quarantine. The programme expands on trials that have proved controversial with much of the country's population.
Little-known tech company Genvis Pty Ltd said on a website for its software that New South Wales (NSW) and Victoria, home to Sydney, Melbourne and more than half of Australia's 25 million residents, were testing its facial recognition products. Genvis said the trials were being conducted on a voluntary basis.
The Perth, Western Australia-based startup developed the software in 2020 with WA state police to help enforce movement restrictions during the pandemic. It also hopes to sell its services overseas.
The state of South Australia began piloting similar, non-Genvis technology last month, prompting warnings from privacy advocates around the world about the potential for excessive surveillance. The involvement of New South Wales and Victoria, which had not previously disclosed that they were piloting facial recognition technology, may reinforce those concerns.
NSW Premier Gladys Berejiklian said in an email that the state was "close to starting some home quarantine options for returning Australians", without directly responding to questions about the Genvis facial recognition software. NSW Police referred questions to the state premier.
Victoria Police referred inquiries to the Victorian Health Department, which did not respond to requests for comment.
Under the system being piloted, people respond to random check-in requests by taking 'selfies' at their designated home quarantine address. If the software, which also collects location data, cannot verify the image against its location "signature", police can follow up with a site visit to confirm the person's whereabouts.
While the technology has been used in WA since November 2020, it is now being promoted as a tool to allow the country to reopen its borders and end a system, in place since the start of the pandemic, that required international arrivals to spend two weeks quarantined in hotels under police guard.
Beyond the pandemic, police forces have expressed interest in using facial recognition software, prompting a backlash from rights groups over its potential to target minorities.
This facial recognition technology has also been used in countries such as China. However, no other democracy has reportedly considered its use in connection with coronavirus containment procedures.
Genvis Chief Executive Kirstin Butcher declined to comment on the trials beyond the disclosures on the product's website, but spoke generally about the role of the technology.
"You can't have home quarantine without compliance checks, if you want to keep the community safe," she said in a telephone interview. "You can't do physical compliance checks at the scale needed to support a (social and economic) reopening plan, so technology has to be used."
But human rights defenders warn that the technology may be inaccurate, and could open the door for law enforcement agencies to use people's data for other purposes in the absence of specific laws stopping them.
"I am disturbed not only by its use here, but also by the fact that this is an example of how this kind of technology can creep into our private lives," said Toby Walsh, a professor of Artificial Intelligence at the University of NSW.
Walsh questioned the reliability of facial recognition technology in general, which he said could be hacked to provide fake location reports.
"Even if it works here ... it validates the idea that facial recognition is a good thing," he said. "Where does it end?"
The Western Australian government said it bars police from using data collected by COVID-related software for non-COVID matters. WA Police say they have put 97,000 people through home quarantine using facial recognition without incident.
"The law must prevent quarantine monitoring systems from being used for other purposes," said Edward Santow, a former Australian Commissioner for Human Rights who now leads an artificial intelligence ethics project at the University of Technology, Sydney.
"Facial recognition technology may seem like a convenient way to monitor people in quarantine, but ... if something goes wrong with this technology, the risk of harm is high," said Santow.