JAKARTA - Clearview AI is expanding sales of its facial recognition software to companies, after previously selling mainly to police. The plan invites scrutiny over how the startup leverages the billions of photos it scrapes from social media profiles.

The expansion, which Clearview presented on Wednesday, May 24, at the Montgomery Summit investor conference in California, could be significant for the company. It also fuels an emerging debate over the ethics of using disputed data to build artificial intelligence systems such as facial recognition.

Clearview's use of publicly available photos to train its tools has earned it high marks for accuracy. But the UK and Italian governments have fined Clearview for violating privacy laws by collecting images online without consent, and the company this month settled with US rights activists over similar allegations.

Clearview's software has been especially helpful to police in identifying people through social media images, but that business is under threat from investigations into allegations that the company violated privacy regulations.

A settlement with the American Civil Liberties Union bars Clearview from providing its social media search capabilities to corporate clients.

Instead of comparing faces against online photos, the new private sector offering matches people against photo IDs and other data that clients collect with the subject's consent. It is intended to verify identity for access to physical or digital spaces.

Vaale, an app-based lending startup in Colombia, said it had adopted Clearview to match selfies with user-uploaded ID photos.

Vaale will save about 20% in costs and gain accuracy and speed by replacing Amazon.com Inc's Rekognition service, said Vaale Chief Executive Santiago Tobón, as quoted by Reuters.

"We can't have duplicate accounts and we have to avoid scams," he said. "Without facial recognition, we can't get Vaale to work."

Amazon declined to comment on the report.

Clearview AI CEO Hoan Ton-That said US companies selling visitor management systems to schools had also signed up for its service.

He said customers' photo databases are retained for as long as they want and are neither shared with others nor used to train Clearview's AI.

But the face-matching software Clearview sells to companies was trained on photos scraped from social media. That diverse collection of public images, the company argues, reduces the racial bias and other shortcomings that affect rival systems constrained by smaller data sets.

"Why not have something more accurate that prevents any errors or issues?" said Ton-That.

Nathan Freed Wessler, an ACLU attorney involved in the organization's case against Clearview, said using improperly obtained data is an inappropriate way to develop less biased algorithms.

"Regulators and others should have the power to force companies to shut down algorithms that benefit from disputed data," Wessler said, noting that the recent settlement did not include such provisions for reasons he could not disclose.

"It's an important deterrent," he said. "When a company chooses to ignore legal protections when collecting data, it must run the risk of being held accountable."
