Clearview AI was fined more than 7.5 million pounds (US $9.4 million) by the UK Information Commissioner's Office (ICO) for collecting people's images from the internet and social media sites without their knowledge or consent in order to create a global online database that could be used for facial recognition by law enforcement.
The ICO also ordered the company to stop scraping publicly available data on UK citizens from the internet and to delete any data that had already been scraped.
Clearview AI's practices were the subject of a joint investigation with the Office of the Australian Information Commissioner that ended in November.
The ICO had proposed a £17 million (then $22.6 million) fine at the time, suggesting that, as with the British Airways and Marriott fine reductions, making representations to the UK regulator pays off. The official notices of enforcement and monetary penalties will be released later this week.
Clearview AI's service allows customers, such as the police, to upload a person's photo to the company's app and search its database of over 20 billion images for a match. The app then displays a list of images with similar characteristics to the photo, along with a link to the websites where the images were obtained.
Although the company no longer provides services to organizations in the United Kingdom, many of the images in the database are likely to be of British citizens, according to the ICO, and are accessible to customers in other countries. As a result, their personal information is being collected and sold without their knowledge or consent.
Clearview AI "not only enables identification of those people but effectively monitors their behavior and offers it as a commercial service," according to UK Information Commissioner John Edwards. "This is inexcusable."
"People expect that their personal information will be respected, regardless of where their data is used," Edwards continued. "That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity."
The ICO found that Clearview AI violated UK data protection laws by failing to use citizens' data in a transparent, fair, and lawful manner. The company also failed to meet the higher data protection standards required for biometric data, which the European Union's General Data Protection Regulation (GDPR) classifies as "special category data," and did not have a process in place to prevent the data from being kept indefinitely.
When people asked if their information was in the database, Clearview AI allegedly asked for more personal information, including photos, to check. This tactic, according to the ICO, may have served as a deterrent to people who wanted to object to their data being collected and used.
Clearview AI claimed that its technology and intentions had been "misinterpreted," and that it had done nothing wrong.
"The decision to impose any fine is incorrect as a matter of law," Lee Wolosky, a partner at law firm Jenner & Block and a spokesman for Clearview AI, said in a statement. "Clearview AI is not subject to the ICO's jurisdiction, and Clearview AI does no business in the U.K. at this time."
Clearview AI was fined 20 million euros (then $22 million) by the Italian regulator Garante in February for GDPR violations. The CNIL in France has also threatened to fine the company for its use of data from French citizens.
The Swedish Data Protection Authority fined the Swedish Police Authority 2.5 million Swedish kronor (then $300,000) in 2021 for using the technology illegally.
By Flexi Team