A company that gained notoriety for selling access to billions of facial photos, many culled from social media without the knowledge of the individuals depicted, faces major new restrictions to its controversial business model.
On Monday, Clearview AI agreed to settle a 2020 lawsuit from the ACLU that accused the company of running afoul of an Illinois law banning the use of individuals’ biometric data without consent.
That law, the Biometric Information Privacy Act (BIPA), protects the privacy of Illinois residents, but the Clearview settlement is a clear blueprint for how the law can be leveraged to bolster consumer protections on the national stage.
“By requiring Clearview to comply with Illinois’ pathbreaking biometric privacy law not just in the state, but across the country, this settlement demonstrates that strong privacy laws can provide real protections against abuse,” said Nathan Freed Wessler, deputy director of the ACLU’s Speech, Privacy, and Technology Project.
“Clearview can no longer treat people’s unique biometric identifiers as an unrestricted source of profit. Other companies would be wise to take note, and other states should follow Illinois’ lead in enacting strong biometric privacy laws.”
Clearview isn’t the only company to get tangled up in the trailblazing Illinois privacy law. Last year, Facebook was ordered to pay $650 million for violating BIPA by automatically tagging people in photos with the use of facial recognition tech.
According to the terms of the Clearview settlement, which is still in the process of being finalized by the court, the company will be nationally banned from selling or giving away access to its facial recognition database to private companies and individuals.
While the settlement carves out an exception for government contractors (Clearview works with US government agencies, including Homeland Security and the FBI), the company can’t provide its software to any government contractor or to state or local government entities in Illinois for five years.
Clearview will also be forced to maintain an opt-out system to allow any Illinois residents to block their likeness from the company’s facial search results, a mechanism it must spend $50,000 to advertise online. The company must also end its controversial practice of providing free trials to police officers if those individuals don’t get approval through their departments to test the software.
The sweeping restrictions may dampen Clearview’s ability to sell access to its software in the US, but the company is also facing privacy headwinds in its business abroad. Last November, Britain’s Information Commissioner’s Office hit Clearview with a $22.6 million fine for failing to obtain consent from British residents before sweeping their photos into its massive database. Clearview has also run afoul of privacy laws in Canada, France and Australia, with some countries ordering the company to delete all data that was obtained without their residents’ consent.
In a statement, Clearview’s legal team spun the settlement as a “huge win” for the company, claiming that its business will not be impacted and that Clearview is happy to end its legal battle with the ACLU. Clearview CEO Hoan Ton-That stated that the company plans to comply with BIPA by selling its algorithm — and not access to its database — to private companies in the US.