The facial recognition software company Clearview AI says a class action lawsuit against it should be tossed because the company stopped doing the things it is accused of before the lawsuit was even filed.
Clearview AI petitioned Illinois’ Cook County Circuit Court to dismiss the case Oct. 7, arguing, among other things, that it voluntarily changed where, how and for whom it collects biometric data so that its methods comply with the Illinois Biometric Information Privacy Act (BIPA).
The company was hit with a class action lawsuit in May by the American Civil Liberties Union, which accused it of violating the Illinois law by scraping, or capturing, more than 3 billion facial images from photographs posted on various websites without their owners’ consent and running them through its facial recognition software.
The software maps the faces, creating biometric data — measurements, angles, distinctive features — that is stored in a database. The end product has been described as a search engine for faces that is useful to law enforcement agencies, among others.
Clearview AI insists it adjusted its operations to stop collecting images from Illinois residents before the ACLU filed its class action lawsuit.
It also stopped running facial recognition scans on images collected from sources connected to Illinois and “has cancelled all contracts with private entities, including all entities in Illinois, and now licenses its technology only to government entities,” the company’s petition says.
The Illinois Biometric Information Privacy Act prohibits private entities from collecting biometric data from people without getting their informed, written consent first.
Those that collect it lawfully are required to have a written policy for the retention and destruction of that information and are barred from disclosing it without written consent, and from profiting from disclosure even if consent is obtained.
Because the only relief the ACLU’s class action lawsuit sought was a court order stopping Clearview from violating the Illinois biometric privacy law, the entire case is now moot, the company argues.
Lawyers for Clearview also argued that the Illinois Biometric Information Privacy Act cannot be applied to activity that takes place outside Illinois, and that all of Clearview’s data collection takes place in New York.
The defendants also claim the act violates the First Amendment of the U.S. Constitution and the Illinois constitution, both of which, the company says, “protect the collection and use of public photographs that appear on the Internet.”
The ACLU filed the class action lawsuit along with co-plaintiffs the Chicago Alliance Against Sexual Exploitation, the Sex Workers Outreach Project, the Illinois State Public Interest Research Group and Mujeres Latinas en Acción.
The motion to dismiss will be discussed at a hearing scheduled for Oct. 16, the court papers say.
The ACLU’s class action lawsuit is not the only one Clearview AI Inc. is defending itself against.
In March, two men filed a class action lawsuit against the company in federal court in California, alleging it scraped more than 3 billion images from the likes of Facebook, Instagram, Twitter and Google without the consent of their users and used its facial recognition software to create biometric information and sell it to third-party entities.
Those plaintiffs say Clearview AI violated, among other things, the California Consumer Privacy Act of 2018, which requires businesses that collect personal information to inform consumers what information is being gathered and why, before or at the time of the data collection.
That same month, an Illinois resident filed a federal class action lawsuit against Clearview AI over alleged violations of the same Illinois Biometric Information Privacy Act as is cited in the ACLU case.
The federal class action lawsuit was filed in U.S. District Court in New York by a woman who claims Clearview scraped her image from her Facebook account without her consent.
Have you ever had your image scraped from a website or been subjected to facial recognition software without your knowledge or consent? Tell us about it in the comment section below.
The American Civil Liberties Union and proposed Class Members are represented by Jay Edelson, Benjamin H. Richman, David I. Mindell and J. Eli Wade-Scott of Edelson PC; Rebecca K. Glenberg, Karen Sheley and Juan Caballero of the Roger Baldwin Foundation of ACLU Inc.; and Nathan Freed Wessler and Vera Eidelman of the American Civil Liberties Union Foundation.
The Clearview AI Facial Recognition Software Class Action Lawsuit is American Civil Liberties Union, et al. v. Clearview AI Inc., Case No. 2020-CH-04353, in the Circuit Court of Cook County, Illinois.
Read About More Class Action Lawsuits & Class Action Settlements:
Macy’s Faces Privacy Class Action Over Alleged Use of Clearview Facial Recognition Software
Noom Website Trackers Breach California Privacy Law, Class Action Lawsuit Claims
Uber Sexual Assault, Lyft Driver Rideshare App Lawsuit Investigation
3 thoughts on “Clearview AI Wants Facial Recognition Software Class Action Lawsuit Dismissed”
I believe third parties and companies that employ workers and have “security” entry rules that take pictures of employees are selling those images and have exploited them for various reasons and various “systems” efforts with foreign and domestic parties.
An example is the security company contracted with State Farm Insurance.
I also believe McLean County and perhaps the State of Illinois have been using biometric images, data and other items, systems and artifacts (biometric) to track people and integrate with other systems.
I hope I am wrong, but Salesforce, Taylor Communications, State Farm and third parties do not honor privacy; they buy and sell data to be “innovators” and surveil people, calling it “risk and compliance.”
It’s implementing social scoring like China, and I believe they could be involved as well.
Add me please!