

COOK COUNTY RECORD


Clearview targeted in new biometrics privacy class actions from ACLU, others; Suit also targets Clearview vendors


(Photo: EFF Photos from San Francisco, United States / CC BY 2.0, https://creativecommons.org/licenses/by/2.0)

CHICAGO — Facial recognition data dealer Clearview AI and two distributors of its systems have been targeted by three new class action lawsuits accusing the companies of violating Illinois' biometric privacy law, potentially placing them at risk of paying out large money judgments.

Based in New York, Clearview scrapes images from public websites to create facial recognition databases marketed nationwide to law enforcement agencies, banks and loss prevention specialists.

The most recent action was filed May 28 by the American Civil Liberties Union in partnership with Edelson P.C., a Chicago plaintiffs law firm known for filing technology law class actions. Other named plaintiffs in that action include the Chicago Alliance Against Sexual Exploitation, Sex Workers Outreach Project Chicago, Illinois State Public Interest Research Group and Mujeres Latinas en Acción.

Another Chicago law firm, Miller Shakman Levine & Feldman, filed two class actions a day earlier. The named plaintiffs are the same in both suits — Cook County residents Melissa Thornley, Deborah Benjamin-Koller and Josue Herrera. In one of the lawsuits, Clearview itself is named as a defendant, while the other takes aim at two other companies, CDW-Government and Wynndalco Enterprises, which are licensed to sell Clearview’s app and database in Illinois.

“Clearview has set out to do what many companies have intentionally avoided out of ethical concerns: create a mass database of billions of faceprints of people, including millions of Illinoisans, entirely unbeknownst to those people, and offer paid access to that database to private and governmental actors worldwide,” Edelson said in the ACLU suit. “According to news reports by February 2020, people associated with 2,228 companies, law enforcement agencies and other institutions had collectively performed nearly 500,000 searches of Clearview’s faceprint database.”

The suits describe BIPA as a statute built on informed consent, requiring companies to obtain written permission before collecting information ranging from fingerprints to face scans, and to tell people the purpose for which the data is collected and how long it will be stored. The individual plaintiffs said Clearview used their photos posted to platforms like Facebook, Instagram, Twitter, LinkedIn, YouTube and Venmo, and alleged the same happened to “millions of other Illinois residents.”

In the suit against CDW-Government and Wynndalco, plaintiffs included allegations about how Clearview customers access the database through the company’s software.

“The app user in Illinois has uploaded a photograph of an individual the user is seeking to identify,” according to the complaint. “Clearview AI’s facial recognition technologies scanned the face in the uploaded photograph and converted the facial geometries of the individual pictured in the photograph into mathematical formulas or ‘vectors;’ Clearview AI’s technologies then identified the individual in the photograph by comparing the newly-created facial geometries with facial vectors of individuals whose photographs had previously been stored and converted to biometric form in Clearview AI’s database; the Clearview AI app then displayed all photographs in the Clearview AI’s database of the matched individual to the app user.”
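Stripped of the legal framing, the process the complaint describes is ordinary vector-based face matching: the uploaded photo is reduced to a numeric faceprint, that vector is compared against every stored faceprint, and the photos behind the closest matches are returned to the app user. A minimal Python sketch of that kind of lookup follows; the embedding step, similarity measure and threshold are assumptions for illustration only, not details of Clearview's actual system.

    # Illustrative sketch only: a minimal nearest-neighbor "faceprint" lookup of the
    # kind the complaint describes. The embedding model, cosine-similarity measure
    # and threshold are hypothetical placeholders, not Clearview's implementation.
    import numpy as np

    def embed_face(photo) -> np.ndarray:
        """Stand-in for a facial-recognition model that converts the face in a
        photo into a numeric vector (a 'faceprint'). Hypothetical placeholder."""
        raise NotImplementedError("a real system would run a trained model here")

    def find_matches(query_vector: np.ndarray,
                     faceprint_db: dict,
                     threshold: float = 0.6) -> list:
        """Compare the uploaded photo's vector against every stored faceprint and
        return the IDs of photos whose vectors are sufficiently similar."""
        matches = []
        for photo_id, stored_vector in faceprint_db.items():
            # Cosine similarity between the query faceprint and a stored faceprint.
            similarity = float(np.dot(query_vector, stored_vector) /
                               (np.linalg.norm(query_vector) * np.linalg.norm(stored_vector)))
            if similarity >= threshold:
                matches.append(photo_id)
        return matches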

Though the plaintiffs said they aren’t aware of a user uploading photos of them to the app for identification purposes, the complaint asserted “BIPA prohibits a private entity in possession of biometric identifiers or biometric information, from selling, leasing, trading or otherwise profiting in Illinois from a person’s biometric identifier or biometric information.”

The ACLU and its co-plaintiffs raised concerns about the identification of their clients, many of whom guard their privacy to avoid harassment, violence or discrimination based on their histories as sex workers or survivors of domestic violence or sexual assault. They want a judge to rule Clearview violated BIPA and to force destruction of all data improperly collected.

A big payday could also be in the cards for the plaintiffs.

The individual plaintiffs, also represented by Forde & O’Meara, of Chicago, and Silver Golub & Teitell, of Stamford, Conn., seek statutory damages from Clearview, CDW-Government and Wynndalco, as well as class certification and jury trials.

Under BIPA, plaintiffs can demand defendant companies pay damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation. The law has been interpreted to define a violation as each time a biometric identifier, such as a "faceprint," is scanned, stored or distributed. Companies can also be penalized for failing to create or publicly post a policy explaining why the data is being collected, and how it is being stored, shared and ultimately destroyed.

This could leave companies accused under BIPA on the hook for many millions, if not billions, of dollars in damages. Facebook, for instance, agreed to pay $550 million to settle a class action, also led by Edelson, accusing it of violating BIPA in the way its photo tagging system operated.
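A rough illustration shows how quickly that exposure compounds. The per-violation amounts below come from the statute; the class size and violation count are hypothetical placeholders, not figures alleged in any of these suits.

    # Back-of-the-envelope illustration of why BIPA exposure scales so quickly.
    # The $1,000 and $5,000 figures are the statute's per-violation damages; the
    # class size and violation count below are hypothetical placeholders.
    def bipa_exposure(class_members: int, violations_each: int, per_violation: int) -> int:
        """Statutory damages grow linearly with class size and counted violations."""
        return class_members * violations_each * per_violation

    # A hypothetical class of one million Illinois residents, one violation apiece:
    print(bipa_exposure(1_000_000, 1, 1_000))  # 1,000,000,000 -> $1 billion (negligent)
    print(bipa_exposure(1_000_000, 1, 5_000))  # 5,000,000,000 -> $5 billion (reckless)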
