One of the country’s foremost privacy laws will be put to the test in a new lawsuit involving a facial recognition company alleged to have collected billions of consumer photos without permission.
The complaint was filed against Clearview AI by the American Civil Liberties Union in Illinois, where the Biometric Information Privacy Act (BIPA) prohibits the gathering and use of facial recognition data without consent.
Clearview, a New York-based tech company that calls its product the “world’s best facial recognition technology combined with the world’s largest database of headshots,” says it has the legal right to gather and use the photos because they're public information.
Clearview gained national attention in January after a New York Times investigation documented how anyone with access to the company's mobile app could snap or upload a picture and identify an individual in the image with a high level of accuracy.
Clearview's services aren't available to the general public, and the company has said it plans to offer its product only to law enforcement officials going forward. But the ACLU's complaint says private companies and even private citizens have accessed the technology in the past. As of February, according to a report from BuzzFeed News, people associated with as many as 2,228 agencies, businesses, and organizations—including the Department of Justice, U.S. Immigration and Customs Enforcement, Macy’s, Walmart, and the NBA—had run almost 500,000 facial recognition searches using Clearview's technology.
“The company has captured these faceprints in secret, without our knowledge, much less our consent, using everything from casual selfies to photos of birthday parties, college graduations, weddings, and so much more,” Nathan Freed Wessler, an ACLU staff attorney, wrote in a blog post about the case. “That company is Clearview AI, and it will end privacy as we know it if it isn’t stopped.”
To build its database, Clearview scraped pictures of people off social media services and other websites, using them to create a “faceprint,” or template, a digital model of facial features that can be matched to other photos in an instant. When the service finds a match, it identifies the person in question by linking to websites and information collected when the photos were scraped.
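At its core, the matching process the article describes works like any face-embedding system: each photo is reduced to a numeric vector (the "faceprint"), and a new photo is identified by finding the stored vector most similar to it, along with the source link saved at scraping time. The sketch below is purely illustrative—the vectors, URLs, and threshold are invented, and Clearview's actual model is not public—but it shows the general nearest-match idea:

```python
import math

def cosine_similarity(a, b):
    # Similarity between two faceprint vectors (1.0 = identical direction).
    # Real embeddings have 128+ dimensions; 4 are used here for readability.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical database: faceprint vectors keyed by the URL each photo came from,
# mirroring how Clearview reportedly links matches back to source websites.
database = {
    "https://example.com/profile/alice": [0.11, 0.42, 0.83, 0.30],
    "https://example.com/party-photo":   [0.12, 0.40, 0.85, 0.28],
    "https://example.com/profile/bob":   [0.90, 0.10, 0.05, 0.44],
}

def best_match(query, db, threshold=0.95):
    # Return the most similar stored faceprint and its source URL,
    # or None if nothing clears the confidence threshold.
    url, score = max(((u, cosine_similarity(query, v)) for u, v in db.items()),
                     key=lambda pair: pair[1])
    return (url, score) if score >= threshold else (None, score)

url, score = best_match([0.11, 0.42, 0.83, 0.30], database)
print(url)  # the closest stored photo's source link
```

The "in an instant" speed the article mentions comes from doing this comparison with optimized vector-search indexes rather than a Python loop, but the logic is the same.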
The company says that its system is no different from search engines like Google, and that its practices constitute free speech protected by the Constitution.
“Clearview AI is a search engine that uses only publicly available images accessible on the internet,” Tor Ekeland, an attorney for Clearview, said in a statement emailed to CR by a company spokesperson. “It is absurd that the ACLU wants to censor which search engines people can use to access public information on the internet. The First Amendment forbids this.”
Facebook, LinkedIn, Twitter, and YouTube have sent cease-and-desist letters to Clearview, but it's not clear where the law stands on data scraping. Tech companies often prohibit the repurposing of data that users post on their platforms in their terms of service. But recent court cases suggest they may not have the right to stop the kind of scraping Clearview does.
The ACLU, a longtime defender of free speech rights, dismissed Clearview’s First Amendment argument.
“Clearview is as free to look at online photos as anyone with an internet connection,” Wessler said in the blog post. “But what it can’t do is capture our faceprints—uniquely identifying biometrics—from those photos without consent. That’s not speech; it’s conduct that the state of Illinois has a strong interest in regulating in order to protect its residents against abuse.”
A Law That Gives You the Right to Sue
Whether or not Clearview has the right to collect the photos, Illinois law lays out specific guidelines about how facial recognition is governed in the state. BIPA requires companies to obtain consumers' explicit consent before collecting or sharing biometric information, the technical term for body measurements and other calculations that can be used to identify people via facial recognition, fingerprint scans, and other techniques.
Since BIPA went into effect in 2008, it has proved to be one of the country’s strongest privacy laws. Facebook recently agreed to pay $550 million to settle a class-action lawsuit alleging that the facial recognition features on its social media platform violated the law.
BIPA stands out among U.S. privacy laws as one of the few that specifically address the use of facial recognition. But what sets it apart in particular is that it gives consumers the right to sue a company for privacy violations, a legal concept called a “private right of action.” Without a private right of action, consumer protection laws can typically be enforced only by state attorneys general or law enforcement agencies.
“BIPA's been a very effective statute. Because there's private enforcement, we don't have to wait for an attorney general or other public official to stop violations like Clearview's,” said Justin Brookman, CR’s director of privacy and technology policy. “We don't have a lot of privacy law in this country, but the history of those laws shows that public enforcers move too slowly to keep up with the advances in technology.”
BIPA allows for up to $5,000 in damages for each “intentional or reckless” violation of the law. Few BIPA cases have gone to trial, so it remains unclear how courts will define an individual violation. Theoretically, fines could amount to billions of dollars if a court decided that a company violated the law every time it ran a facial recognition scan against a database of Illinois residents.
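The "billions of dollars" figure is straightforward arithmetic. As a back-of-the-envelope illustration—assuming, hypothetically, that each search counted as one violation, and using the roughly 500,000 searches reported by BuzzFeed News as the count—the exposure would be:

```python
# Illustrative worst-case damages math under BIPA's per-violation cap.
# Whether each search actually counts as a separate violation is an open
# legal question, as the article notes; this is only a sketch.
PER_VIOLATION_CAP = 5_000   # dollars, for an "intentional or reckless" violation
searches = 500_000          # approximate total reported by BuzzFeed News

max_exposure = PER_VIOLATION_CAP * searches
print(f"${max_exposure:,}")  # $2,500,000,000
```

That is $2.5 billion from the reported searches alone, which is why the statute's per-violation structure gives it such teeth.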
But the ACLU isn’t asking for a cash settlement in this case.
“This lawsuit seeks an end to Clearview’s privacy- and security-destroying practices, not damages,” Vera Eidelman, an ACLU staff attorney on the Clearview case, said in an email to Consumer Reports. “Much as Illinois residents are entitled to monetary damages, we hope our lawsuit will quickly put an end to Clearview’s unlawful conduct.”
The ACLU is bringing the case on behalf of several organizations that represent survivors of sexual assault and domestic violence, undocumented immigrants, and people in other vulnerable communities. Privacy experts argue that marginalized groups are disproportionately affected by the misuse of facial recognition technology.
Tests and experiments have shown that facial recognition algorithms can be less accurate when identifying people of color, particularly women. That raises questions about how law enforcement in particular is using the technology.
“We don't expect to be watched by the government 24/7 every second of our lives, and the Supreme Court has said that we retain privacy interests even when we're in public spaces,” Brookman said. “As technology gets more sophisticated, we're going to need new laws and policies to restrict both companies but also the government from tracking our every move.”
Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2020, Consumer Reports, Inc.