Since its enactment in 2008, class action litigation under Illinois’ Biometric Information Privacy Act (BIPA) has steadily grown, with high-profile cases against companies as varied as Meta, Microsoft, BNSF Railway, and Johnson & Johnson. Several other states have biometric protection legislation on the books, but Illinois’ private right of action gives its law serious bite – $1,000 for each negligent violation or $5,000 for each intentional or reckless violation, plus reasonable attorneys’ fees and costs – making it uniquely successful and a model for legislation proposed in multiple states in 2023.

While BIPA cases involving biometrics other than facial recognition have been brought — including the first case not settled before trial, in which plaintiffs were awarded a $228 million judgment against BNSF Railway for fingerprint scanning without proper consent — cases based on facial recognition or manipulation of facial images have been particularly interesting, in part because of the tension between how the lawmakers of 2008 envisioned that facial recognition would work and how the technology has actually evolved.

What Is BIPA, and What Isn’t

The plain text of BIPA defines two categories of regulated biometric data, “biometric identifiers” and “biometric information,” and explicitly excludes various other types of data from regulation.

BIPA’s definition of biometric identifiers covers retina or iris scans, fingerprints, voiceprints, and scans of hand or face geometry. But it excludes many other potential identifiers, such as photographs, physical descriptions, handwriting samples, and various types of data collected in a health care setting.

The statute further defines biometric information as information derived from any of those identifiers and used to identify an individual. Crucially, information derived from data excluded from the definition of biometric identifiers, such as photographs or medical samples, is explicitly excluded from being biometric information.

When Is a Photograph More Than a Photograph?

The dichotomy between “scans of face geometry” as regulated biometric identifiers and photographs as non-regulated data makes sense at first glance. But modern facial recognition techniques generally do not depend upon or generate a scan of face geometry; instead, they use a type of deep learning called a convolutional neural network that is trained to recognize the whole face rather than extracted geometry [1]. This has led to confusion, as use cases that surely fall under the intent of the law do not fall under its plain text. In Monroy v. Shutterfly, a Northern District of Illinois court opined that while data derived from a photograph could not be “biometric information,” it could still be a “biometric identifier.” That reading could make the “scan of … face geometry” language moot in facial recognition cases, or muddy the water further, but it is in any case not a binding opinion.

So far, no facial recognition cases have found their way to a full trial, with defendants either settling soon after class certification or, in some cases, escaping litigation for reasons unrelated to this core issue. In 2020, Meta (then Facebook) settled Patel v. Facebook, a case alleging that the company recklessly violated BIPA with its Tag Suggestions facial recognition feature, for $650 million. Apple, on the other hand, fared better with a finding that Touch ID and Face ID do not violate BIPA because the biometric data stay on the user’s device and therefore are not collected by Apple. Likewise, Microsoft won dismissal of a BIPA case based on an Azure customer, Paychex, using Microsoft’s Azure cloud platform to store biometric data; Paychex went on to settle the case for $3.4 million.

[Image: Woman using a virtual try-on app with facial recognition. Will this violate biometric protection laws?]

What’s That On Your Face?

Another category of facial software that has recently become the subject of BIPA litigation is skin care and cosmetic apps that let consumers virtually “try on” a product: a customer uploads a face photo to receive skin care recommendations or to see how a given cosmetic will look [2]. In 2022, cases were launched against Estée Lauder and Johnson & Johnson for providing these applications without appropriate notice and consent.

Virtual try-on (VTO) applications present a different challenge from facial recognition for the companies that deploy them. Unlike the facial recognition techniques discussed above, VTO applications do use measurements of face geometry to construct the modified image, in much the same way as digital special effects in movies. They may therefore fall under the strict language of the law even though the data were never collected to identify a person and may not even be suitable for that purpose.

Looking Ahead

While only Illinois’ law includes a private right of action, Texas and Washington also have biometric privacy laws, and biometric information is protected under the California Consumer Privacy Act. The ACLU estimates that 20% of states may have a biometric privacy law by the end of 2023, and many proposals are modeled after the Illinois law or ACLU model legislation that preserves the private right of action remedy. It remains to be seen how their definitions of biometric information will differ in their final forms. Meanwhile, as more novel applications find their way into BIPA litigation, the tension between what exactly is biometric data and what is a mere photograph will continue to present questions.


Biometric Privacy Laws Are Subject to Class Action

Experts at Quandary Peak Research have deep experience with privacy laws and facial recognition class action cases. Contact us today to retain a BIPA expert witness for your case.