Biometrics offer a quick and easy way to verify identity, which is why fingerprints, faceprints, and even vocal cadence have become popular tools for adding a layer of security both online and in secure facilities. But while biometric data can be useful for unlocking your smartphone or controlling access at company headquarters, it is also particularly vulnerable to cyberattacks. And if the data is compromised, it's a lot easier to change a password than your fingerprints.
There are also concerns about privacy abuses by both governments and Big Tech. For example, TikTok, a Chinese-owned company, collects faceprints and voiceprints from its users. TikTok denies sharing this data with the Chinese government, but privacy and security on the app remain a consumer concern.
TikTok is not the only social media company that collects significant amounts of user information, of course, including biometric data. For over a decade, Meta (aka Facebook) used photos uploaded by users as part of a facial recognition program. Facebook used the data to help identify and tag other users, as well as advance its own AI research. Facebook shut down the program in 2021.
Texas Attorney General Ken Paxton, not satisfied with Facebook shutting down its facial recognition program, filed a lawsuit against Meta in Texas district court seeking billions of dollars in damages for violating Texas' Capture or Use of Biometric Identifier Act (CUBI) and Deceptive Trade Practices Act (DTPA).
In its complaint, Texas argues that Facebook captured biometric data without informed consent, disclosed that data to third parties without informed consent, and failed to delete the collected information within a reasonable time, as required under the Texas law. It is seeking $25,000 for each violation of CUBI and $10,000 for each violation of the DTPA, the maximum amount allowed by law. According to the complaint, 20.5 million Texans had a Facebook account in 2021, meaning Texas is seeking billions of dollars in statutory damages.
Biometric Privacy Laws
Texas is one of only a few states with a specific biometric privacy law. Illinois was the first state to pass one, the Biometric Information Privacy Act (BIPA), in 2008. Texas followed shortly thereafter, passing CUBI in 2009. But the two laws have an important difference: BIPA gives Illinois citizens a private right of action, meaning Illinois residents can recover $1,000 per negligent violation or $5,000 per intentional or reckless violation. This provision makes BIPA one of the most consumer-friendly privacy laws in the U.S.
Unlike BIPA, CUBI does not have a private right of action, instead leaving enforcement in the hands of the Texas attorney general. Paxton’s recent lawsuit is the first time a Texas attorney general has alleged a violation of CUBI in court.
Previous Lawsuit Led to $650 Million Settlement
There is reason for Paxton to believe he has a good case. In 2016, Illinois residents filed a class-action lawsuit against Meta over its facial recognition program. Meta and the class reached a $650 million settlement in 2021, although Meta did not admit to violating any state or federal privacy laws.
In 2020, consumers also filed a federal lawsuit against TikTok for alleged violations of BIPA. That lawsuit led to a $92 million settlement.
Finally, in 2019, the Federal Trade Commission issued a $5 billion penalty against Facebook for misleading consumers about their ability to control their personal, private information. Facebook has already paid the fine. While Facebook is seeing decreased daily use, it still reported a net income of $10.29 billion in Q4 of 2021.
Should its lawsuit prove successful, Texas is likely hoping for a similarly hefty recovery.
You Don’t Have To Solve This on Your Own – Get a Lawyer’s Help
Meeting with a lawyer can help you understand your options and how to best protect your rights. Visit our attorney directory to find a lawyer near you who can help.