People who use TikTok are getting checks worth up to $167 over data privacy violations. Google and Snapchat could be next.

This week, TikTok users who created videos on the app before September 30, 2021, began receiving payments ranging from $27.84 to $167.04 as part of a $92 million class-action data privacy settlement with the social media platform.

The largest checks went to Illinois residents. The state has a strict biometric data law, and TikTok was sued for violating it by collecting and using facial recognition data about users without their consent.

The plaintiffs in the TikTok case brought a variety of claims under both state and federal law to maximize the number of people who could qualify for a payout. “There is no comparable federal law. But that’s not stopping the lawsuit, which asserted all these different claims, including violations of privacy and deceptive practices,” says Katrina Carroll, lead attorney and founding partner at Lynch Carpenter LLP.

The settlement covers up to 89 million people.

Long plagued by criticism over user privacy, TikTok has come under fire both for unauthorized in-app purchases and for collecting and storing the personal information of minors.

TikTok’s Chinese parent company, ByteDance, agreed to the settlement to resolve claims that it violated Illinois’ biometric data law. It is just the latest tech company to come under fire for the practice.

In May, Facebook began paying out a $650 million Illinois settlement of its own, sending checks and electronic payments of about $397 each to roughly 1.4 million users. That lawsuit alleged Facebook collected facial recognition data from its users without clear consent, using it to suggest friends to tag in photos and to notify users when they appeared in new posts.

Privacy lawsuits like these are expected to increase. A judge recently approved a $100 million settlement against Google, with about 420,000 Illinois residents set to receive roughly $150 each.

In August, some Snapchat users received notices inviting them to submit claims in a $35 million settlement of their own, the latest in a string of similar biometric privacy lawsuits in recent years against companies such as Pret A Manger and Shutterfly.

Facial recognition features on social media can have complicated consequences. For instance, a system could learn to recognize a particular person’s emotions, such as anger or fear, based on photos they’ve been tagged in.

One example is Clearview AI, a New York-based software company. Its website claims it has scraped more than 20 billion facial images from sites like Facebook, YouTube and Venmo, and that law enforcement agencies use the database to accurately identify suspects and to reduce violence, theft and fraud.

In a recent court settlement, Clearview agreed to stop selling access to its database to most private individuals and businesses in the United States. The company is also barred from selling access to state and local government agencies in Illinois, including law enforcement, for the next five years.

The same month, the U.K.’s data protection agency fined Clearview the equivalent of $8.66 million. France and Italy’s data protection agencies have each fined the company 20 million euros (about $19.91 million) within the last year.

Businesses like Clearview have experts concerned. In a May interview with CNBC Make It, Matthew Kugler, a privacy law professor at Northwestern University, said these types of businesses could eliminate our sense of anonymity.

“With facial recognition, everyone knows what you’re doing every second of the day,” Kugler said. Easily accessible facial recognition data could make it simpler for people to harass their local barista, or could jeopardize the lives and safety of domestic violence victims, sex workers and people in witness protection programs, he added.

A 2019 study Kugler authored found that 70% of participants were uncomfortable with companies using facial recognition data to track their location and serve them personalized ads.

Three states, Washington, Texas and Illinois, have laws that limit the collection of biometric data. People in those states, for example, can’t access certain face-filter features on Instagram or Facebook.

Similar laws are set to go into effect in 2023 in California, Colorado, Connecticut, Utah and Virginia.
