TikTok may be a welcome retreat for homebound users looking to entertain themselves during the novel coronavirus pandemic — but not everyone is happy with the video app. It’s facing multiple class action lawsuits from parents of underage TikTokers and a separate complaint from advocacy groups that it’s flouting an earlier agreement with the FTC.
A class action complaint filed in Illinois federal court on Wednesday by the guardian of a 17-year-old user claims TikTok uses facial scans in connection with its filters and effects — and fails to notify people that it captures and stores their biometric data in violation of the Illinois Biometric Information Privacy Act. (Facebook in January agreed to pay more than half a billion dollars to settle a class action alleging its suggested photo tags feature violated that law.)
Molly Janik is suing both TikTok and its parent, ByteDance, which also owns TikTok’s Chinese counterpart, Douyin, and operates several other internet businesses like content platform Toutiao.
In the past two weeks, half a dozen similar suits have been filed in Illinois and California federal courts, all by different law firms.
This is not the first time TikTok has come under fire for its data practices. The FTC fined the company a record-setting $5.7 million for violating U.S. children’s privacy laws. That fine was based around the data collection methods of Musical.ly, an app that ByteDance acquired in 2017 and shut down in 2018 after merging its audience with TikTok.
Janik’s suit nods to that settlement and notes that part of the solution was to implement “a feature that scans the user’s face to determine if he or she appears to be 13 years old or younger” using an algorithm.
“Additionally, many of TikTok’s features and video effects require scanning the user’s face to superimpose filters or effects over the user’s face,” states the complaint. “For example, by scanning the user’s face, TikTok can replace the user’s face with an emoji that moves the emoji’s mouth when the user talks and blinks when the user blinks, swap a user’s face with another user’s face, or place digital stickers over the user’s face that move when the user moves.”
Janik’s attorneys from Chicago-based Cafferty Clobes Meriwether & Sprengel argue that these features violate the BIPA because the company doesn’t obtain written consent from users to collect, use or store biometric information; doesn’t notify users it’s capturing this data; doesn’t have a publicly available policy regarding the purpose for collecting this information; and is allegedly sharing the biometric information with third parties without consent.
The proposed class in Janik’s suit is defined as: “All Illinois residents who created a TikTok or musical.ly account, or who used the TikTok or musical.ly applications, when 14 to 17 years old, and their parents and/or legal guardians.” The other actions use similar definitions, although multiple don’t mention age and instead focus on any Illinois residents who used the features at issue and had their biometric data captured.
Meanwhile on Thursday, a group of advocacy organizations, led by the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, filed a complaint and request for investigation with the FTC. They claim TikTok is breaking the terms of its agreement with the agency and violating the Children’s Online Privacy Protection Act.
The groups claim that TikTok failed to destroy all personal information collected from users under 13 years of age and that the app currently has underage users whose parents have not provided consent. In fact, they argue that because TikTok doesn’t ask for contact information for a child’s parents, it has “no means of obtaining verifiable parental consent.”
In a statement to The Hollywood Reporter, a TikTok spokeswoman said, “We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users.”
In December, TikTok launched a limited version of its app called TikTok for Younger Users, which doesn’t allow sharing of content, commenting on others’ videos or messaging other users, among other restrictions.
“TikTok’s conduct shows that it is continuing to pursue growth at the expense of endangering children,” states the complaint, which is posted below. The groups are asking the FTC to seek the maximum civil penalty allowed ($41,484 per violation) and note that ByteDance was recently estimated to be worth as much as $100 billion. “If the FTC does nothing to enforce its consent decree here, other companies that have entered into consent decrees with the FTC will feel free to renege on their agreements, putting even more children at risk.”
The already fast-growing TikTok has seen an explosion in downloads since the coronavirus pandemic forced people throughout the world to shelter at home. During the month of March, when many Americans began to isolate, the number of downloads grew 51 percent year-over-year to 199 million, per estimates from Sensor Tower. TikTok has now been downloaded more than 2 billion times. Monthly usage of the app in the U.S. — TikTok’s third-largest market — hit more than 14 hours on average, according to Comscore.