TikTok denies violating Texas child safety law

TikTok disputes allegations in the Texas lawsuit that it does not do enough to protect the privacy of minors and give parents sufficient control over their children’s use of the platform.

Texas Attorney General Ken Paxton filed a lawsuit against TikTok on Thursday, October 3, claiming the social media giant violated a new state law, the Securing Children Online through Parental Empowerment (SCOPE) Act, which is intended to protect children online from the misuse of minors’ personal data.

At the heart of Paxton’s complaint is the allegation that TikTok’s current parental control features do not meet the requirements of the SCOPE Act.

“We strongly disagree with these claims,” a TikTok spokesperson said in an emailed statement to Newsweek.

“In fact, we provide robust protections for youth and parents, including Family Pairing, all of which are publicly available,” the statement said. “We stand by the protections we offer families.”

Newsweek contacted Attorney General Ken Paxton’s office for comment via email.

Paxton argues that the Family Pairing system is inadequate. The Texas lawsuit claims that TikTok does not provide “commercially reasonable methods” for parents to verify their identities, as required by the law. Additionally, the lawsuit criticizes the requirement that minors consent to pairing, which potentially poses a barrier to parental supervision.

Paxton further claims that TikTok engages in the unlawful sharing, disclosure, and selling of the “personal identity information of known minors” to various third parties, including advertisers and search engines. According to the lawsuit, this data was shared without obtaining the consent of verified parents, a direct violation of the provisions of the SCOPE Act.

The SCOPE Act, passed in 2023, took effect in part on September 1, 2024. The law introduces requirements for “digital service providers” – a broad term that includes websites, apps and software that collect personal data – particularly those that enable social interaction and content sharing.

A group of children using smartphones in a school hallway. TikTok has denied allegations from Texas that it does not adequately protect minors on its platform.

lakschmiprasad S/Getty Images

Paxton’s lawsuit claims that TikTok “collect[s], save[s], and process[es]” personal identification information about minors when a minor interacts with TikTok, adding that the data “includes users’ date of birth, email address, phone number and device settings such as device type, language setting and country setting, as well as data about a user’s interaction with TikTok, e.g. videos viewed, liked or shared, followed accounts, comments, content created, video captions, sounds and hashtags.”

The law also requires these platforms to obtain verifiable parental consent before sharing, disclosing, or selling a minor’s personal information. Additionally, these platforms must provide parents with tools to manage their children’s privacy and account settings, something Paxton claims TikTok does not do.

“I will continue to hold TikTok and other Big Tech companies accountable for exploiting Texas children and failing to prioritize the online safety and privacy of minors,” Attorney General Ken Paxton said in an official statement on October 3.

“Texas law requires social media companies to take steps to protect children online and requires them to provide parents with tools to do the same. TikTok and other social media companies cannot ignore their obligations under Texas law.”

Paxton’s lawsuit follows a separate legal challenge to the SCOPE Act. In August, just days before the law took effect, U.S. District Judge Robert Pitman issued a last-minute partial injunction, blocking enforcement of SCOPE’s “monitoring and filtering” requirements related to filtering harmful content from minors’ feeds.

The judge expressed concerns about possible violations of the right to freedom of expression online, particularly given the broad language used to define harmful content, including terms such as “promoting,” “glorifying,” and “cultivating.” However, other aspects of the SCOPE Act remain in effect, such as those related to data sharing and parental control tools.
