TikTok fined US$5.7m for allegedly violating child privacy law

Video social networking app TikTok has agreed to pay US$5.7 million to settle US Federal Trade Commission (FTC) allegations that it illegally collected personal information from children. The violations were committed by the now-defunct Musical.ly app, which was folded into TikTok after it was acquired by Chinese media firm ByteDance for around US$1 billion in 2017.

Filed by the Department of Justice on behalf of the Commission, the FTC's complaint alleges that TikTok violated the Children’s Online Privacy Protection Act (COPPA), which requires that websites and online services directed to children obtain parental consent before collecting personal information from children under the age of 13. According to the complaint, the app also violated the COPPA Rule when it failed to delete personal information at the request of parents.

In addition to the monetary payment, the settlement also requires the app’s operators to comply with COPPA going forward and to take offline all videos made by children under the age of 13.

Responding to the settlement agreement, TikTok said its priority is to create a safe and welcoming experience for all users and it has been committed to creating measures to further protect its user community. These include tools for parents to protect their teens and for users to enable additional privacy settings.

In conjunction with the settlement agreement and in working with the FTC, TikTok said in a press statement that it has implemented changes to accommodate younger US users in a limited, separate app experience that introduces additional safety and privacy protections designed specifically for this audience. Beginning today, this additional app experience allows TikTok to split users into age-appropriate TikTok environments, in line with FTC guidance for mixed audience apps.

The new environment for younger users does not permit the sharing of personal information, and it puts extensive limitations on content and user interaction. Both current and new TikTok users will be directed to the age-appropriate app experience, according to TikTok.

"We care deeply about the safety and privacy of our users. This is an ongoing commitment, and we are continuing to expand and evolve our protective measures in support of this. We’re also working to bring our privacy and safety settings front and center for our users," the company added.

Previously, the Musical.ly app enabled users to create short videos lip-syncing to music and share those videos with other users. To register for the app, users were required to provide an email address, phone number, username, first and last name, a short biography, and a profile picture. More than 200 million users had downloaded the app worldwide since 2014, of which 65 million accounts were registered in the US.

In addition to creating and sharing videos, the app allowed users to interact with other users by commenting on their videos and sending direct messages. User accounts were public by default, which meant that a child’s profile bio, username, picture, and videos could be seen by other users. According to the complaint, while the app allowed users to change their default setting from public to private so that only approved users could follow them, users’ profile pictures and bios remained public, and users could still send them direct messages.

FTC chairman Joe Simons said the operators of Musical.ly - now known as TikTok - knew many children were using the app but still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13.

“This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law," he added.