Brand safety concerns surround TikTok as online child predators remain on platform

TikTok has landed itself in a sticky situation again for hosting hundreds of sexual comments targeted at young children. The discovery was made after an investigation by the BBC.

While the BBC noted that the majority of the reported comments were deleted within 24 hours, it found that the users who posted them were able to remain on the platform, despite breaching TikTok's child safety policies. Marketing has reached out to TikTok for comment.

In a statement to Adweek, a TikTok spokesperson shared that the company uses a mix of artificial intelligence and human moderators to review videos and accounts. To improve accuracy and efficiency, it conducts internal training and implements enhancements, such as attaching videos next to comments and adding text classifiers, to help moderators recognise offensive behaviour. The spokesperson added that penalties range from restricting certain features to banning account access, depending on severity and frequency.

And the company pointed to safety features that have been added since the app’s debut, including the ability to restrict who can comment on posts and to filter keywords from comments. It also now gives parents the means of placing usage time limits on the app for their kids and setting their accounts to restricted viewing mode, which prevents them from seeing age-inappropriate content.

As stated on its website, TikTok's policies prohibit users from using "posts or private messages to harass underage users" and from posting "any sexually explicit content featuring minors or content that sexually exploits minors."

Additionally, any content about online dating with minors, compensated dating, invasion of a minor’s privacy, or other content that endangers minors’ physical and mental health is not allowed. TikTok added, "If we become aware of content that sexually exploits, targets, or endangers children, we may alert law enforcement or report cases, as appropriate."

Predatory activity aside, the BBC also found several accounts run by children under 13, even after the fine the company received for collecting the data of underage users in the US.

The new investigation comes shortly after the short-form video platform was slapped with a hefty fine for illegally collecting the personal data of users under the age of 13 in the United States. According to the BBC and other media reports, since the ruling, TikTok has been asking users in the US to verify their age with proof of identification.

In Malaysia, TikTok also announced recently that it had upgraded the restricted mode feature in its app. Once enabled, this optional account setting limits the appearance of content that may not be appropriate for all audiences. The feature is activated via a password, which is valid for 30 days. This tool, powered by machine learning algorithms, gives users more control over the content they watch, the company said. TikTok has also appointed Innity, a digital media solutions and marketing technology company, as its official reseller. As part of the partnership, Innity will be able to assist its clients in engaging younger audiences in Malaysia through a variety of advertising formats offered by TikTok.

Meanwhile, according to Reuters, a court in India asked the government to ban the China-based app, saying it was "encouraging pornography."