Meta Malaysia to roll out expanded teen safety features for Instagram
Meta has expanded its Teen Accounts safety features for Instagram in Malaysia by introducing an age-appropriate 13+ content setting and enhanced parental controls for users under 18.
The rollout will take place over the coming month, introducing a 13+ content standard modelled on movie ratings to ensure age-appropriate content by default. The updated features will also add a stricter “Limited Content” option, enabling parents to further restrict what their children can view on the platform, Meta shared in a statement.
"Teens under 18 will be automatically placed into an updated 13+ setting, and they won’t be able to opt out without a parent’s permission," it said. The Teen Accounts were introduced in Malaysia last year in February.
Instagram in Malaysia already hides and avoids recommending sexually suggestive content, graphic or disturbing images, and adult content such as tobacco or alcohol sales to teens. Now, posts featuring strong language, risky stunts and other potentially harmful material can also be hidden or blocked from recommendations, to discourage dangerous behaviour.
Meta said it has also improved its technology to better identify and limit access to age-inappropriate content across Instagram. Teens will no longer be able to follow accounts that regularly share age-inappropriate content, and those accounts will be restricted from interacting with teen users.
Search functions have been strengthened to block a broader range of sensitive topics or mature terms, including suicide, self-harm, eating disorders, alcohol and gore. Teens will also be prevented from accessing restricted content via recommendations, feeds, Stories or direct messages.
Clara Koh, director of public policy, central Southeast Asia and ASEAN at Meta, said: "At Meta, keeping teens safe online is our top priority. We understand that every family is different, which is why we are also introducing the 'Limited Content' setting, which empowers parents who prefer extra controls to further shape what their teen sees on Instagram."
According to a New Straits Times report, Koh also said the company is continuing “constructive conversations” with the Malaysian government about its plans.
This refers to Putrajaya’s proposal to ban under-16s from social media within the year, following a similar move by Australia. Koh said such bans could be counterproductive, noting that many youths in Australia have shifted to platforms not covered by the ban, which are less regulated and potentially more harmful.
She also urged governments worldwide, including Malaysia, to implement age verification at a single point and stage — specifically when users download applications via app stores.
Koh explained that working with app stores would allow age verification to be done once, when a parent gives their child their first phone and sets the child’s age. She added that this would provide a clear signal that data can be collected in one place and shared with a broader ecosystem of apps, arguing it would be risky for each platform to independently verify age and collect sensitive data due to privacy concerns.
This comes less than a month after Meta and YouTube were found liable in a landmark social media addiction trial in the US. A Los Angeles jury handed a win to a young woman who sued the two companies over her childhood addiction to social media. Meta is the parent company of Instagram, WhatsApp and Facebook.
The tech giants were deemed by jurors to have intentionally built addictive platforms that harmed the 20-year-old’s mental health and hooked young users without sufficient concern for their wellbeing. She was awarded US$6 million in damages, with Meta liable for US$4.2 million and Google for US$1.8 million.
In January, nearly five million social media accounts linked to Australian teenagers were removed or restricted following the rollout of Australia’s under-16 ban, according to early government data.
Initial figures released overnight by the eSafety Commissioner show platforms restricted access to around 4.7 million accounts in the first weeks after the minimum age obligation took effect on December 10. The data marks the first official snapshot of compliance since the legislation came into force, barring children under 16 from holding accounts on major social platforms.