Apple delays launch of child safety feature

Apple is delaying the launch of its Child Sexual Abuse Material (CSAM) detection tools. According to the company, the delay comes as it looks “to make improvements” following criticism from “customers, advocacy groups, researchers and others”.

It told 9to5Mac in a statement that based on feedback from this group, the company “decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The rollout was announced last month, when Apple said one of its goals is to create technology that empowers people and enriches their lives while helping them stay safe. As part of the new features, the Messages app will use on-device machine learning to warn users about sensitive content. When a child receives a photo deemed sexually explicit, the image will be blurred and the child will be warned. They will be presented with helpful resources and reassured that it is okay if they do not want to view the photo.

As an additional precaution, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it. On-device machine learning will be used to analyse image attachments to determine if a photo is sexually explicit.
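
The decision flow described above can be summarised in a short sketch. The names below (SensitivityClassifier, MessagePhotoPolicy and so on) are hypothetical illustrations of the logic as Apple describes it, not Apple's actual APIs; the sketch assumes a classification result is already available from the on-device model.

```swift
import Foundation

// Hypothetical sketch of the Messages flow described above; not Apple's APIs.
struct ClassificationResult {
    let isSexuallyExplicit: Bool
}

protocol SensitivityClassifier {
    // Stands in for the on-device machine learning model that analyses an
    // image attachment locally, without sending it off the device.
    func classify(imageData: Data) -> ClassificationResult
}

enum IncomingPhotoAction {
    case showNormally
    case blurAndWarn(notifyParents: Bool)
}

struct MessagePhotoPolicy {
    let classifier: SensitivityClassifier
    let parentalAlertsEnabled: Bool

    // childOptsToView reflects the child's choice after seeing the warning.
    func action(for photo: Data, childOptsToView: Bool) -> IncomingPhotoAction {
        let result = classifier.classify(imageData: photo)
        guard result.isSexuallyExplicit else { return .showNormally }
        // The photo is blurred and the child is warned; parents are only
        // notified if alerts are enabled and the child chooses to view it.
        return .blurAndWarn(notifyParents: parentalAlertsEnabled && childOptsToView)
    }
}
```

The same kind of check applies in the sending direction, with the warning shown before the photo is sent and the optional parental message sent if the child proceeds.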

The new technology in iOS and iPadOS will also allow Apple to detect known CSAM images stored in iCloud Photos and to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States. Instead of scanning images in the cloud, the system will perform on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organisations.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. If a user feels their account has been mistakenly flagged, they can file an appeal to have it reinstated.
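
As a rough illustration of the "match on device before upload" ordering, the sketch below checks an image against a set of known hashes locally. It deliberately simplifies the real design: Apple's system uses a perceptual hash and private set intersection so that neither the device nor Apple learns the raw match result for an individual image, whereas this example substitutes a plain SHA-256 set lookup; KnownHashDatabase is a hypothetical name.

```swift
import Foundation
import CryptoKit

// Deliberately simplified, hypothetical sketch: a plain SHA-256 set lookup
// stands in for the perceptual-hash and private-set-intersection machinery,
// purely to show that matching happens on the device before an image is
// stored in iCloud Photos.
struct KnownHashDatabase {
    // Hypothetical: hex digests of known CSAM images shipped on-device.
    let knownHashes: Set<String>

    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// In this simplified model the device sees the match result directly; in the
// protocol Apple describes, the device instead attaches an encrypted safety
// voucher to the upload and cannot read the outcome itself.
func imageMatchesKnownCSAM(_ imageData: Data, database: KnownHashDatabase) -> Bool {
    return database.matches(imageData)
}
```

Because matching is against a fixed database of known images compiled by NCMEC and other organisations, only those known images are detected; the sketch mirrors that by treating the database as a static set rather than a content classifier.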

Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report. Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

Related articles:

Apple asks for permission for personalised ads, loosens grip on in-app payment links

Apple's privacy focused feature will not be implemented in China
