Google is restricting ad targeting based on the age, gender, or interests of people under 18. At the same time, it will turn off location history globally for users under the age of 18. Google explained that location history will remain off for these users, with no option to turn it on. Currently, children with supervised accounts already cannot turn location history on.
Additionally, Google is introducing a new policy that enables individuals under the age of 18, or their parent or guardian, to request the removal of their images from Google Image results. While removing an image from search does not remove it from the web, Google's general manager of kids and family, Mindy Brooks, explained in a post that the change will help give young people more control over their images online. Google also plans to turn SafeSearch on for existing users under 18 and make it the default setting for teens creating new accounts. SafeSearch filters out explicit results when enabled and, according to Google, is already on by default for all signed-in users under 13 whose accounts are managed by Family Link, a service that allows parents to set up supervised accounts for their children.
To further safeguard the online viewing experiences of minors, YouTube also plans to remove "overly commercial content" from YouTube Kids in the coming weeks. This includes videos that focus solely on product packaging or that directly encourage children to spend money. Meanwhile, building on efforts such as content ratings and its "Teacher-approved apps" programme for quality kids' content, Google is launching a new safety section that will let parents know which apps follow its Families policies. "Apps will be required to disclose how they use the data they collect in greater detail, making it easier for parents to decide if the app is right for their child before they download it," Brooks explained.
Online platforms have been under scrutiny over the safety and privacy of younger users. Facebook, for example, was urged by more than 40 US attorneys general in May this year to abandon plans to create an Instagram service for children under the age of 13. According to CNBC, the attorneys general cited "detrimental health effects of social media" on children as well as the tech giant's "reportedly chequered past of protecting children" on its platform as reasons. Despite the backlash, CNN reported later that Facebook remained steadfast in building an Instagram for kids under 13.
Nonetheless, Facebook decided to limit ad targeting for youths under 18 last month, stating in a blog post that Facebook and Instagram "weren't designed for people under the age of 13". As a result, targeting options such as those based on interests or activity on other sites will no longer be available to advertisers for this age group; advertisers will only be able to target ads to users under 18 based on their age, gender and location, Facebook said. Meanwhile, in 2019, YouTube announced it was ending targeted ads on videos watched by children. It was also reported to have committed to a "roadmap" and to have worked on auto-moderation tools.