Google is moving to secure minors’ privacy with a series of new policies. The tech giant will now limit how pictures of underage kids appear in search, enabling kids and their parents to request the removal of their photos from Google Images results. The new measures are expected to take effect in the coming weeks.
The changes have drawn media attention because the company previously showed little interest in proactively working on kids’ safety. Child safety measures from Silicon Valley companies were already making headlines.
Since removing images from the search results does not erase them from the internet, Google commented, “We believe this change will help give young people more control of their images online.”
The image removal policy is accompanied by a new restriction on ads that target kids based on their age, gender, and interests. After Facebook added features to curb child exploitation content, Google has gone a step further by blocking targeted advertising aimed at kids.
YouTube adds more child-safety features
Additionally, YouTube, Google’s video platform, announced that its default upload settings for minors would be changed to protect child safety. The new measures set uploads to the most private option for users aged 13 to 18. YouTube will also turn off autoplay and enable digital well-being alerts.
Since the feature only changes the default setting, kids can still choose to make their uploads ‘public.’ To do so, underage users will have to make a deliberate choice before posting any video on YouTube.
Another policy announced by YouTube is the removal of content involving paid product placement or endorsement. “Content that is overly commercial or promotional is also not allowed in YouTube Kids,” Google stated, addressing the new YouTube Kids policies. The change is reportedly the result of sustained pressure from child development experts and consumer advocacy groups.
Similar child safety measures are expected across other Google divisions, including Google Assistant, the Google Play Store, and Location History.
At present, Google offers a SafeSearch feature that allows users to filter out potentially offensive and inappropriate content. SafeSearch will now be enabled by default for users under 13.
Owing to the growing importance of anti-cyberbullying laws, experts suggest that companies’ child safety measures will become an important protocol for states and governments in the coming years.