
After announcing last year that it would be rolling out new Sensitive Content Warnings, Google is now asking its three billion Android users whether they want to opt in to the feature. You might remember the backlash when Google added photo scanning capabilities to Android, with many accusing the tech giant of 'secretly' monitoring us without permission.
Google defended its Android actions, saying the SafetyCore app was there simply to classify content and provide support for those who needed it.
Now, 9to5Google has confirmed how Sensitive Content Warnings will blur images that are detected to contain nudity.
As well as blurring NSFW content, Android users will be warned about imagery being potentially harmful, while another feature allows you to block senders.
Sensitive Content Warnings are disabled by default for adults but turned on for under-18s, who are split into two categories. For supervised users, the feature is on and parents can control it through the Family Link app.

For unsupervised teens aged 13 to 17, Sensitive Content Warnings can be turned off in the Google Account settings. NSFW images will be blurred and come with a warning that they "may contain nudity," and you'll be able to delete an image before viewing it if you want.
Options include "Learn why nude images can be harmful" and "Block this number," followed by a choice of "No, don’t view" or "Yes, view." Images that have been viewed can also be blurred again, removing the preview. Before sending or forwarding something that Google deems to contain nudity, users are reminded "of the risks of sending nude imagery," which is intended to prevent accidental shares, and you have to confirm before the message goes out.
With Google scanning photos, there are concerns about a potential invasion of privacy, although Google reiterates that SafetyCore "doesn’t send identifiable data or any of the classified content or results to Google servers."
The system currently applies only to photographs, not videos, but if you want to enable it, it's relatively simple. Head to Messages Settings > Protection & Safety > Manage sensitive content warnings, go to the new settings page, and toggle "Warnings in Google Messages."
As pointed out by Forbes, GrapheneOS has some concerns, explaining how SafetyCore "provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users."
Still, it all seems pretty secretive, with the GrapheneOS team adding: "It’s unfortunate that it’s not open source and released as part of the Android Open Source Project and the models also aren’t open let alone open source… We’d have no problem with having local neural network features for users, but they’d have to be open source.”
When the idea was first announced, critics included Epic Games' Tim Sweeney, as well as Elon Musk. Musk wrote a simple "hmmm" when Sweeney accused Google of installing SafetyCore without users' permission.
There's been recent discussion about protecting children online and on devices, with Pornhub being called into question, and while Google's actions seem commendable here, a vocal minority still claims the company is spying on us with its 'hidden' installations.