Google has begun a broad rollout of its new Sensitive Content Warnings in Messages for Android – a safety feature that automatically detects and blurs nude images before you see or send them. Detection runs entirely on-device via Android’s System SafetyCore, meaning neither the images nor any identifiable data are sent to Google’s servers.
You’ll need to be signed in to your Google Account in Messages for the feature to work.
When an incoming image is flagged and blurred, you can choose to learn about the risks of explicit content, block the sender, view the image after confirming, or simply return to the chat without opening it. If you try to send or forward a nude photo, you’ll see a warning prompt and must confirm before it goes out.
For adults (18+), the feature is off by default and can be switched on via Google Messages Settings > Protection & Safety > Manage sensitive content warnings > Warnings in Google Messages. Teen and child accounts have tighter controls: supervised accounts can’t disable it without parental approval via Family Link, while unsupervised teens aged 13–17 can turn it off in their account settings.
First announced in October 2024 and tested with beta users since April, the system is now available to everyone running the latest stable releases of Google Messages and Play services. The concept is similar to Apple’s Communication Safety in iMessage, which blurs sexually explicit images for child accounts and offers safety resources. However, Google’s system applies to both adults and minors, with different default settings by age group.
Supporters say this could be a valuable tool to reduce unwanted exposure, especially for younger users, while privacy-conscious users may appreciate that all processing stays on-device. Critics, however, worry the pop-ups could be annoying in consensual adult conversations, and the opt-in requirement for adults might limit uptake. Whether it becomes a widely used safeguard or just another buried setting will depend on how users balance convenience, privacy, and safety.