I've just given it the boot from my phone.
It doesn't appear to have been doing anything yet, but whatever.
More information: it's been rolling out to Android 9+ users since November 2024 as a high-priority update. Some users report that it installs even while the phone is on battery and off Wi-Fi, unlike most apps.
App description on the Play Store: SafetyCore is a Google system service for Android 9+ devices. It provides the underlying technology for features like the upcoming Sensitive Content Warnings feature in Google Messages that helps users protect themselves when receiving potentially unwanted content. While SafetyCore started rolling out last year, the Sensitive Content Warnings feature in Google Messages is a separate, optional feature and will begin its gradual rollout in 2025. The processing for the Sensitive Content Warnings feature is done on-device and all of the images or specific results and warnings are private to the user.
Description by Google: Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares. - https://9to5google.com/android-safetycore-app-what-is-it/
So it looks like something that sends pictures from your messages (at least initially) to Google for an AI to check whether they're "sensitive". The app is 44 MB, so too small to contain a useful AI, and I don't think this could happen on-phone, so it must require sending your on-phone data to Google?
I guess the app then downloads the required models.
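To make that concrete: a small install package plus downloaded models is enough for on-device classification. Here's a rough Kotlin sketch, purely illustrative and not SafetyCore's actual code, assuming a TensorFlow Lite model the app fetched into its private storage after install:

```kotlin
// Illustrative sketch only - NOT SafetyCore's real implementation.
// Assumes a TFLite model file downloaded to the app's private storage.
// Requires the org.tensorflow:tensorflow-lite dependency.
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer

// Returns a 0..1 "sensitive" score for an already-preprocessed image buffer.
// Everything runs locally; there is no network call in this path.
fun sensitiveScore(imagePixels: ByteBuffer, modelFile: File): Float {
    Interpreter(modelFile).use { interpreter ->
        val output = Array(1) { FloatArray(1) }  // single score output (assumption)
        interpreter.run(imagePixels, output)     // local inference on the downloaded model
        return output[0][0]
    }
}
```

Whether Google's implementation actually works like this is a separate question, but a 44 MB app shell doesn't by itself mean the images have to leave the phone.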
Even with the latest update from Samsung, I am not seeing this app. My OnePlus did get it with the February update and I had to remove it.
Thanks. Just uninstalled. What cunts.
Do we have any proof of it doing anything bad?
Taking Google's description of what it is, it seems like a good thing. Of course we should absolutely assume Google is lying and it actually does something nefarious, but we should get some proof before picking up the pitchforks.
Google is always 100% lying.
There are too many instances to list and I'm not spending 5 hours collecting examples for you.
They removed "don't be evil" a long time ago.
Fuck these cunts.
My question is, does it install as a standalone app? Or is it part of a Google Play update chunk that you only find out about after Play has updated? My system does not auto-update (by design), so I'd like to know where it sources from.
It didn't appear in my apps list, so I thought it wasn't installed. But when I searched for the app name, it showed up. So be aware.
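If you'd rather check programmatically than dig through settings, here's a rough Kotlin sketch. The package name com.google.android.safetycore is what's been reported for SafetyCore, so treat it as an assumption and verify it on your own device:

```kotlin
// Quick check: is the SafetyCore package present even though it has no launcher icon?
// "com.google.android.safetycore" is the reported package name - verify on your device.
// On Android 11+ your app may also need a <queries> entry in its manifest to be
// allowed to see other installed packages.
import android.content.Context
import android.content.pm.PackageManager

fun isSafetyCoreInstalled(context: Context): Boolean =
    try {
        context.packageManager.getPackageInfo("com.google.android.safetycore", 0)
        true   // package record exists, even if hidden from the app drawer
    } catch (e: PackageManager.NameNotFoundException) {
        false  // not installed (or not visible to this app)
    }
```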
Hope they like all my dick pics
Don't worry, they won't!
/Burn
Is there any indication that Apple is truly more secure and privacy-conscious than Android? I'm kinda tired of Google and their oversteps.
The countdown to Android's slow and painful death has already been ticking for a while.
It has become over-engineered and no longer appealing from a developer's viewpoint.
I still write code for Android because my customers need it - and will for a while - but I've stopped writing code for Apple's i-things, and I'm researching alternatives to Android. Rolling my own environment with FOSS components on top of Raspbian already looks feasible. For robots and automation, I already use it.
Kind of weird that they are installing this dependency whether or not you enable those planned scanning features. Here is an article mentioning that future feature, Sensitive Content Warnings. It does sound kind of cool - less chance of accidentally sending your dick pic to someone, I guess.
Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares.
All of this happens on-device to protect your privacy and keep end-to-end encrypted message content private to only sender and recipient. Sensitive Content Warnings doesn’t allow Google access to the contents of your images, nor does Google know that nudity may have been detected. This feature is opt-in for adults, managed via Android Settings, and is opt-out for users under 18 years of age.
For those having issues on Samsung devices: see here if you're getting the "App not installed as package conflicts with an existing package" error:
If you have a Samsung device, uninstall the app from the Knox Secure Folder as well: go to Secure Folder > Settings > Apps.