this post was submitted on 12 Oct 2025
216 points (99.5% liked)

TechTakes

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] Soyweiser@awful.systems 13 points 10 hours ago

Somebody got fed up with being evaluated only on "% of people who adopted AI". This feels very much like malicious compliance to me.

[–] prism@lemmy.dbzer0.com 37 points 1 day ago

For a second I thought this was The Onion. This is so dark and invasive but I can't stop myself from laughing. It's like they don't even care to pretend anymore.

[–] ramble81@lemmy.zip 41 points 1 day ago (4 children)

Okay. So how often does it turn itself back on, if ever?

[–] Treczoks@lemmy.world 8 points 12 hours ago

Whenever it is convenient for Microsoft, and when you least expect it.

[–] Truscape@lemmy.blahaj.zone 37 points 1 day ago

Given Microsoft's vision of user consent, after every update.

[–] Trebuchet@europe.pub 17 points 1 day ago

We don't know yet. My guess is something around 90 days, so you're not able to fend off the beast forever.

[–] dgerard@awful.systems 6 points 1 day ago (1 children)

I guess install it and see!

[–] dalekcaan@feddit.nl 4 points 18 hours ago

I'm good, thanks

[–] xxce2AAb 52 points 1 day ago (1 children)

If you had told me ten years ago that Microsoft would one day become one of our best allies in persuading people to use open source for their own damn good, I would probably have sarcastically replied something like "yeah, and next you'll be telling me Oracle will somehow sour people on centralized social media too."

...Huh. What a weird timeline.

[–] HeyThisIsntTheYMCA@lemmy.world 3 points 4 hours ago

godsdamn fucking CERN weasels jumping us on the wrong timeline

[–] Truscape@lemmy.blahaj.zone 52 points 1 day ago

You can't make this shit up XD

[–] CinnasVerses@awful.systems 11 points 23 hours ago* (last edited 23 hours ago) (1 children)

I am told that Apple, Dropbox, etc. have done this for years, often in the name of "fighting CSAM" or "helping you organize your photos". https://support.apple.com/en-us/108795 Agreed that it's a very good reason not to touch corporate cloud services and not to let people take digital photos of your face, even if they promise not to share them! I do not trust any company with physical assets in the USA not to be penetrated by three-letter organizations and data brokers.

[–] BlameTheAntifa@lemmy.world 6 points 18 hours ago (1 children)

Apple does not look at your data. Several years ago they announced plans to scan for “harmful content” and quickly abandoned it when watchdogs called them out on the plan being a horrendous privacy violation.

https://appleinsider.com/articles/23/08/31/apple-provides-detailed-reasoning-behind-abandoning-iphone-csam-detection

[–] Reach_the_man@awful.systems 9 points 12 hours ago (1 children)

Didn't they have a recent-ish "oops, you weren't supposed to see that we're making backups of your deleted photos, sry not sry" incident?

[–] Soyweiser@awful.systems 5 points 10 hours ago

Don't know about that incident, but that is different from scanning for CSAM, which is different from scanning for faces, which is different from feeding your images into a genAI training set.

You could think that they are doing all of this anyway (which I think the AI-only companies do, btw, but I doubt the bigger ones do, especially Apple).

[–] salacious_coaster@infosec.pub 18 points 1 day ago

Fuck everything about that.

[–] henfredemars@infosec.pub 17 points 1 day ago

Don’t trust Microsoft with your data. In general, big tech companies are not trustworthy. Store your own data as much as possible.

[–] Psythik@lemmy.world 2 points 21 hours ago

What if you don't have a camera?