this post was submitted on 19 May 2025
405 points (96.3% liked)

A Boring Dystopia


cross-posted from: https://hexbear.net/post/4958707

I find this bleak in ways it’s hard to even convey

[–] ininewcrow@lemmy.ca 107 points 1 day ago (5 children)

A human therapist might not, or is at least far less likely to, share any personal details about your conversations with anyone.

An AI therapist will collect, collate, catalog, and store every single personal detail about you, hand it all to the company that owns the AI, and sell your data to the highest bidder.

[–] WR5@lemmy.world 2 points 8 hours ago

I'm not advocating for it, but couldn't it just be run locally and therefore be unable to share anything?
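For what it's worth, the whole loop can run entirely on your own machine with no calls to anyone else's servers. A minimal sketch, assuming the ollama Python client and a model that has already been pulled locally (the model name here is just an example):

```python
# Minimal local-only chat loop: nothing leaves the machine as long as the
# ollama daemon is running locally and the model was pulled in advance.
# Assumes `pip install ollama` and `ollama pull llama3` have been done.
import ollama

history = []  # conversation kept only in this process's memory

while True:
    user_msg = input("> ")
    if not user_msg:
        break
    history.append({"role": "user", "content": user_msg})
    reply = ollama.chat(model="llama3", messages=history)
    text = reply["message"]["content"]
    history.append({"role": "assistant", "content": text})
    print(text)
```

Whether anyone would actually ship a "therapist" product that way, rather than as a data-hungry hosted service, is a separate question.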

[–] DaddleDew@lemmy.world 62 points 1 day ago* (last edited 1 day ago) (1 children)

Neither would a human therapist be inclined to find the perfect way to use all this information to manipulate people while they are at their weakest, let alone do it to thousands, if not millions, of them at the same time.

They are also pushing the idea of an AI "social circle" for increasingly socially isolated people, through which worldviews and opinions can be bent to whatever the people controlling the AI desire.

Add to that the fact that we now know they've been experimenting with tweaking Grok to push all sorts of political opinions and conspiracy theories. And before that, they manipulated Twitter's algorithm to promote their political views.

Knowing all this, it becomes apparent that what we are currently witnessing is a push toward a whole new level of mind-manipulation and control experiment, one that will make the Cambridge Analytica scandal look like a fun joke.

Forget Neuralink. Musk already has a direct connection into the brains of many people.

[–] fullsquare@awful.systems 15 points 1 day ago

PSA that Nadella, Musk, saltman (and a handful of other techfash) own dials that can bias their chatbots in any way they please. If you use chatbots for writing anything, they control how racist your output will be.

[–] desktop_user@lemmy.blahaj.zone 1 points 1 day ago (1 children)

The AI therapist probably can't force you into a psych ward, though; a human psychologist is obligated to (under the right conditions).

[–] Krauerking@lemy.lol 2 points 10 hours ago (1 children)

Who says that's not coming in the next paid tier of this great idea of chatbots providing therapy to the abused masses?

[–] desktop_user@lemmy.blahaj.zone 0 points 4 hours ago

Nobody, but local models will continue to be an option (unless the government fucks up the laws).

[–] Crewman@sopuli.xyz 7 points 1 day ago (2 children)

You're not wrong, but isn't that also how BetterHelp works?

[–] bitjunkie@lemmy.world 2 points 1 day ago

The data isn't useful if the person no longer exists.