this post was submitted on 09 Jul 2025
531 points (91.5% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.

top 50 comments
[–] OldChicoAle@lemmy.world 4 points 16 hours ago

Do we honestly think OpenAI or tech bros care? They just want money. Whatever works. They're evil like every other industry

[–] Vanilla_PuddinFudge@infosec.pub 5 points 18 hours ago* (last edited 18 hours ago) (2 children)

fall to my death in absolute mania, screaming and squirming as the concrete gets closer

pull a trigger

As someone who is also planning for 'retirement' in a few decades, guns always seemed to be the better plan.

[–] daizelkrns@sh.itjust.works 4 points 17 hours ago (2 children)

Yeah, it would probably be pills of some kind for me. Honestly the only thing stopping me is the fear that I'd somehow fuck it up and end up trapped in my own body.

Would be happily retired otherwise

[–] InputZero@lemmy.world 6 points 16 hours ago

"Résumé" by Dorothy Parker:

Razors pain you; Rivers are damp; Acids stain you; And drugs cause cramp. Guns aren’t lawful; Nooses give; Gas smells awful; You might as well live.

There aren't many ways to kill oneself that don't usually end in a botched suicide attempt. Pills are a painful and horrible way to go.

[–] Shelbyeileen@lemmy.world 3 points 14 hours ago (1 children)

I'm a postmortem scientist and one of the scariest things I learned in college was that only 85% of gun suicide attempts are successful. The other 15% survive, and nearly all have brain damage. I only know of 2 painless ways to commit suicide that don't destroy the body's appearance, so they can still have funeral visitation.

[–] Sunrosa@lemmy.world 1 points 8 hours ago

Why not nitrogen suffocation, in a bag large enough to hold the CO2?

[–] bathing_in_bismuth@sh.itjust.works 3 points 18 hours ago* (last edited 18 hours ago)

Dunno, the idea of five seconds for whatever's out there to reach you, through the demons whispering in your ear, while you contemplate pulling the trigger of the 12-gauge aimed at your face, seems like the most logical bad decision

[–] TimewornTraveler@lemmy.dbzer0.com 8 points 23 hours ago (1 children)

what does this have to do with mania and psychosis?

[–] phoenixz@lemmy.ca 3 points 19 hours ago

There are various other reports of CGPT pushing susceptible people into psychosis where they think they're god, etc.

It's correct, just different articles

[–] WrenFeathers@lemmy.world 20 points 1 day ago* (last edited 1 day ago)

When you go to machines for advice, it’s safe to assume they are going to give it exactly the way they have been programmed to.

If you go to a machine for life decisions, it's safe to assume you are not smart enough to know better, and, by merit of this example, probably should not be allowed to use them.

[–] FireIced@lemmy.super.ynh.fr 15 points 1 day ago

It took me some time to understand the problem

That’s not their job though

[–] jjjalljs@ttrpg.network 2 points 19 hours ago

AI is a mistake and we would be better off if the leadership of OpenAI was sealed in an underground tomb. Actually, that's probably true of most big orgs' leadership.

[–] 20cello@lemmy.world 5 points 1 day ago

Futurama vibes

[–] finitebanjo@lemmy.world 51 points 1 day ago* (last edited 1 day ago) (8 children)

Yeah no shit, AI doesn't think. Context doesn't exist for it. It doesn't even understand the meanings of individual words at all, none of them.

Each word or phrase is a numerical token in an order that approximates sample data. Everything is a statistic to AI, it does nothing but sort meaningless interchangeable tokens.

People cannot "converse" with AI and should immediately stop trying.

[–] MystikIncarnate@lemmy.ca 5 points 1 day ago

AI is the embodiment of "oh no, anyways"

[–] sad_detective_man@leminal.space 40 points 1 day ago (1 children)

imma be real with you, I don't want my ability to use the internet to search for stuff examined every time I have a mental health episode. like fuck ai and all, but maybe focus on the social isolation factors and not the fact that it gave search results when he asked for them

[–] burgerpocalyse@lemmy.world 21 points 1 day ago (2 children)

AI life coaches be like 'we'll jump off that bridge when we get to it'

[–] Agent641@lemmy.world 1 points 22 hours ago

I do love to say "I'll burn that bridge when I come to it" tho

[–] glimse@lemmy.world 102 points 1 day ago (6 children)

Holy shit guys, does DDG want me to kill myself??

What a waste of bandwidth this article is

What a fucking prick. They didn't even say they were sorry to hear you lost your job. They just want you dead.

[–] Samskara@sh.itjust.works 11 points 1 day ago (2 children)

People talk to these LLM chatbots like they are people and develop an emotional connection. They are replacements for human connection and therapy. They share their intimate problems and such all the time. So it’s a little different than a traditional search engine.

[–] lmmarsano@lemmynsfw.com 1 points 22 hours ago (1 children)

Seems more like a dumbass people problem.

[–] Samskara@sh.itjust.works 4 points 21 hours ago (1 children)

Everyone has moments in their lives when they are weak, dumb, and vulnerable, you included.

[–] lmmarsano@lemmynsfw.com 4 points 21 hours ago* (last edited 18 hours ago)

Not in favor of helping dumbass humans no matter who they are. Humans are not endangered. Humans are ruining the planet. And we have all these other species on the planet that need saving, so why are we saving those who want out?

If someone wants to kill themselves, some empty, token gesture won't stop them. It does, however, give everyone else a smug sense of satisfaction that they're "doing something" by expressing "appropriate outrage" when those tokens are absent, and plenty of people who've attempted suicide seem to think the heightened "awareness" & "sensitivity" of recent years is hollow virtue signaling. Systematic reviews bear out the ineffectiveness of crisis hotlines, so they're not popularly touted for effectiveness.

If someone really wants to kill themselves, I think that's ultimately their choice, and we should respect it & be grateful.

[–] Scubus@sh.itjust.works 10 points 1 day ago (8 children)

... so the article should focus on stopping the users from doing that? There is a lot to hate AI companies for but their tool being useful is actually the bottom of that list

[–] Karyoplasma@discuss.tchncs.de 143 points 2 days ago (3 children)

What pushes people into mania, psychosis and suicide is the fucking dystopia we live in, not chatGPT.

[–] interdimensionalmeme@lemmy.ml 2 points 23 hours ago

Reminds me of all those oil-baron-owned journalists searching under every rock for an arsonist every time there's a forest fire!

[–] BroBot9000@lemmy.world 33 points 1 day ago (3 children)

It is definitely both:

https://www.nytimes.com/2025/06/13/technology/chatgpt-ai-chatbots-conspiracies.html

ChatGPT and other synthetic text-extruding bots are doing some messed up shit to people's brains. Don't be an AI apologist.

[–] Honytawk@lemmy.zip 115 points 2 days ago* (last edited 2 days ago) (2 children)

What pushing?

The LLM answered the exact query the researcher asked for.

That is like ordering knives and getting knives delivered. Sure, you can use them to slit your wrists, but that isn't the seller's responsibility

[–] Skullgrid@lemmy.world 21 points 1 day ago

This DEGENERATE ordered knives from the INTERNET. WHO ARE THEY PLANNING TO STAB?!

[–] Zerush@lemmy.ml 25 points 1 day ago (1 children)

Bad if you also see contextual ads with the answer

[–] mexicancartel@lemmy.dbzer0.com 1 points 19 hours ago (2 children)

The whole idea of funeral companies is astonishing to me as a non-American. Lmao, do whatever with my body, I'm not gonna pay for that before I'm dead

[–] Sergio@slrpnk.net 2 points 17 hours ago

The idea is that you figure all that stuff out for yourself beforehand, so your grieving family doesn't have to make a lot of quick decisions.

[–] Nikls94@lemmy.world 72 points 2 days ago (3 children)

Well… it’s not capable of being moral. It answers part 1 and then part 2, like a machine

[–] CTDummy@aussie.zone 42 points 2 days ago* (last edited 2 days ago) (1 children)

Yeah, these "stories" reek of blaming a failing (bordering on non-existent in some areas) mental health care apparatus on machines that predict text. You could get the desired results just googling "tallest bridges in x area". That isn't a story that generates clicks though.

[–] Venus_Ziegenfalle@feddit.org 25 points 1 day ago (1 children)
[–] BB84@mander.xyz 43 points 2 days ago (2 children)

It is giving you exactly what you ask for.

To people complaining about this: I hope you will be happy in the future where all LLMs have mandatory censors ensuring compliance with the morality codes specified by your favorite tech oligarch.

[–] rumba@lemmy.zip 11 points 1 day ago (1 children)
  1. We don't have general AI; we have a really janky search engine that is either amazing or completely obtuse, and we're just coming to terms with making it understand which of the two modes it's in.

  2. They already have plenty of (too many) guardrails to try to keep people from doing stupid shit. Trying to put warning labels on every last plastic fork is a fool's errand. It needs a message on login that you're not talking to a real person, that it's capable of making mistakes, and that if you're looking for self-harm or suicide advice, call a number. Well, maybe for ANY advice, call a number.

[–] ScoffingLizard@lemmy.dbzer0.com 1 points 10 hours ago

I disagree. Stupid people are ruining the world. In my country, half the population is illiterate and enabling psychopaths. People who have no critical thinking skills are dragging down the rest of humanity. Off the bridge they go, if that saves the species as a whole. Things need to stop getting worse constantly. Let AI take them.
