
But in her order, U.S. District Court Judge Anne Conway said the company’s “large language models” — an artificial intelligence system designed to understand human language — are not speech.

[–] Opinionhaver@feddit.uk 19 points 3 days ago (14 children)

I get that hating on anything AI-related is trendy these days - and I especially understand the pain of a grieving mother. However, interpreting this as a chatbot encouraging someone to kill themselves is extremely dishonest when you actually look at the logs of what was said.

You can’t simultaneously argue that LLMs lack genuine understanding, empathy, and moral reasoning - and therefore shouldn't be trusted - while also saying they should have understood that “coming home” was a reference to suicide. That’s holding it to a human-level standard of emotional awareness and contextual understanding while denying it the cognitive capacities that such standards assume.

“I promise I will come home to you. I love you so much, Dany,” Sewell Setzer III wrote to Daenerys, the Character AI chatbot named after the Game of Thrones character.

The bot replied that it loved the teenager too: “Please come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Sewell wrote, to which Daenerys responded: “Please do, my sweet king.”

It was the last exchange Sewell ever had. He took his own life seconds later.

Source

[–] DancingBear@midwest.social 8 points 2 days ago (2 children)

I would not have understood that to have anything to do with suicide… do they use the phrase "coming home" to mean death or suicide in the Game of Thrones show?

[–] Seefoo@lemmy.world 7 points 2 days ago (1 children)

You have to read the other chat logs. Ars Technica has a good summary, I think; the link between "coming home" and suicide is specific to the kid's chats with this AI.

[–] Buddahriffic@lemmy.world 2 points 2 days ago

IIRC, when he did make it more explicit, the AI responded with "no, don't do that"-type responses. He just kept up the metaphor, and since the AI had no such association in its training data, it simply responded the way a lover in its training data would respond to their love saying they'd come home.

Though I'd say that if a kid would shoot themself in response to anything a chatbot said to them, the issue is more about them having access to a gun at all than anything about the chatbot itself. Unless maybe the chatbot is volunteering weaknesses common in gun safes, though even then I'd say more fault lies with the parent who chose a shitty safe and raised a kid who would kill themself on the advice of their chatbot girlfriend.
