this post was submitted on 15 Feb 2025
-66 points (17.0% liked)

Technology

63134 readers
3408 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
top 15 comments
[–] jubilationtcornpone@sh.itjust.works 59 points 1 week ago* (last edited 1 week ago) (3 children)

That is one bullshit headline. Forbes is keeping the AI pump-and-dump scheme going.

TLDR: People correctly discerned that written responses were from an "AI" chatbot slightly less often than they correctly discerned that responses were from a psychotherapist.

"AI" cannot replace a therapist and hasn't "won" squat.

[–] Goun@lemmy.ml 8 points 1 week ago

Holy shit, thanks for saving us the click. Wtf

[–] paradox2011@lemmy.ml 2 points 1 week ago

You're doing the lord's work 🫡

[–] asap@lemmy.world -5 points 1 week ago* (last edited 1 week ago) (1 children)

A bit disingenuous not to mention this part:

Further, participants in most cases preferred ChatGPT’s take on the matter at hand. That was based on five factors: whether the response understood the speaker, showed empathy, was appropriate for the therapy setting, was relevant for various cultural backgrounds, and was something a good therapist would say.

[–] PapstJL4U@lemmy.world 11 points 1 week ago (1 children)

Patients explaining that they liked what they heard, not whether it was correct or relevant to their case. There isn't even a pipeline for escalation, because AIs don't think.

[–] jubilationtcornpone@sh.itjust.works 4 points 1 week ago (2 children)

Exactly. AI chatbots also cannot empathize, since they have no self-awareness.

[–] asap@lemmy.world -3 points 1 week ago* (last edited 1 week ago)

You can't say "Exactly" when you tl;dr'd and removed one of the most important parts of the article.

Your human summary was literally worse than AI 🤦

I'm getting downvoted, which makes me suspect people think I'm cheerleading for AI. I'm not. I'm sure it sucks compared to a therapist. I'm just saying that the tl;dr also sucked.

[–] desktop_user@lemmy.blahaj.zone -3 points 1 week ago

but it can give the illusion of empathy, which is far more important.

[–] potentiallynotfelix@lemmy.fish 16 points 1 week ago

I don't trust therapists, but I trust OpenAI with my mental state about 100x less.

[–] ChaoticNeutralCzech@feddit.org 8 points 1 week ago (1 children)

Imitating a therapist is not too hard; look up ELIZA. Was it any good as an actual therapist? Haha, no.
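
For the record, ELIZA's whole trick was keyword pattern matching plus pronoun reflection, with no understanding behind it. A toy sketch of the technique (the patterns here are invented for illustration, not Weizenbaum's actual DOCTOR script):

```python
import random
import re

# Swap first- and second-person words so an echoed fragment reads as a reply.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my", "mine": "yours"}

# Keyword patterns, tried in order; the catch-all comes last.
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"(.*) mother(.*)", ["Tell me more about your mother."]),
    (r"(.*)", ["Please go on.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(text: str) -> str:
    text = text.lower().strip(".!? ")
    for pattern, answers in RULES:
        match = re.match(pattern, text)
        if match:
            return random.choice(answers).format(*(reflect(g) for g in match.groups()))

print(respond("I feel anxious about my job."))
# e.g. "Why do you feel anxious about your job?"
```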

[–] nyan@lemmy.cafe 6 points 1 week ago

Did they compete on providing actual therapy? No? Then this is meaningless.

[–] MagicShel@lemmy.zip 4 points 1 week ago* (last edited 1 week ago)

I've used AI as a pseudo-therapist. It was kinda surreal. It had some helpful things to say, but there was a whole lot of cheerleading. Like, I appreciate the boost and being told how great I am. Then it kept trying to push me into an action plan like it's selling a Tony Robbins book. And it never really challenged me on my representations or perspective, except when I was down on myself.

I get it: when someone comes to you with troubles, you try to make them feel better about themselves. But I really have to do a lot of searching to figure out which parts are worth paying attention to and which parts are just hyping me up.

I definitely would not trust it, but I think it says some useful stuff by accident now and again.

Maybe it would've done better if I'd given it really detailed instructions on how to be a therapist, but if I could do that I could probably give those same instructions to my wife or someone and be better off.
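
(For what it's worth, "detailed instructions" here would just be a system prompt. A minimal sketch with the OpenAI Python client; the model name and the instructions are placeholders I made up, not anything from the article or study:)

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical instructions: writing these well is the hard part.
SYSTEM_PROMPT = (
    "You are acting as a therapist. Do not flatter the user or pitch "
    "action plans. Ask clarifying questions, and gently challenge "
    "inconsistencies in what the user tells you."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model; substitute whatever you use
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "I've been down on myself lately."},
    ],
)
print(response.choices[0].message.content)
```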

[–] cyrano@lemmy.dbzer0.com 2 points 1 week ago

From the study

Using different measures, we then confirmed that responses written by ChatGPT were rated higher than the therapist’s responses suggesting these differences may be explained by part-of-speech and response sentiment. This may be an early indication that ChatGPT has the potential to improve psychotherapeutic processes. We anticipate that this work may lead to the development of different methods of testing and creating psychotherapeutic interventions. Further, we discuss limitations (including the lack of the therapeutic context), and how continued research in this area may lead to improved efficacy of psychotherapeutic interventions allowing such interventions to be placed in the hands of individuals who need them the most.
