AI is what cracked my egg shell, fucking wild...
Well that's gotta be an interesting story! Don't leave us hanging!
Look, if you can afford therapy, really, fantastic for you. But the fact is, it's an extremely expensive luxury, even at poor quality, and sharing or unloading your mental strain with your friends or family, particularly when it is ongoing, is extremely taxing on relationships. Sure, your friends want to be there for you when they can, but it can put a major strain on the relationship depending on how much support you need. If someone can alleviate that pressure and that stress even a little bit by talking to a machine, it's in extremely poor taste and shortsighted to shame them for it. Yes, they're willfully giving up their privacy, and yes, it's awful that they have to do that, but this isn't like sharing memes... in the hierarchy of needs, getting those pent-up feelings out is important enough to possibly be worth the trade-off. Is it ideal? Absolutely not. Would it be better if these systems were anonymized? Absolutely. But humans are natural anthropomorphizers. They develop attachments and build relationships with inanimate objects all the time. And a really good therapist is more of a mirror for you to work through things yourself anyway, mostly just guiding your thoughts toward better patterns of thinking. There's no reason a machine can't do that, and while it's not as good as a human, it's a HUGE improvement on average over nothing at all.
And it's awesome. Men aren't allowed by others to show weakness. AI therapy genuinely helps a lot.
Or it traps them in a feedback loop that reinforces their negative thinking, since AI hardly ever tries to contradict you.
But yeah. At least they're opening up to someone/something.
Funny, I was just reading comments in another thread from people with mental health problems proclaiming how terrific it is. Especially concerning was how they had found value in the recommendations LLMs make and were "trying those out." One of the commenters described themselves as "neuro diverse" and was acting on "advice" from LLM-generated responses.
And for something like depression, that is a deeply bad idea. I feel somewhat qualified to weigh in on this as somebody who struggled severely with depression and managed to get through it with the support of a very capable therapist. There's a tremendous amount of depth and context to somebody's mental condition, and understanding it takes deliberate probing, not just stringing words together until they form sentences that mimic human interaction.
Let's not forget that an LLM will not be able to raise alarm bells, read medical records, write prescriptions or work with other medical professionals. Another thing people often forget is that LLMs have maximum token lengths and cannot, by definition, keep a detailed "memory" of everything that's been discussed.
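To make the "memory" point concrete, here's a minimal sketch of why that is. It isn't any particular vendor's API, and the names (CONTEXT_BUDGET, trim_history) and the word-count token estimate are made up for illustration, but the trimming logic is roughly what any chat client has to do once the conversation outgrows the model's context window:

```python
# Minimal sketch (hypothetical client code, not a real vendor API):
# a chat model only "remembers" whatever history fits inside a fixed
# context window, so a naive client drops the oldest turns once the
# budget is exceeded. Token counts are crudely approximated by word
# count here; real tokenizers differ, but the trimming logic is the same.

CONTEXT_BUDGET = 4096  # hypothetical context size, in "tokens"

def estimate_tokens(text: str) -> int:
    """Very rough proxy: one word ~= one token."""
    return len(text.split())

def trim_history(messages: list[dict], budget: int = CONTEXT_BUDGET) -> list[dict]:
    """Keep only the most recent messages that fit the budget."""
    kept, used = [], 0
    for msg in reversed(messages):          # walk newest-first
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break                           # everything older is silently dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order

# Whatever you told the "therapist" fifty sessions ago is simply no
# longer in trim_history(history), so the model cannot recall it.
```

Hosted "memory" features paper over this by summarizing or selectively re-injecting old turns, but the underlying window limit doesn't go away.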
It's effectively self-treatment with more steps.