this post was submitted on 29 Jun 2025
455 points (96.0% liked)

A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

[–] poopkins@lemmy.world 12 points 15 hours ago (4 children)

Funny, I was just reading comments in another thread where people with mental health problems were proclaiming how terrific it is. Especially concerning is that they found value in the recommendations LLMs make and were "trying those out." One of the commenters described themselves as "neuro diverse" and was acting on "advice" from generated LLM responses.

And for something like depression, that's deeply bad advice to act on. I feel somewhat qualified to weigh in as somebody who struggled severely with depression and got through it with the support of a very capable therapist. There's a tremendous amount of depth and context to somebody's mental condition, and understanding it takes far more deliberate probing than stringing words together until they form sentences that mimic human interaction.

Let's not forget that an LLM cannot raise alarm bells, read medical records, write prescriptions or coordinate with other medical professionals. Another thing people often forget is that LLMs have a maximum context length and so cannot, by design, keep a detailed "memory" of everything that's been discussed.
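The "memory" limitation can be sketched in a few lines: every request must fit a fixed token budget, so older turns of a conversation are silently dropped. This is a minimal illustration, not any real model's behavior; the whitespace tokenizer and 50-token budget are stand-ins for a real tokenizer and context window.

```python
# Illustrative sketch of context-window truncation. The whitespace split
# is a crude proxy for a real tokenizer, and max_tokens=50 is an
# invented budget, far smaller than any production model's.

def trim_history(messages, max_tokens=50):
    """Keep only the most recent messages whose combined (rough) token
    count fits the budget; everything earlier is invisible to the model."""
    kept, total = [], 0
    for msg in reversed(messages):      # walk from newest to oldest
        tokens = len(msg.split())       # rough token count
        if total + tokens > max_tokens:
            break                       # older context gets discarded
        kept.append(msg)
        total += tokens
    return list(reversed(kept))

# Twenty conversation turns of ~12 "tokens" each: only the last few fit.
history = [f"turn {i}: " + "word " * 10 for i in range(20)]
window = trim_history(history)
print(len(window), "of", len(history), "turns survive")  # → 4 of 20 turns survive
```

Real systems use smarter strategies (summarizing old turns, external memory stores), but the underlying constraint is the same: whatever doesn't fit the window simply isn't there.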

It's effectively self-treatment with more steps.

[–] whalebiologist@lemmy.world 5 points 9 hours ago (1 children)

LLM will not be able to raise alarm bells

this is arguably the one "benefit" LLM therapy would provide if it worked. The reality is that it doesn't, but it serves as proof of concept that there is a need for anonymous therapy. Therapy in the USA is only for people with socially acceptable illnesses. People rightfully live in fear of being labeled untreatable or a danger to themselves and others, and then at best being dropped from therapy and at worst institutionalized.

yep, almost nobody wants to be committed to a psych ward without consent

[–] TubularTittyFrog@lemmy.world 4 points 14 hours ago* (last edited 14 hours ago) (1 children)

It's effectively self-treatment with more steps.

And for many people it's better than nothing and likely the best they can do. Waiting lists for a basic therapist in my area are months long. They're shorter if you pay out of pocket, but that isn't affordable for average people because it's like $300-400 for a one-hour session.

[–] xorollo@leminal.space 1 points 12 hours ago (1 children)

I get it, but I'm not sure that "something is better than nothing" holds in this case. I don't judge any individual for using it, but the risks are huge, as others have documented, and the benefits are questionable.

[–] TubularTittyFrog@lemmy.world 3 points 11 hours ago

something is always better than nothing. esp if you are starving.

[–] rozodru@lemmy.world -1 points 10 hours ago

I can't find the story for the life of me right now, but I'm pretty sure there was one a few months back where someone was talking with an LLM about their depression and suicide, and the LLM essentially said "yeah, you should probably do it," because to the LLM, that was the best solution to the problem.