Turns out it doesn’t really matter what the medium is, people will abuse it if they don’t have a stable mental foundation. I’m not shocked at all that a person who would believe a flat earth shitpost would also believe AI hallucinations.
I dunno, I think there's credence to treating it as a genuine worry.
Like with an addictive substance: yeah, some people are going to be dangerously susceptible to it, but that doesn't mean there shouldn't be any protections in place...
Now what the protections would be, I've got no clue. But I think a blanket, "They'd fall into psychosis anyway" is a little reductive.
I don’t think I suggested it wasn’t worrisome, just that it’s expected.
If you think about it, AI is tuned using RLHF, or Reinforcement Learning from Human Feedback. That means the only thing AI is optimizing for is "convincingness". It doesn't optimize for intelligence; anything that seems like intelligence is literally just a side effect as it forever marches onward towards becoming convincing to humans.
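To make that concrete, here's a rough sketch of the kind of pairwise preference loss RLHF reward models are typically trained with (the names and numbers are illustrative, not from any particular implementation). Notice the only signal is which response a human rater preferred; nothing in it measures truth.

```python
import torch

def preference_loss(reward_chosen: torch.Tensor, reward_rejected: torch.Tensor) -> torch.Tensor:
    """Bradley-Terry style loss commonly used to train an RLHF reward model.

    reward_chosen / reward_rejected are the scalar scores the reward model
    assigns to the response a human rater preferred vs. the one they rejected.
    Minimizing this pushes the model to score whatever the rater liked higher --
    there is no term anywhere for factual accuracy.
    """
    return -torch.nn.functional.logsigmoid(reward_chosen - reward_rejected).mean()

# Toy example: scores a hypothetical reward model assigns to two candidate answers.
chosen = torch.tensor([1.2, 0.3])    # responses the human raters preferred
rejected = torch.tensor([0.7, 0.9])  # responses the raters passed over
print(preference_loss(chosen, rejected))  # lower loss = better at predicting human preference
```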
“Hey, I’ve seen this one before!” You might say. Indeed, this is exactly what happened to social media. They optimized for “engagement”, not truth, and now it’s eroding the minds of lots of people everywhere. AI will do the same thing if run by corporations in search of profits.
Left unchecked, it’s entirely possible that AI will become the most addictive, seductive technology in history.
Ah, I see what you're saying -- that's a great point. It's designed to be entrancing AND designed to actively try to be more entrancing.