this post was submitted on 21 Mar 2025
1309 points (99.4% liked)

[–] XeroxCool@lemmy.world 24 points 1 day ago (2 children)

Will this further fuck up the inaccurate nature of AI results? While I'm rooting against shitty AI usage, the general population still trusts it, and making results worse will most likely make people believe even more wrong stuff.

[–] ladel@feddit.uk 32 points 1 day ago* (last edited 1 day ago) (4 children)

The article says it's not poisoning the AI data, only providing valid facts. The scraper still gets content, just not the content it was aiming for.

Edit: quoting the article:

"It is important to us that we don’t generate inaccurate content that contributes to the spread of misinformation on the Internet, so the content we generate is real and related to scientific facts, just not relevant or proprietary to the site being crawled."

[–] XeroxCool@lemmy.world 1 points 12 hours ago

Thank you for catching that. Even reading through again, I couldn't find it while skimming. With the mention of X2 and RSS, I assumed that paragraph would just be more technical description outside my knowledge. Instead, what I did home in on was

"No real human would go four links deep into a maze of AI-generated nonsense."

which led me to be pessimistic.

[–] melpomenesclevage@lemmy.dbzer0.com 3 points 1 day ago (1 children)

if you're dumb enough to trust a large language model because someone told you "iTs Ai!", no amount of facts will be of great utility to you.

[–] XeroxCool@lemmy.world 1 points 12 hours ago (1 children)

That take would be more digestible if I weren't stuck on the same planet as those people.

i'm saying they want to be lied to. it would be disrespectful to offer them the truth.

and the data for the LLM is now salted with procedural garbage. it's great!

[–] ObsidianZed@lemmy.world 5 points 1 day ago

Until the AI generating the content starts hallucinating.

[–] melpomenesclevage@lemmy.dbzer0.com 14 points 1 day ago (1 children)

If you're dumb enough and care little enough about the truth, I'm not really going to try coming at you with rationality and sense. I'm down to do an accelerationism here. fuck it. burn it down.

remember: these companies all run at a loss. if we can hold them off for a while, they'll stop getting so much investment.

[–] einlander@lemmy.world 1 points 1 day ago (4 children)

The problem I see with poisoning the data is that AIs trained for law enforcement could hallucinate false facts that get used to arrest and convict people.

[–] patatahooligan@lemmy.world 10 points 1 day ago

Law enforcement AI is a terrible idea and it doesn't matter whether you feed it "false facts" or not. There's enough bias in law enforcement that the data is essentially always poisoned.

that's the entire point of laws, though, and it was already being used for that.

giving the laws better law stuff will not improve them. the law is malevolent. you cannot fix it by offering to help.

[–] limonfiesta@lemmy.world 1 points 1 day ago

They aren't poisoning the data with disinformation.

They're poisoning it with accurate, but irrelevant information.

For example, if a bot is crawling sites related to computer programming or weather, this tool might lure the crawler into pages about animal facts or human biology.
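
For anyone curious how that might look mechanically, here's a minimal sketch of the idea, not the actual tool's implementation: suspected crawlers get pages of true-but-irrelevant facts, each linking deeper into more generated pages, while ordinary visitors get the real content. The user-agent strings, paths, and facts are all made up for illustration.

```python
# Illustrative sketch only: serve accurate-but-irrelevant "maze" pages to
# suspected AI crawlers, and normal content to everyone else. The paths,
# user-agent list, and facts below are hypothetical, not the real tool's.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

# Accurate but irrelevant filler, per the approach the article describes.
FACTS = [
    "Octopuses have three hearts and blue, copper-based blood.",
    "The human body contains roughly 37 trillion cells.",
    "Honey never spoils if it is kept sealed and dry.",
]

SUSPECT_AGENTS = ("gptbot", "ccbot", "bytespider")  # example strings only


class MazeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "").lower()
        if self.path.startswith("/maze/") or any(a in agent for a in SUSPECT_AGENTS):
            # Page of real-but-irrelevant facts plus links deeper into the maze.
            depth = self.path.count("/")
            links = "".join(
                f'<a href="/maze/{depth}/{i}">more</a> ' for i in range(3)
            )
            body = f"<html><body><p>{random.choice(FACTS)}</p>{links}</body></html>"
        else:
            body = "<html><body><p>Normal site content for real visitors.</p></body></html>"
        data = body.encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), MazeHandler).serve_forever()
```

Nothing served is false, it's just useless to a scraper looking for the site's actual content, which matches the "real facts, just not relevant" claim in the quoted article.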

[–] sugar_in_your_tea@sh.itjust.works 0 points 1 day ago (1 children)

Law enforcement doesn't convict anyone, that's a judge's job. If a LEO falsely arrests you, you can sue them, and it should be pretty open-and-shut if it's due to AI hallucination. Enough of that and LEO will stop it.

[–] Jarix@lemmy.world 1 points 6 hours ago

More likely, they will remove your ability to sue them, if you're talking about the USA and many other countries.