this post was submitted on 08 Jun 2025

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned soo many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] blakestacey@awful.systems 23 points 1 day ago (6 children)

Bringing over aio's comment from the end of last week's stubsack:

This week the Wikimedia Foundation tried to gather support for adding LLM summaries to the top of every Wikipedia article. The proposal was overwhelmingly rejected by the community, but the WMF hasn't gotten the message, saying that the project has been "paused". It sounds like they plan to push it through regardless.

Way down in the linked wall o' text, there's a comment by "Chaotic Enby" that struck me:

Another summary I just checked, which caused me a lot more worries than simple inaccuracies: Cambrian. The last sentence of that summary is "The Cambrian ended with creatures like myriapods and arachnids starting to live on land, along with early plants.", which already sounds weird: we don't have any fossils of land arthropods in the Cambrian, and, while there has been a hypothesis that myriapods might have emerged in the Late Cambrian, I haven't heard anything similar being proposed about arachnids. But that's not the worrying part.

No, the issue is that nowhere in the entire Cambrian article are myriapods or arachnids mentioned at all. Only one sentence in the entire article relates to that hypothesis: "Molecular clock estimates have also led some authors to suggest that arthropods colonised land during the Cambrian, but again the earliest physical evidence of this is during the following Ordovician". This might indicate that the model is relying on its own internal knowledge, and not just on the contents of the article itself, to generate an "AI overview" of the topic instead.

Further down the thread, there's a comment by "Gnomingstuff" that looks worth saving:

There was an 8-person community feedback study done before this (a UI/UX test using the original Dopamine summary), and the results are depressing as hell. The reason this was being pushed to prod sure seems to be the cheerleading coming from 7 out of those 8 people: "Humans can lie but AI is unbiased," "I trust AI 100%," etc.

Perhaps the most depressing is this quote -- "This also suggests that people who are technically and linguistically hyper-literate like most of our editors, internet pundits, and WMF staff will like the feature the least. The feature isn't really "for" them" -- since it seems very much like an invitation to ignore all of us, and to dismiss any negative media coverage that may ensue (the demeaning "internet pundits").

Sorry for all the bricks of text here, this is just so astonishingly awful on all levels and everything that I find seems to be worse than the last.

Another comment by "CMD" evaluates the summary of the dopamine article mentioned there:

The first sentence is in the article. However, the second sentence mentions "emotion", a word that while in a couple of reference titles isn't in the article at all. The third sentence says "creating a sense of pleasure", but the article says "In popular culture and media, dopamine is often portrayed as the main chemical of pleasure, but the current opinion in pharmacology is that dopamine instead confers motivational salience", a contradiction. "This neurotransmitter also helps us focus and stay motivated by influencing our behavior and thoughts". Where is this even from? Focus isn't mentioned in the article at all, nor is influencing thoughts. As for the final sentence, depression is mentioned a single time in the article in what is almost an extended aside, and any summary would surely have picked some of the examples of disorders prominent enough to be actually in the lead.

So that's one of five sentences supported by the article. Perhaps the AI is hallucinating, or perhaps it's drawing from other sources like any widespread LLM. What it definitely doesn't seem to be doing is taking existing article text and simplifying it.

[–] YourNetworkIsHaunted@awful.systems 15 points 23 hours ago (2 children)

The thing that galls me here even more than other slop is that there isn't even some kind of horrible capitalist logic underneath it. Like, what value is this supposed to create? Replacing the leads written by actual editors, who work for free? You already have free labor doing a better job than this, why would you compromise the product for the opportunity to spend money on compute for these LLM not-even-actually-summaries? Pure brainrot.

[–] nightsky@awful.systems 6 points 18 hours ago (1 children)

Maybe someone has put into their heads that they have to "go with the times", because AI is "inevitable" and "here to stay". And if they don't adapt, AI would obsolete them. That Wikipedia would become irrelevant because their leadership was hostile to "progress" and rejected "emerging technology", just like Wikipedia obsoleted most of the old print encyclopedia vendors. And one day they would be blamed for it, because they were stuck in the past at a crucial moment. But if they adopt AI now, they might imagine, one day they will be praised as the visionaries who carried Wikipedia over to the next golden age of technology.

Of course all of that is complete bullshit. But instilling those fears ("use it now, or you will be left behind!") is a big part of the AI marketing messaging which is blasted everywhere non-stop. So I wouldn't be surprised if those are the brainworms in their heads.

That's probably true, but it also speaks to Ed Zitron's latest piece about the rise of the Business Idiot. You can explain why Wikipedia disrupted previous encyclopedia providers in very specific terms: crowdsourcing production to volunteer editors cuts costs massively and allows the product to be delivered free (which also increases the pool of possible editors and improves quality), and the strict* adherence to community standards and sourcing guidelines prevents the worst of the loss of truth and credibility that you might expect.

But there is no such story that I can find for how Wikipedia gets disrupted by Gen AI. At worst it becomes a tool in the editor's belt, but the fundamental economics and structure just aren't impacted. But if you're a business idiot then you can't actually explain it either way, and so of course it seems plausible.

[–] o7___o7@awful.systems 12 points 22 hours ago* (last edited 13 hours ago)

Some AI company waving a big donation outside of the spotlight? Dorks trying to burnish their resumes?

Ya gotta think it's going to lead to a rebellion.
