this post was submitted on 11 Aug 2025
20 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.


[–] bitofhope@awful.systems 4 points 1 hour ago (1 children)

The beautiful process of dialectics has taken place on the butterfly site, and we have reached a breakthrough in moral philosophy. Only a few more questions remain before we can finally declare ethics a solved problem. The most important among them: when an omnipotent and omnibenevolent basilisk simulates Roko Mijic getting kicked in the nuts eternally by a girl with blue hair and piercings, would the girl be barefoot or wearing heavy, steel-toed boots? Which kind of footwear, or lack thereof, would optimize the utility generated?

[–] antifuchs@awful.systems 3 points 1 hour ago (1 children)

The last conundrum of our time: of course steel-capped work boots would hurt more, but barefoot would allow faster (and therefore more) kicks.

[–] Soyweiser@awful.systems 3 points 38 minutes ago (2 children)

You have not taken the lessons of the philosopher Piccolo to heart. You should wear even heavier boots in your day-to-day. Why do you think goths wear those huge heavy boots? For looks?

[–] antifuchs@awful.systems 2 points 35 minutes ago

And thus I was enlightened

[–] o7___o7@awful.systems 1 points 26 minutes ago

Her kick's so fast they call it the "quad laser"

[–] mirrorwitch@awful.systems 8 points 3 hours ago* (last edited 1 hour ago) (2 children)

I've often called slop "signal-shaped noise". I think the damage already done by slop pissed all over the reservoirs of knowledge, art and culture is irreversible and long-lasting. This is the only thing generative "AI" is good at: making spam that's hard to detect.

It occurs to me that one way to frame this technology is as a precise inversion of Bayesian spam filters for email; no more and no less. I remember what a small revolution it was, in the arms race against spammers, when statistical methods came up; everywhere we took the load off a straining SpamAssassin by moving to rspamd (in the years before gmail devoured us all). I would argue "A Plan for Spam" launched Paul Graham's notoriety much more than the Lisp web stores he was so proud of. Filtering emails by keywords was no longer enough, but now you could train your computer to gradually recognise emails that looked off, for whatever definition of "off" worked for your specific inbox.
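The per-inbox statistical filtering described above can be sketched in a few lines. This is a toy naive-Bayes token scorer of my own (an illustration of the general technique, not Graham's actual "A Plan for Spam" algorithm, which adds several fudge factors and only scores the most "interesting" tokens):

```python
from collections import Counter
import math

def train(spam_docs, ham_docs):
    """Count token frequencies in spam and ham training corpora."""
    spam_counts = Counter(w for doc in spam_docs for w in doc.lower().split())
    ham_counts = Counter(w for doc in ham_docs for w in doc.lower().split())
    return spam_counts, ham_counts

def spam_probability(doc, spam_counts, ham_counts):
    """Naive-Bayes-style score: sum per-token log-odds of spam vs. ham.

    Laplace (+1) smoothing keeps unseen tokens from zeroing everything out.
    """
    spam_total = sum(spam_counts.values())
    ham_total = sum(ham_counts.values())
    log_odds = 0.0
    for w in doc.lower().split():
        p_spam = (spam_counts[w] + 1) / (spam_total + 2)
        p_ham = (ham_counts[w] + 1) / (ham_total + 2)
        log_odds += math.log(p_spam / p_ham)
    # Map log-odds back to a probability with the logistic function
    return 1 / (1 + math.exp(-log_odds))

# Tiny made-up corpora, just to exercise the scorer
spam = ["buy viagra now", "cheap viagra offer now"]
ham = ["meeting notes attached", "lunch tomorrow maybe"]
sc, hc = train(spam, ham)
print(spam_probability("viagra offer", sc, hc))      # > 0.5, looks spammy
print(spam_probability("meeting tomorrow", sc, hc))  # < 0.5, looks hammy
```

The "inversion" the comment describes is then easy to picture: the same token statistics that let this function reject spam can be used as a fitness signal to steer generated text until it scores as ham.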

Now we have the richest people building the most expensive, energy-intensive superclusters to use the same statistical methods the other way around, to generate spam that looks like not-spam, and is therefore immune to all filtering strategies we had developed. That same blob-like malleability of spam filters makes the new spam generators able to fit their output to whatever niche they want to pollute; the noise can be shaped like any signal.

I wonder what PG is saying about gen-"AI" these days? let's check:

“AI is the exact opposite of a solution in search of a problem,” he wrote on X. “It’s the solution to far more problems than its developers even knew existed … AI is turning out to be the missing piece in a large number of important, almost-completed puzzles.”
He shared no examples, but […]

Who would have thought that A Plan for Spam was, all along, a plan for spam.

[–] Soyweiser@awful.systems 4 points 1 hour ago

It occurs to me that one way to frame this technology is as a precise inversion of Bayesian spam filters for email.

This is a really good observation, and while I had lowkey noticed it (one of those feeling things), I never verbalized it in any way. Good point imho. Also in how it bypasses and wrecks the old anti-spam protections. It represents a fundamental flipping of sides by the tech industry: where before it was anti-spam, it is now pro-spam. A big betrayal of consumers/users/humanity.

[–] swlabr@awful.systems 4 points 1 hour ago

Signal-shaped noise reminds me of a Wiener filter.

Aside: when I took my signals processing course, the professor kept drawing diagrams that were eerily phallic. Those were the most memorable parts of the course

[–] BlueMonday1984@awful.systems 5 points 4 hours ago (3 children)

Ed Zitron's given his thoughts on GPT-5's dumpster fire launch:

Personally, I can see his point - the Duke Nukem Forever levels of hype around GPT-5 set the promptfondlers up for Duke Nukem Forever levels of disappointment, and the "deaths" of their AI waifus/therapists have killed whatever dopamine delivery mechanisms they'd set up for themselves.

[–] hmwilker@social.tchncs.de 1 points 2 minutes ago

@BlueMonday1984 Oh, wonderful - thank you for this expression. I hope I’ll remember “promptfondlers” for relevant usage opportunities.

[–] fullsquare@awful.systems 5 points 4 hours ago (1 children)

i think it's possible that's a cost cutting measure on part of openai

[–] Nikkileah@mendeddrum.org 4 points 4 hours ago

@BlueMonday1984 @dgerard can we have a Duke Nukem personality for GPT5. Start a nostalgic wave of prompts?

[–] BlueMonday1984@awful.systems 3 points 4 hours ago

Anyways, personal sidenote/prediction: I suspect the Internet Archive’s gonna have a much harder time archiving blogs/websites going forward.

Me, two months ago

Looks like I was on the money - Reddit's begun limiting what the Internet Archive can access, claiming AI corps have been scraping archived posts to get around Reddit's pre-existing blocks on scrapers. Part of me suspects more sites are gonna follow suit pretty soon - Reddit's given them a pretty solid excuse to use.

[–] gerikson@awful.systems 4 points 5 hours ago (3 children)

Good news everyone! Someone with a Substack has started a series countering the TESCREAL narrative.

He (c'mon, it's a guy) calls it "R9PRESENTATIONALism"

It stands for

  • Relational
  • 9P
    • Postcritical
    • Personalist
    • Praxeological
    • Psychoanalytic
    • Participatory
    • Performative
    • Particularist
    • Poeticist
    • Positive/Affirmationist
  • Reparative
  • Existentialist
  • Standpoint-theorist
  • Embodied
  • Narrativistic
  • Therapeutic
  • Intersectional
  • Orate
  • Neosubstantivist
  • Activist
  • Localist

I see no reason why this catchy summary won't take off!

https://www.lesswrong.com/posts/RCDEFhCLcifogLwEm/exploring-the-anti-tescreal-ideology-and-the-roots-of-anti

[–] gerikson@awful.systems 2 points 2 hours ago (1 children)

Also "orate" is a fucking verb

[–] swlabr@awful.systems 2 points 2 hours ago

They probably conflated it with ornate lol

[–] froztbyte@awful.systems 3 points 3 hours ago

they should just touch GRASS

Guided Rationalist Acceptance of Socionormality Studies

[–] swlabr@awful.systems 6 points 4 hours ago (1 children)

I have a better counter narrative:

  • Consequentialism
  • Universalism
  • Meta-analytical
  • Singularitarianism
  • Heuristicationalism
  • Autodidacticalisticalistalism
  • Retro-regresso-revisionism
  • Transhumanisticiousnessness
  • Exo-galactic-civilisationalismnisticalism
  • Rationalist

Can’t think of a good acronym though, but it’s a start

[–] bitofhope@awful.systems 5 points 4 hours ago (1 children)
  • Accelerationism
  • Consequentialism
  • Conservatism
  • Orthodoxy
  • Rationalism
  • Disestablishmentarianism
  • Intellectualism
  • Natalism
  • Galileianism
  • Transhumanism
  • Outside the box thinking
  • Anti-empiricism
  • Laissez-faire
  • LaVeyan Satanism
  • Kantian deontology
  • Nationalism
  • Orgasm denial
  • Western chauvinism
  • Neo-Aristotelianism
  • Longtermism
  • Altruism
  • White supremacy
  • Sinophobia
  • Orientalism…
[–] swlabr@awful.systems 3 points 2 hours ago

Inside Yud there are two wolves, one is sinophobic, the other is orientalist

[–] o7___o7@awful.systems 5 points 12 hours ago
[–] bitofhope@awful.systems 4 points 13 hours ago

If I ever get the urge to start a website for creatives to sell their media, please slap me in the face and remind me it will absolutely not be worth it.

[–] scruiser@awful.systems 9 points 17 hours ago (2 children)

Y'all ready for another round of LessWrong edit wars on Wikipedia? This time with a wider list of topics!

https://www.lesswrong.com/posts/g6rpo6hshodRaaZF3/mech-interp-wiki-page-and-why-you-should-edit-wikipedia-1

On the very slightly merciful upside... the lesswronger recommends "If you want to work on a new page, discuss with the community first by going to the talk page of a related topic or meta-page." and "In general, you shouldn't post before you understand Wikipedia rules, norms, and guidelines." so they are ahead of the previous calls made on Lesswrong for Wikipedia edit-wars.

On the downside, they've got a laundry list of lesswrong jargon they want Wikipedia articles for. Even one of the lesswrongers responding to them points out these terms are a bit on the under-defined side:

Speaking as a self-identified agent foundations researcher, I don't think agent foundations can be said to exist yet. It's more of an aspiration than a field. If someone wrote a wikipedia page for it, it would just be that person's opinion on what agent foundations should look like.

[–] blakestacey@awful.systems 7 points 14 hours ago

From the comments:

On the contrary, I think that almost all people and institutions that don't currently have a Wikipedia article should not want one.

Huh. How oddly sensible.

An extreme (and close-to-home) example is documented in TracingWoodgrains’s exposé of David Gerard’s Wikipedia smear campaign against LessWrong and related topics.

Ah, never mind.

[–] zogwarg@awful.systems 9 points 15 hours ago

PS: We also think that there existing a wiki page for the field that one is working in increases one's credibility to outsiders - i.e. if you tell someone that you're working in AI Control, and the only pages linked are from LessWrong and Arxiv, this might not be a good look.

Aha so OP is just hoping no one will bother reading the sources listed on the article...

[–] BlueMonday1984@awful.systems 6 points 1 day ago (1 children)

Iris van Rooij found AI slop in the wild (determining it as such by how it mangled a word's definition) and went on to find multiple other cases. She's written a blog post about this, titled "AI slop and the destruction of knowledge".

[–] mirrorwitch@awful.systems 4 points 3 hours ago* (last edited 1 hour ago)

choice quote from Elsevier's response:

Q. Have authors consented to these hyperlinks in their scientific articles?
Yes, it is included on the signed agreement between the author and Elsevier.

Q. If I were to publish my work with Elsevier, do I risk that hyperlinks to AI summaries will be added to my papers without my consent?
Yes, because you will need to sign an agreement with Elsevier.

consent, everyone!

[–] o7___o7@awful.systems 4 points 1 day ago* (last edited 1 day ago)

"usecase" is a cursed term. It's an inverted fnord that lets the reader know that whatever follows can be safely ignored.

[–] froztbyte@awful.systems 6 points 1 day ago (3 children)

names for genai people I know of so far: promptfans, promptfondlers, sloppers, autoplagues, and botlickers

any others out there?

[–] fullsquare@awful.systems 6 points 1 day ago
[–] antifuchs@awful.systems 5 points 1 day ago

Ice cream head of artificial intelligence

[–] Seminar2250@awful.systems 4 points 1 day ago* (last edited 1 day ago) (1 children)

clanker

edit: this may be used to refer to the chatbots themselves, rather than those who fondle chatbots

[–] antifuchs@awful.systems 5 points 18 hours ago

clanker wanker

[–] mountainriver@awful.systems 5 points 1 day ago (5 children)
[–] antifuchs@awful.systems 7 points 1 day ago

lol, lmao: as if any cloud service had any intention at all of actually deleting data instead of tombstoning it for arbitrary lengths of time. (And that’s the least stupid factor in this whole scheme; is this satire? Nobody seems to be able to tell me)

[–] Soyweiser@awful.systems 3 points 22 hours ago* (last edited 22 hours ago)

It gets worse, as the advisory doesn't even mention deleting emails/pictures from the cloud, so the people most likely to listen to this kind of advice are also the people least likely to understand why it's a bad idea, and they will delete their local copies instead. (And that is ignoring that opening your email/gallery to delete stuff costs more than leaving it in storage where it isn't accessed.)

https://www.gov.uk/government/news/national-drought-group-meets-to-address-nationally-significant-water-shortfall

"HOW TO SAVE WATER AT HOME

  • Install a rain butt [hehehe] to collect rainwater to use in the garden.
    ... [other advice removed]
  • Delete old emails and pictures as data centres require vast amounts of water to cool their systems."

Every email you don't delete is another dead fish, or another pasture unwatered. That promotional offer sent to your inbox that you ignored but did not dispose of means creeks will run dry. That evite for a party thrown by an acquaintance you don't particularly like that you did not drop into the trash means a marathon runner will go thirsty as the nectar of life so required is absent, consumed instead by the result of your inbox neglect.

[–] o7___o7@awful.systems 3 points 1 day ago* (last edited 1 day ago)

Looks like the bologna engine generated some baloney.
