this post was submitted on 07 Sep 2025

TechTakes

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

top 50 comments
[–] antifuchs@awful.systems 20 points 2 weeks ago (3 children)

Whichever one of you did https://alignmentalignment.ai/caaac/jobs, well done, and many lols.

CAAAC is an open, dynamic, inclusive environment, where all perspectives are welcomed as long as you believe AGI will annihilate all humans in the next six months.

Alright, I can pretend to believe that, go on…

We offer competitive salaries and generous benefits, including no performance management because we have no way to assess whether the work you do is at all useful.

Incredible. I hope I get the job!

[–] wizardbeard@lemmy.dbzer0.com 17 points 3 weeks ago (4 children)

Some poor souls who arguably have their hearts in the right place definitely don't have their heads screwed on right, and are staging hunger strikes outside Google's AI offices and Anthropic's offices.

https://programming.dev/post/37056928 contains links to a few posts on X by the folks doing it.

Imagine being so worried about AGI that you thought it was worth starving yourself over.

Now imagine feeling that strongly about it and not stopping to ask why none of the ideologues who originally sounded the alarm bells about it have tried anything even remotely as drastic.

On top of all that, imagine being this worried about what Anthropic and Google are doing in AI research, hopefully being aware of Google's military contracts, and somehow thinking they give a singular shit if you kill yourself over this.

And... where are the people outside fucking OpenAI? Bets on this being some corporate shadowplay shit?

[–] YourNetworkIsHaunted@awful.systems 10 points 3 weeks ago (4 children)

I mean, I try not to go full conspiratorial everything-is-a-false-flag, but the fact that the biggest AI company, the one that has been explicitly trying to create AGI, isn't getting the business here is incredibly suspect. On the other hand, it feels like anything that publicly leans into the fears of an evil computer god would be a self-own when they're in the middle of trying to completely ditch the "for the good of humanity, not just immediate profits" part of their organization.

[–] Architeuthis@awful.systems 15 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

Apparently the hacker who publicized a copy of the no-fly list was leaked an article containing Yarvin's home address, which she promptly posted on Bluesky. Won't link because I don't think we've had the doxxing discussion, but it's easily findable now.

I'm mostly posting this because the article featured this photo:

[–] froztbyte@awful.systems 10 points 3 weeks ago (1 children)

I was curious so I dug up the post and then checked property prices for the neighbourhood

$2.6~4.8m

being thiel's idea guy seems to pay pretty well

[–] Soyweiser@awful.systems 15 points 2 weeks ago* (last edited 2 weeks ago) (20 children)

Word of warning: if you are squeamish, Charlie Kirk got shot, and closeup videos of him getting shot are all over social media, and people have not put warnings up. Managed to avoid it myself, but be careful with autoplaying gifs, and see this link https://bsky.app/profile/nycsouthpaw.bsky.social/post/3lyiw3vi3yc2c for instructions on how to turn that off.

E: and it was a groyper, stop your bets ladies and gentlebots. The winner of the bet is acausalrobotgod! Again! What a predictive streak. (E2: I'm just making a joke here about betting markets and how ACRG would win them all because of acausal magic; don't bet on people's lives and don't use betting markets.)

E: while I'm editing this post with corrections: turns out that Kirk was a lot worse than I thought, and he wasn't just a garden-variety propagandist like a Shapiro or Crowder (yeah, I know they both also try to branch out into more media). He was doing a lot of shit work on the ground, and was one of those guys who was really influential with younger people. I've read that this might also explain the extreme rightwing reaction, as he was seen by some as the next big political player. To me, a very online leftwinger from Europe, he was just the worst joke from a line of other propagandists, but I was wrong on that.

[–] swlabr@awful.systems 15 points 2 weeks ago (2 children)

Thoughts and prayers, definitely no curses.

[–] Soyweiser@awful.systems 14 points 2 weeks ago (1 children)

I have spent the last few hours looking through all my social media posts, making sure I delete any bad things I have said about witches.

[–] swlabr@awful.systems 10 points 2 weeks ago

The covens appreciate the overture.

[–] BigMuffN69@awful.systems 9 points 2 weeks ago

Hecate left no crumbs

[–] bitofhope@awful.systems 12 points 2 weeks ago (15 children)

Irrationally annoyed at yanks incorrecting each other about how this kind of shot could only be pulled off by a trained expert sniper. The Behind the Bastards guy agrees, but the replies are still stuffed with examples.

I know from experience that even mediocre conscripts shooting a gun for the first time in their life usually manage to land hits in a one-foot-diameter circle from 150 metres with iron sights on an intermediate-cartridge rifle. It doesn't take an elite marksman to hit a sitting man from 200 yards away, especially with a scope. Even if nervous and high on adrenaline, an average hunter, target shooter or (ex-)military type would be more likely than not to hit a target of that size at that distance, assuming otherwise decent conditions.

Hell, the factory sights on an M16 are supposed to be zeroed at 250 metres, and the effective range of most assault rifles and their semi-auto civilian variants is around 300 metres. To say you need to be a trained sniper to make this shot is like saying you need to be a professional racing driver to do 80 mph on a highway.
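The angular arithmetic behind this is easy to check. A quick sketch using the small-angle approximation (the one-foot target and 150 m range are the figures from the comment; the 2-4 MOA grouping for an ordinary service rifle is a commonly cited ballpark, not from the thread):

```python
import math

def target_moa(target_m: float, range_m: float) -> float:
    """Angular size of a target in minutes of angle (small-angle approximation)."""
    return math.degrees(target_m / range_m) * 60

# A one-foot (0.3048 m) circle at 150 m subtends roughly 7 MOA --
# several times the 2-4 MOA grouping of an ordinary rack-grade rifle.
print(round(target_moa(0.3048, 150), 1))  # prints 7.0
```

In other words, the target is far wider than the rifle's own dispersion, which is the whole point of the comment above.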

If there's one thing you'd assume seppos know well, it's shooting firearms, but some people still can't help but spout dumb bullshit.

[–] EponymousBosh@awful.systems 15 points 3 weeks ago (4 children)
[–] BlueMonday1984@awful.systems 16 points 3 weeks ago (1 children)

I genuinely thought therapists were gonna avoid the psychosis-inducing suicide machine after seeing it cause psychosis and suicide. Clearly, I was being too optimistic.

[–] zogwarg@awful.systems 10 points 3 weeks ago
The future is now, and it is awful. 
Would any still wonder why, I grow so ever mournful.
[–] Seminar2250@awful.systems 14 points 3 weeks ago (3 children)

university where the professor physically threatened me and plagiarized my work called to ask if i was willing to teach a notoriously hard computer science class (that i have taught before to stellar evals as a phd student^[evals are bullshit for measuring how well students actually learn anything, but are great for measuring the stupid shit business idiots love, like whether students will keep paying tuition. also they can be used to explain the pitfalls of using likert scales carelessly, as business idiots do.]). but they had to tell me that i was their last choice because they couldn't find a full professor to teach it (since i didn't finish my phd there because of said abusive professor). on top of that, they offered me a measly $6,000 usd for the entire semester with no benefits, and i would have to pay $500 for parking.

should i just be done with academia? enrollment deadlines for the spring are approaching and i'm wondering if i should just find a "regular job", rather than finishing a PhD elsewhere, especially given the direction higher ed is going in the us.

[–] V0ldek@awful.systems 13 points 3 weeks ago (1 children)

Every time I learn one single thing about how academia works in the USA I want to commit unspeakable acts of violence

[–] blakestacey@awful.systems 14 points 2 weeks ago (10 children)

The Wall Street Journal came out with a story on "conspiracy physics", noting Eric Weinstein and Sabine Hossenfelder as examples. Sadly, one of their quoted voices of sanity is Scott Aaronson, baking-soda volcano of genocide apologism.

[–] BlueMonday1984@awful.systems 10 points 2 weeks ago (1 children)

Somehow, ~~Palpatine returned~~ Scott came off as a voice of reason

[–] blakestacey@awful.systems 13 points 2 weeks ago (1 children)

Behold the power of this fully selective quotation.

[–] mawhrin@awful.systems 13 points 3 weeks ago (6 children)

simon willison, the self-styled reasonable ai researcher, finds it hilarious and a good use of money to throw $14,000 at claude to create a useless programming language that doesn't work.

good man simon willison!

[–] gerikson@awful.systems 15 points 3 weeks ago (1 children)

I mean, it's still just funny money, seeing as the creator works for some company that resells Claude tokens, but very few people are stepping back to note the drastically reduced expectations of LLMs. A year ago, it would have been plausible to claim that a future LLM could design a language from scratch. Now we have a rancid mess of slop, it's an "art project", and the fact that it's even superficially internally coherent is treated as a great success.

Willison should have just let this go, because it's a ludicrous example of GenAI, but he just can't help defending this crap.

[–] blakestacey@awful.systems 15 points 3 weeks ago

Good sneer from user andrewrk:

People are always saying things like “surprisingly good” to describe LLM output, but that’s like when a 5-year-old stops scribbling on the walls and draws a “surprisingly good” picture of the house, family, and dog standing outside on a sunny day on some construction paper. That’s great, kiddo, let’s put your programming language right here on the fridge.

[–] istewart@awful.systems 14 points 3 weeks ago (1 children)

Top-tier from Willison himself:

The learning isn’t in studying the finished product, it’s in watching how it gets there.

Mate, if that's true, my years of Gentoo experience watching compiler commands fly past in the terminal mean I'm a senior operating system architect.

[–] froztbyte@awful.systems 9 points 3 weeks ago (3 children)

which naturally leads us to: having to fix a portage overlay ~= “compiler engineer”

wonder what simonw’s total spend (direct and indirect) on this shit has been to date. maybe sunk cost fallacy is an unstated/un(der?)accounted part of his True Believer thing?

[–] nightsky@awful.systems 11 points 3 weeks ago (6 children)

Sigh. Love how he claims it's worth it for "learning"...

We already have a thing for learning; it's called "books", and if you want to learn compiler basics, $14,000 could buy you hundreds of copies of the dragon book.

[–] CinnasVerses@awful.systems 12 points 3 weeks ago* (last edited 3 weeks ago) (18 children)

When it started in ’06, this blog was near the center of the origin of a “rationalist” movement, wherein idealistic youths tried to adapt rational styles and methods. While these habits did often impress, and bond this community together, they alas came to trust that their leaders had in fact achieved unusual rationality, and on that basis embraced many contrarian but not especially rational conclusions of those leaders. - Robin Hanson, 2025

I hear that even though Yud started blogging on his site, and even though George Mason University-type economics is trendy with EA and LessWrong, Hanson never identified himself with EA or LessWrong as movements. So this is like Gabriele D'Annunzio insisting he is a nationalist, not a fascist; not Nassim Nicholas Taleb denouncing phrenology.

[–] BlueMonday1984@awful.systems 11 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

GoToSocial recently put up a code of conduct that openly barred AI-"assisted" changes and fascist/capitalist involvement, prompting some concern trolling on the red site.

Got a promptfondler trying to paint basic human decency as ridiculous, and a Concerned Individual^tm^ who's pissed at GoToSocial refusing to become a Nazi bar.

[–] mlen@awful.systems 10 points 3 weeks ago

Signal is finally close to releasing a cross platform backup system: https://signal.org/blog/introducing-secure-backups/

[–] BlueMonday1984@awful.systems 10 points 3 weeks ago (2 children)

New Loser Lanyard (ironically called the Friend) just dropped: a "chatbot-enabled" necklace which invades everyone's privacy and provides Internet-reply "commentary" in response. As if to underline its sheer shittiness, WIRED has reported that even other promptfondlers are repulsed by it, in a scathing review that accidentally sneers at its techbro shithead inventor:

If you're looking for some quick schadenfreude, here's the quotes on Bluesky.

[–] BlueMonday1984@awful.systems 10 points 3 weeks ago (1 children)

Found two separate AI-related links for today.

First, AI slop corpo Apiiro put out a study stating the obvious (that AI is a cybersecurity nightmare), and tried selling its slop agents as the solution. Apiiro was using their own slop-bots to do the study, too, so I'm taking all this with a major grain of salt.

Second, I came across an AI-themed Darwin Awards spinoff cataloguing various comical fuck-ups caused through the slop-bots.

[–] alan@mindly.social 10 points 3 weeks ago

@BlueMonday1984 Man, I hope that grain of salt is never on a collision course with the Earth's orbit. 😉

[–] blakestacey@awful.systems 9 points 2 weeks ago

The Grauniad has a new piece today about the underpaid human labor on which the "AI" industry depends:

https://www.theguardian.com/technology/2025/sep/11/google-gemini-ai-training-humans

Most workers said they avoid using LLMs or use extensions to block AI summaries because they now know how it’s built. Many also discourage their family and friends from using it, for the same reason.

[–] Architeuthis@awful.systems 9 points 2 weeks ago (2 children)

Some quality wordsmithing found in the wild:

transcript: @MosesSternstein (quote-tweeted): AI-Capex is the everything cycle, now.

Just under 50% of GDP growth is attributable to AI Capex

@bigblackjacobin: Almost certainly the greatest misallocation of capital you or I will ever see. There's no justification for this however you cut it but the beatings will continue until a stillborn god is born.

load more comments (2 replies)