blakestacey

[–] blakestacey@awful.systems 20 points 3 weeks ago

After understanding a lot of things it’s clear that it didn’t. And it fooled me for two weeks.

I have learned my lesson and now I am using it to generate one page at a time.

qu1j0t3 replies:

that's, uh, not really the ideal takeaway from this lesson

[–] blakestacey@awful.systems 21 points 4 weeks ago* (last edited 4 weeks ago) (5 children)

We—yes, even you—are using some version of AI, or some tools that have LLMs or machine learning in them in some way shape or form already

Fucking ghastly equivocation. Not just between "LLMs" and "machine learning", but between opening a website that has a chatbot icon I never click and actually wasting my time asking questions to the slop machine.

[–] blakestacey@awful.systems 13 points 4 weeks ago (1 children)

I like how quoting Grimes lyrics makes the banality of these people thuddingly clear.

[–] blakestacey@awful.systems 26 points 4 weeks ago* (last edited 4 weeks ago) (4 children)

Yud:

ChatGPT has already broken marriages, and hot AI girls are on track to remove a lot of men from the mating pool.

And suddenly I realized that I never want to hear a Rationalist say the words "mating pool".

(I fired up xcancel to see if any of the usual suspects were saying anything eminently sneerable. Yudkowsky is re-xitting Hanania and some random guy who believes in g. Maybe he should see if the Pioneer Fund will bankroll publicity for his new book....)

[–] blakestacey@awful.systems 12 points 4 weeks ago

Previously sneered:

The context for this essay is serious, high-stakes communication: papers, technical blog posts, and tweet threads.

More recently, in the comments:

After reading your comments and @Jiro 's below, and discussing with LLMs on various settings, I think I was too strong in saying....

It's like watching people volunteer for a lobotomy.

[–] blakestacey@awful.systems 10 points 1 month ago

Kaneda just scooting to the side at the 14:05 mark like a Looney Tunes character caught with their pants down is comedy gold. I want to loop it with a MIDI rendition of Joplin's "The Entertainer".

[–] blakestacey@awful.systems 10 points 1 month ago

The top comment begins thusly:

I think you make a reasonably compelling case, but when I think about the practicality of this in my own life it's pretty hard to imagine not spending any time talking to chatbots. ChatGPT, Claude and others are extremely useful.

I didn't think it was possible, but the perfection continues! Still, no notes!

[–] blakestacey@awful.systems 15 points 1 month ago (2 children)

This N is too small: ~N~

[–] blakestacey@awful.systems 22 points 1 month ago (2 children)

Paragraph 2:

METR funded 16 experienced open-source developers with “moderate AI experience” to do what they do.

[–] blakestacey@awful.systems 18 points 1 month ago (3 children)

https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/

When developers are allowed to use AI tools, they take 19% longer to complete issues—a significant slowdown that goes against developer beliefs and expert forecasts. This gap between perception and reality is striking: developers expected AI to speed them up by 24%, and even after experiencing the slowdown, they still believed AI had sped them up by 20%.

womp, hold on let me finish, womp

[–] blakestacey@awful.systems 16 points 1 month ago (2 children)

https://www.lesswrong.com/posts/JspxcjkvBmye4cW4v/asking-for-a-friend-ai-research-protocols

Multiple people are quietly wondering if their AI systems might be conscious. What's the standard advice to give them?

Touch grass. Touch all the grass.

[–] blakestacey@awful.systems 14 points 1 month ago

Dorkus malorkus alert:

When my grandmother quit being a nurse to become a stay at home mother, it was seen like a great thing. She gained status over her sisters, who stayed single and in their careers.

Fitting into your societal pigeonhole is not the same as gaining status, ya doofus.

 

If you've been around, you may know Elsevier for surveillance publishing. Old hands will recall their running arms fairs. To this storied history we can add "automated bullshit pipeline".

In Surfaces and Interfaces, online 17 February 2024:

Certainly, here is a possible introduction for your topic:Lithium-metal batteries are promising candidates for high-energy-density rechargeable batteries due to their low electrode potentials and high theoretical capacities [1], [2].

In Radiology Case Reports, online 8 March 2024:

In summary, the management of bilateral iatrogenic I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model. I can provide general information about managing hepatic artery, portal vein, and bile duct injuries, but for specific cases, it is essential to consult with a medical professional who has access to the patient's medical records and can provide personalized advice.

Edit to add this erratum:

The authors apologize for including the AI language model statement on page 4 of the above-named article, below Table 3, and for failing to include the Declaration of Generative AI and AI-assisted Technologies in Scientific Writing, as required by the journal’s policies and recommended by reviewers during revision.

Edit again to add this article in Urban Climate:

The World Health Organization (WHO) defines HW as “Sustained periods of uncharacteristically high temperatures that increase morbidity and mortality”. Certainly, here are a few examples of evidence supporting the WHO definition of heatwaves as periods of uncharacteristically high temperatures that increase morbidity and mortality

And this one in Energy:

Certainly, here are some potential areas for future research that could be explored.

Can't forget this one in TrAC Trends in Analytical Chemistry:

Certainly, here are some key research gaps in the current field of MNPs research

Or this one in Trends in Food Science & Technology:

Certainly, here are some areas for future research regarding eggplant peel anthocyanins,

And we mustn't ignore this item in Waste Management Bulletin:

When all the information is combined, this report will assist us in making more informed decisions for a more sustainable and brighter future. Certainly, here are some matters of potential concern to consider.

The authors of this article in Journal of Energy Storage seem to have used GlurgeBot as a replacement for basic formatting:

Certainly, here's the text without bullet points:

 

In which a man disappearing up his own asshole somehow fails to be interesting.

 

So, there I was, trying to remember the title of a book I had read bits of, and I thought to check a Wikipedia article that might have referred to it. And there, in "External links", was ... "Wikiversity hosts a discussion with the Bard chatbot on Quantum mechanics".

How much carbon did you have to burn, and how many Kenyan workers did you have to call the N-word, in order to get a garbled and confused "history" of science? (There's a lot wrong and even self-contradictory with what the stochastic parrot says, which isn't worth unweaving in detail; perhaps the worst part is that its statement of the uncertainty principle is a blurry JPEG of the average over all verbal statements of the uncertainty principle, most of which are wrong.) So, a mediocre but mostly unremarkable page gets supplemented with a "resource" that is actively harmful. Hooray.

Meanwhile, over in this discussion thread, we've been taking a look at the Wikipedia article Super-recursive algorithm. It's rambling and unclear, throwing together all sorts of things that somebody somewhere called an exotic kind of computation, while seemingly not grasping the basics of the ordinary theory the new thing is supposedly moving beyond.

So: What's the worst/weirdest Wikipedia article in your field of specialization?

 

Yudkowsky writes,

How can Effective Altruism solve the meta-level problem where almost all of the talented executives and ops people were in 1950 and now they're dead and there's fewer and fewer surviving descendants of their heritage every year and no blog post I can figure out how to write could even come close to making more people being good executives?

Because what EA was really missing is collusion to hide the health effects of tobacco smoking.
