blakestacey

joined 2 years ago
[–] blakestacey@awful.systems 10 points 2 weeks ago (2 children)

I'm imagining the same statement from a different person, on a platform that is not Xitter, about a sex partner who is not Aella.

(thinks)

Pierre Menard, author of the Kink-ote

[–] blakestacey@awful.systems 25 points 2 weeks ago (6 children)

Replacing programmers with AI coding isn’t working out so well. I’m hearing stories of consultant programmers being called in to quietly rewrite vibe code disasters that were the CEO’s personal pet project, because the code cannot be fixed in place.

"AI" removes the people who stood between the CEO and the code. It's the perfect anti-productivity tool.

[–] blakestacey@awful.systems 12 points 2 weeks ago

Scientists and philosophers have spilled a tanker truck of ink about the question of how to demarcate science from non-science or define pseudoscience rigorously. But we can bypass all that, because the basic issue is in fact very simple. One of the most fundamental parts of living a scientific life is admitting that you don't know what you don't know. Without that, it's well-nigh impossible to do the work. Meanwhile, the generative AI industry is built on doing exactly the opposite. By its very nature, it generates slop that sounds confident. It is, intrinsically and fundamentally, anti-science.

Now, on top of that, while being anti-science the AI industry also mimics the form of science. Look at all the shiny PDFs! They've got numbers in them and everything. Tables and plots and benchmarks! I think that any anti-science activity that steals the outward habits of science for its own purposes will qualify as pseudoscience, by any sensible definition of pseudoscience. In other words, wherever we draw the line or paint the gray area, modern "AI" will be on the bad side of it.

[–] blakestacey@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I am not sure that having "an illusory object of study" is a standard that helps define pseudoscience in this context. Consider UFOlogy, for example. It arguably "studies" things that do exist — weather balloons, the planet Venus, etc. Pseudoarchaeology "studies" actual inscriptions and actual big piles of rocks. Wheat gluten and seed oils do have physical reality. It's the explanations put forth which are unscientific, while attempting to appeal to the status of science. The "research" now sold under the Artificial Intelligence banner has become like Intelligent Design "research": Computers exist, just like bacterial flagella exist, but the claims about them are untethered.

[–] blakestacey@awful.systems 16 points 2 weeks ago

Having now read the thing myself, I agree that the BBC is serving up criti-hype and false balance.

[–] blakestacey@awful.systems 25 points 2 weeks ago (8 children)

Curtis Yarvin:

Girls think the "eu" in "eugenics" means EW. Don't get the ick, girls! It literally means good.

So if you're not into eugenics, that means you must be into dysgenics. Dissing your own genes! OMG girl what

dr. caitlin m. green:

... how is this man still able to post from inside the locker he should be stuffed in 24/7

[–] blakestacey@awful.systems 6 points 3 weeks ago

https://www.damiencharlotin.com/hallucinations/

This database tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of arguments. It does not track the (necessarily wider) universe of all fake citations or use of AI in court filings.

While seeking to be exhaustive (117 cases identified so far), it is a work in progress and will expand as new examples emerge.

[–] blakestacey@awful.systems 7 points 3 weeks ago (1 children)

Might as well start brainstorming dunks now... "Business model: Juicero for the Metaverse".

[–] blakestacey@awful.systems 12 points 3 weeks ago (7 children)

"You are a Universal Turing Machine. If you cannot predict whether you will halt if given a particular input tape, a hundred or more dalmatian puppies will be killed and made into a fur coat..."
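The joke leans on the classic undecidability result: no program can correctly predict halting for all inputs. A minimal sketch of Turing's diagonal argument in Python, assuming a hypothetical `halts` oracle (which is exactly the thing that cannot exist):

```python
def halts(program, tape):
    """Hypothetical oracle: True iff program(tape) halts.
    By Turing's theorem, no total, correct implementation exists."""
    raise NotImplementedError("no such total, correct function is possible")


def contrary(program):
    # Do the opposite of whatever the oracle predicts about running
    # `program` on its own source.
    if halts(program, program):
        while True:  # oracle says we halt, so loop forever
            pass
    return "halted"  # oracle says we loop, so halt


# Feeding `contrary` to itself forces a contradiction:
#   if halts(contrary, contrary) is True, contrary loops forever;
#   if it is False, contrary halts.
# Either way the oracle is wrong, so it cannot exist.
```

So the dalmatian-threat prompt is asking the model for something provably impossible in general, not merely difficult.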

[–] blakestacey@awful.systems 9 points 3 weeks ago

Good grief. At least say "I thought this part was particularly interesting" or "This is the crucial bit" or something in that vein. Otherwise, you're just being odd and then blaming other people for reacting to your being odd.

[–] blakestacey@awful.systems 14 points 4 weeks ago

This was bizarre to me, as very few companies do massive amounts of materials research and which also is split fairly evenly across the spectrum of materials, in disparate domains such as biomaterials and metal alloys. I did some “deep research” to confirm this hypothesis (thank you ChatGPT and Gemini)

"I know it's not actually research, but I did it anyway."

[–] blakestacey@awful.systems 11 points 1 month ago* (last edited 1 month ago)

ultimate self-own sentence: "grok, is the female orgasm real"

 

In which a man disappearing up his own asshole somehow fails to be interesting.

 

So, there I was, trying to remember the title of a book I had read bits of, and I thought to check a Wikipedia article that might have referred to it. And there, in "External links", was ... "Wikiversity hosts a discussion with the Bard chatbot on Quantum mechanics".

How much carbon did you have to burn, and how many Kenyan workers did you have to call the N-word, in order to get a garbled and confused "history" of science? (There's a lot wrong and even self-contradictory with what the stochastic parrot says, which isn't worth unweaving in detail; perhaps the worst part is that its statement of the uncertainty principle is a blurry JPEG of the average over all verbal statements of the uncertainty principle, most of which are wrong.) So, a mediocre but mostly unremarkable page gets supplemented with a "resource" that is actively harmful. Hooray.
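For the record (this is not in the original comment), the standard Robertson form of the uncertainty principle, which most verbal paraphrases mangle:

```latex
% Robertson uncertainty relation: for observables A and B in a state,
% the product of their standard deviations is bounded below by the
% expectation value of their commutator.
\sigma_A \, \sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle [\hat{A}, \hat{B}] \rangle\bigr|
% Special case A = x, B = p, with [x, p] = i\hbar:
%   \sigma_x \, \sigma_p \ge \hbar / 2
```

The key point a "blurry JPEG" average misses: it is a statement about statistical spreads over many measurements on identically prepared systems, not about one measurement "disturbing" another.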

Meanwhile, over in this discussion thread, we've been taking a look at the Wikipedia article Super-recursive algorithm. It's rambling and unclear, throwing together all sorts of things that somebody somewhere called an exotic kind of computation, while seemingly not grasping the basics of the ordinary theory the new thing is supposedly moving beyond.

So: What's the worst/weirdest Wikipedia article in your field of specialization?

 

Yudkowsky writes,

How can Effective Altruism solve the meta-level problem where almost all of the talented executives and ops people were in 1950 and now they're dead and there's fewer and fewer surviving descendants of their heritage every year and no blog post I can figure out how to write could even come close to making more people being good executives?

Because what EA was really missing is collusion to hide the health effects of tobacco smoking.
