blakestacey

joined 2 years ago
[–] blakestacey@awful.systems 3 points 1 week ago (1 children)

"I’m a moderate Holocaust denier." —Curtis Yarvin

[–] blakestacey@awful.systems 7 points 1 week ago (3 children)

wake up babe, new Yud profile pic just dropped

(And by "just" I mean "sometime in the past three weeks or so". I don't skim his exTwitter feed for sneerables very often.)

[–] blakestacey@awful.systems 14 points 1 week ago (1 children)

Typo:

Thorat didn’t look hrough his “own” book either

[–] blakestacey@awful.systems 12 points 1 week ago* (last edited 1 week ago) (2 children)

It would appear CNN was also at the eugenics conference? Why are all these mainstream news orgs at a 200-person event where all the speakers are eugenicists and racists?

https://bsky.app/profile/bmceuen.bsky.social/post/3lmmtefdl422j

And in response to an Atlantic subhead saying "Perpetuating humanity should be a cross-politics consensus, but the left was mostly absent at a recent pro-natalism conference":

yeah, weird that the left wasn’t present at the Fourteen Words conference

https://bsky.app/profile/jamellebouie.net/post/3lmmqjx3fdc2e

[–] blakestacey@awful.systems 13 points 1 week ago (1 children)

yet I hold

space for it

[–] blakestacey@awful.systems 7 points 1 week ago

As a wise friend of mine said years ago, when hipsters drinking PBR were having a cultural moment, "You can say you're drinking piss beer 'ironically', but at the end of the day, you're still drinking piss beer."

[–] blakestacey@awful.systems 6 points 1 week ago

Having read all the Asimov novels when I was younger....

Spoiler: The Caves of Steel: human killed because he was mistaken for the android that he built in his own image.

The Robots of Dawn: robot killed (positronic brain essentially bricked) to prevent it from revealing the secrets of how to build robots that can pass for human. It had been a human's sex partner, but that wasn't the motive. No one thought banging a robot was that strange; the only thing that perturbed them was the human getting emotional fulfillment from it (the planet Aurora is a decadent world where sex is for entertainment and fashion, not relationships).

The Naked Sun: the villain manipulates robots to commit crimes by having multiple robots each do a part of the task, so that the "a robot shall not harm a human being" software directive is never activated. He tries to poison a man by having one robot dose a water carafe and another unknowingly pour from it, but being a poisoning noob, he screws up the dosage and the victim lives. His only successful murder involves a human as well; he programs a robot to hand a blunt object to a human during a violent quarrel with the intended victim.

[–] blakestacey@awful.systems 22 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

"Conspiracy" is a colorful way of describing what might boil down to Gagniuc and two publicists, or something like that, since one person can hop across multiple IP addresses, etc. But, I mean, a pitifully tiny conspiracy still counts (and is, IMO, even funnier).

A comment by Wikipedia editor David Eppstein, theoretical computer science prof at UC Irvine:

Despite Malparti warning that "it would be a waste of time for everyone" I took a look at the book myself. 60 pages of badly-worded boring worked examples with no theory before we even get to the possibility of having more than two states. As Malparti said, there is no theory, or rather theory is alluded to in vague and inaccurate form without any justification. For instance the steady state (still of a two-state chain) is first mentioned on 46 as "the unique solution" to an equilibrium equation, and is stated to be "eventually achieved", with no discussion of exceptional cases where the solution is not unique or not reached in the limit, and no discussion of the fact that it is never actually achieved, only found in the limit. Do not use for anything. I should have taken the fact that I could not find a review even on MR and zbl as a warning.

It's been a while since I've seen a math book review that said "Do not use for anything."

"This book is not a place of honor..."
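The steady-state point in that review is easy to make concrete. A minimal sketch in plain Python (my own illustrative example, nothing taken from the book under discussion): the stationary distribution of a two-state chain, plus the "exceptional case" Eppstein mentions, where the limit is never actually reached.

```python
# Two-state Markov chain: each row of P gives transition probabilities.

def stationary_two_state(p, q):
    """Stationary distribution of P = [[1-p, p], [q, 1-q]], for p + q > 0."""
    return (q / (p + q), p / (p + q))

def step(dist, P):
    """One step of the chain: the row vector dist times the matrix P."""
    return tuple(sum(dist[i] * P[i][j] for i in range(2)) for j in range(2))

# Typical case: iterating converges toward the stationary distribution.
P = [[0.7, 0.3], [0.4, 0.6]]
dist = (1.0, 0.0)
for _ in range(50):
    dist = step(dist, P)
print(dist)                            # close to (4/7, 3/7)
print(stationary_two_state(0.3, 0.4))  # exactly (4/7, 3/7)

# Exceptional case: the periodic chain below has the unique stationary
# distribution (0.5, 0.5), but starting from (1, 0) it never converges --
# the distribution just flips back and forth forever.
flip = [[0.0, 1.0], [1.0, 0.0]]
d = step((1.0, 0.0), flip)   # (0.0, 1.0)
d = step(d, flip)            # back to (1.0, 0.0)
```

Which is roughly the distinction ("found in the limit" vs. "eventually achieved", and uniqueness failing for reducible or periodic chains) that the book apparently never gets around to.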

[–] blakestacey@awful.systems 17 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Sometimes, checking the Talk page of a Wikipedia article can be entertaining.

https://en.wikipedia.org/wiki/Talk:Markov_chain#Proposal_to_reintroduce_peer-reviewed_source_(Wiley,_2017)

In short: There has been a conspiracy to insert citations to a book by a certain P. Gagniuc into Wikipedia. This resulted in said book gaining about 900 citations on Google Scholar from people who threw in a footnote for the definition of a Markov chain. The book, Markov Chains: From Theory to Implementation and Experimentation (2017), is actually really bad. Some of the comments advocating for its inclusion read like chatbot output (bland, generic, lots of bullet points). Another said that it should be included because it's "the most reliable book on the subject, and the one that is part of ChatGPT training set".

This has been argued out over at least five different discussion pages.

[–] blakestacey@awful.systems 7 points 2 weeks ago (1 children)

Autocorrect-ism for "metaverse", perhaps?

[–] blakestacey@awful.systems 10 points 2 weeks ago (2 children)

Sheesh. Everyone knows you keep the phenethylamines inside the fridge proper, not on the door, where the temperature is less stable. (Source: the Shulgins' Kitchen Procedures I Have Known And Loved.)

 

Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh facts of Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

 

Bumping this up from the comments.

 


 

Many magazines have closed their submission portals because people thought they could send in AI-written stories.

For years I would tell people who wanted to be writers that the only way to be a writer was to write your own stories because elves would not come in the night and do it for you.

With AI, drunk plagiaristic elves who cannot actually write and would not know an idea or a sentence if it bit their little elvish arses will actually turn up and write something unpublishable for you. This is not a good thing.

 

Google Books is indexing low-quality, AI-generated books that will turn up in search results, and could possibly impact the Google Ngram Viewer, an important tool used by researchers to track language use throughout history.

 

[Eupalinos of Megara appears out of a time portal from ancient Ionia] Wow, you guys must be really good at digging tunnels by now, right?

 

In October, New York City announced a plan to harness the power of artificial intelligence to improve the business of government. The announcement included a surprising centerpiece: an AI-powered chatbot that would provide New Yorkers with information on starting and operating a business in the city.

The problem, however, is that the city’s chatbot is telling businesses to break the law.

 

a lesswrong: 47-minute read extolling the ambition and insights of Christopher Langan's "CTMU"

a science blogger back in the day: not so impressed

[I]t’s sort of like saying “I’m going to fix the sink in my bathroom by replacing the leaky washer with the color blue”, or “I’m going to fly to the moon by correctly spelling my left leg.”

Langan, incidentally, is a 9/11 truther, a believer in the "white genocide" conspiracy theory and much more besides.

 


 

If you've been around, you may know Elsevier for surveillance publishing. Old hands will recall their running arms fairs. To this storied history we can add "automated bullshit pipeline".

In Surfaces and Interfaces, online 17 February 2024:

Certainly, here is a possible introduction for your topic:Lithium-metal batteries are promising candidates for high-energy-density rechargeable batteries due to their low electrode potentials and high theoretical capacities [1], [2].

In Radiology Case Reports, online 8 March 2024:

In summary, the management of bilateral iatrogenic I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model. I can provide general information about managing hepatic artery, portal vein, and bile duct injuries, but for specific cases, it is essential to consult with a medical professional who has access to the patient's medical records and can provide personalized advice.

Edit to add this erratum:

The authors apologize for including the AI language model statement on page 4 of the above-named article, below Table 3, and for failing to include the Declaration of Generative AI and AI-assisted Technologies in Scientific Writing, as required by the journal’s policies and recommended by reviewers during revision.

Edit again to add this article in Urban Climate:

The World Health Organization (WHO) defines HW as “Sustained periods of uncharacteristically high temperatures that increase morbidity and mortality”. Certainly, here are a few examples of evidence supporting the WHO definition of heatwaves as periods of uncharacteristically high temperatures that increase morbidity and mortality

And this one in Energy:

Certainly, here are some potential areas for future research that could be explored.

Can't forget this one in TrAC Trends in Analytical Chemistry:

Certainly, here are some key research gaps in the current field of MNPs research

Or this one in Trends in Food Science & Technology:

Certainly, here are some areas for future research regarding eggplant peel anthocyanins,

And we mustn't ignore this item in Waste Management Bulletin:

When all the information is combined, this report will assist us in making more informed decisions for a more sustainable and brighter future. Certainly, here are some matters of potential concern to consider.

The authors of this article in Journal of Energy Storage seem to have used GlurgeBot as a replacement for basic formatting:

Certainly, here's the text without bullet points:

 

In which a man disappearing up his own asshole somehow fails to be interesting.
