Perspectivist

joined 2 weeks ago
[–] Perspectivist@feddit.uk 2 points 2 hours ago (1 children)

It’s not to protect it from cracking - it’s to stop the leftover coffee from burning onto it, since I only rinse it after use.

[–] Perspectivist@feddit.uk 1 point 2 hours ago

I don't waste good coffee.

[–] Perspectivist@feddit.uk 1 point 2 hours ago (3 children)

It's intentional. Leaves an air gap between the pot and the hotplate.

[–] Perspectivist@feddit.uk 4 points 7 hours ago

When I make coffee just for myself, I always measure out the same amount of water and this never happens. But my SO is slightly less autistic about it than I am and makes inconsistent amounts when brewing for the two of us - and I just can’t stand the thought of pouring even a drop of coffee down the drain. So, I spill it on the table and floor instead.

[–] Perspectivist@feddit.uk 2 points 7 hours ago (2 children)

I live in a small granny cottage and "my desk" means the kitchen table 2.5 meters away. I technically could move it to my desk and it would still remain in the kitchen.

[–] Perspectivist@feddit.uk 1 point 10 hours ago* (last edited 9 hours ago)

The level of consciousness in something like a brain parasite or a slug is probably so dim that it barely feels like anything to be one. So even if you were reincarnated as one, you likely wouldn’t have much of a subjective experience of it. The only way to really experience a new life after reincarnation would be to come back as something with a complex enough mind to actually have a vivid sense of existence. Not that it matters much - it’s not like you’d remember any of your past lives anyway.

If reincarnation were real and I had to bet money on how it works, I’d put it down to something like the many‑worlds interpretation of quantum physics - where being “reborn as yourself” just means living out one of your alternate timelines in a parallel universe.

[–] Perspectivist@feddit.uk 5 points 10 hours ago (1 children)

Or maybe I just need to start drinking straight from the jug.

[–] Perspectivist@feddit.uk 11 points 10 hours ago

Moccamaster is a relatively popular brand where I live - most people know it. It always boggles my mind when I see a middle-class family with a 35€ coffee maker. Why cheap out on something you use multiple times a day for the rest of your life? These machines aren't that expensive, and spare parts are widely available.

[–] Perspectivist@feddit.uk 0 points 10 hours ago

I mean, honestly, this is one of the better uses for machine learning. Not that this age checking is a good thing, but if you're going to do it at mass scale, this seems like the right approach. I imagine that, especially for a relatively heavy user, it's going to be extremely accurate - and far better than the alternative of providing a selfie, let alone a picture of an ID.

[–] Perspectivist@feddit.uk 3 points 11 hours ago* (last edited 10 hours ago) (2 children)

Just a few days ago I was in a discussion about misgendering people online, and a lemmy.world mod who was advocating against it didn’t like my theory about why some people do it. Their response was to attack me personally by intentionally misgendering me and calling me a “lady.”

I’m not sure what their reasoning was there, but the only way I can interpret it is that they see women as the lesser sex - so by implying I’m not a guy, they’re saying I’m lesser as well.

231
submitted 11 hours ago* (last edited 11 hours ago) by Perspectivist@feddit.uk to c/mildlyinfuriating@lemmy.world
 

Now how am I supposed to get this to my desk without either spilling it all over or burning my lips trying to slurp it here? I've been drinking coffee for at least 25 years, and I still do this to myself at least 3 times a week.

[–] Perspectivist@feddit.uk 29 points 1 day ago (1 children)

Now that I think of it, the Finnish translation for this is "purkkapatentti," which translates to "chewing gum patent."

137
submitted 1 day ago* (last edited 1 day ago) by Perspectivist@feddit.uk to c/til@lemmy.world
 

A kludge or kluge is a workaround or makeshift solution that is clumsy, inelegant, inefficient, difficult to extend, and hard to maintain. Its only benefit is that it rapidly solves an important problem using available resources.

[–] Perspectivist@feddit.uk 3 points 1 day ago

Is "AI slop" synonymous with AI content in general? I've always thought it to mean bad AI content specifically.

I don't consider myself neurotypical yet I see our current AI progress as net-positive. I don't like AI slop either in the sense that I understand the term but I've encountered a lot of good AI generated content.

 

I’m having a really odd issue with my e‑fatbike (Bafang M400 mid‑drive). When I’m on the two largest cassette cogs (lowest gears), the motor briefly cuts power once per crank revolution. It’s a clean on‑off “tick,” almost like the system thinks I stopped pedaling for a split second.

I first noticed this after switching from a 38T front chainring to a 30T. At that point it only happened on the largest cog, never on the others.

I figured it might be caused by the undersized chainring, so I put the original back in and swapped the original 1x10 drivetrain for a 1x11 and went from a 36T largest cog to a 51T. But no - the issue still persists. Now it happens on the largest two cogs. Whether I’m soft‑pedaling or pedaling hard against the brakes doesn’t seem to make any difference. It still “ticks” once per revolution.

I’m out of ideas at this point. Torque sensor, maybe? I have another identical bike with a 1x12 drivetrain and an 11–50T cassette, and it doesn’t do this, so I doubt it’s a compatibility issue. Must be something sensor‑related? With the assist turned off everything runs perfectly, so it’s not mechanical.

EDIT: Upon further inspection, the moment the power cuts out seems to sync perfectly with the wheel speed magnet passing the sensor on the chainstay, so I'm about 95% sure a faulty wheel speed sensor is the issue. I have a spare part on order, so I can't confirm yet - but unless there's a second update to this post, that solved it.

 

I see a huge amount of confusion around terminology in discussions about Artificial Intelligence, so here’s my quick attempt to clear some of it up.

Artificial Intelligence is the broadest possible category. It includes everything from the chess opponent on the Atari to hypothetical superintelligent systems piloting spaceships in sci-fi. Both are forms of artificial intelligence - but drastically different.

That chess engine is an example of narrow AI: it may even be superhuman at chess, but it can’t do anything else. In contrast, the sci-fi systems like HAL 9000, JARVIS, Ava, Mother, Samantha, Skynet, or GERTY are imagined as generally intelligent - that is, capable of performing a wide range of cognitive tasks across domains. This is called Artificial General Intelligence (AGI).

One common misconception I keep running into is the claim that Large Language Models (LLMs) like ChatGPT are “not AI” or “not intelligent.” That’s simply false. The issue here is mostly about mismatched expectations. LLMs are not generally intelligent - but they are a form of narrow AI. They’re trained to do one thing very well: generate natural-sounding text based on patterns in language. And they do that with remarkable fluency.

What they’re not designed to do is give factual answers. That it often seems like they do is a side effect - a reflection of how much factual information was present in their training data. But fundamentally, they’re not knowledge databases - they’re statistical pattern machines trained to continue a given prompt with plausible text.
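To make the "statistical pattern machine" idea concrete, here's a deliberately crude toy: it counts which word follows which in some training text, then continues a prompt by repeatedly sampling a plausible next word. This is a hypothetical illustration only - real LLMs learn these statistics with neural networks over subword tokens, not lookup tables - but the core job, continuing a prompt with plausible text, is the same.

```python
import random
from collections import defaultdict

def train(text):
    """Record which words follow which in the training text."""
    counts = defaultdict(list)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev].append(nxt)
    return counts

def continue_prompt(model, prompt, n_words=5, seed=0):
    """Extend the prompt by sampling a plausible next word, n_words times."""
    rng = random.Random(seed)
    out = prompt.split()
    for _ in range(n_words):
        candidates = model.get(out[-1])
        if not candidates:
            break  # never seen this word followed by anything
        out.append(rng.choice(candidates))
    return " ".join(out)

model = train("the cat sat on the mat and the cat ran")
print(continue_prompt(model, "the cat"))
```

Note what this toy does *not* have: any notion of truth. It only knows what tends to follow what - which is why its output (and, at vastly greater scale, an LLM's) sounds fluent whether or not it happens to be factual.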

 

I was delivering an order for a customer and saw some guy messing with the bikes on a bike rack using a screwdriver. Then another guy showed up, so the first one stopped, slipped the screwdriver into his pocket, and started smoking a cigarette like nothing was going on. I was debating whether to report it or not - but then I noticed his jacket said "Russia" in big letters on the back, and that settled it for me.

That was only the second time in my life I’ve called the emergency number.
