this post was submitted on 14 Aug 2025
751 points (98.8% liked)

Science Memes

(page 2) 50 comments
[–] nectar45@lemmy.zip 17 points 14 hours ago (2 children)

I see we live in the timeline where virginity won

[–] tetris11@feddit.uk 9 points 14 hours ago

We're all wizards and witches now!

[–] Zerush@lemmy.ml 18 points 14 hours ago (6 children)

I've seen people who have a relationship with their TomTom GO, and even with the car itself.

https://www.jalopnik.com/my-strange-addiction-sex-car-guy-back-with-lexus-es330-1850642715/

[–] bluesheep@sh.itjust.works 27 points 16 hours ago (1 children)

tulpa

Now that's a word I haven't read in a long time

[–] cannon_annon88@lemmy.today 9 points 16 hours ago (1 children)

It's my first time. Gotta look it up.

[–] not_woody_shaw@lemmy.world 13 points 15 hours ago (1 children)

I'm afraid I'm too lazy to look it up for myself. Perhaps I'm not in the lucky 10,000.

[–] Madison420@lemmy.world 15 points 15 hours ago (1 children)

Sort of an icon or being you build in your mind and grant power to by letting it live in your mind.

[–] Fizz@lemmy.nz 114 points 20 hours ago (2 children)

"My husband is voice his own thoughts without prompts."

She then posts a picture of herself asking, "What are you thinking about?"

That's a direct response to the prompt; he's not randomly voicing his thoughts. I hate AI, but sometimes I hate people too.

[–] mitch@piefed.mitch.science 49 points 19 hours ago (5 children)

FWIW, this is why AI researchers have been screeching for decades not to create an AI that is anthropomorphized. It is already an issue we have with animals, now we are going to add a confabulation engine to the ass-end?

[–] Cethin@lemmy.zip 5 points 12 hours ago

People have this issue with video game characters who don't even pretend to have intelligence. This could only go wrong.

[–] Jankatarch@lemmy.world 12 points 15 hours ago (1 children)

Yeah, apparently even ELIZA messed people up back in the day, and that's not even an LLM.

[–] Feathercrown@lemmy.world 15 points 15 hours ago (2 children)

I'm starting to realize how easily fooled people are by this stuff. The average person cannot be this stupid, and yet, they are.

[–] slaneesh_is_right@lemmy.org 13 points 14 hours ago (2 children)

I was once in a restaurant, and behind me was a group of twenty-something-year-olds. I overheard someone ask something like: "So what are y'all's thoughts on VR?" (This was just before the whole AI boom.) And one guy said: "It's kind of scary to think about." I was super confused at that point, and they went on about how they'd heard of people disappearing into cyberspace and not knowing what's real and what's just VR.

I don't think they were stupid, but they formed a very strong opinion about something they clearly didn't know anything about.

[–] tomatoely@sh.itjust.works 9 points 14 hours ago (1 children)

I'd like to believe he heard a summary of sword art online's plot and thought it was real

[–] Jankatarch@lemmy.world 5 points 13 hours ago

Wait, it's not? But the Matrix!

[–] psx_crab@lemmy.zip 31 points 20 hours ago (1 children)

The worst thing about AI is the people.

[–] Bosht@lemmy.world 15 points 15 hours ago (3 children)

So, so tired of how utterly fucked things are getting on so many levels at once. More and more I think I really do need to invest in a back fifty-acre lot and try the survival route while society just fucks itself into oblivion.

[–] Jason2357@lemmy.ca 8 points 14 hours ago

This kind of thing is just moral panic. Funny moral panic, but still pointless. There's always been a tiny fraction of the population that is completely out to lunch, and there always will be.


There are so, so many horrifying ethical issues about this whole thing. What the fuck.

[–] Binette@lemmy.ml 12 points 14 hours ago (1 children)

I don't think schizoid is the best word to describe this behaviour

[–] dejected_warp_core@lemmy.world 12 points 14 hours ago (1 children)

Honestly, cringy nomenclature aside, this is just porn that got a little too real. Some people are into the narrative, after all.

To me the story begins and ends with some user that thinks the LLM sounds a little too life-like. Play with these things enough, and they'll crawl out of the uncanny valley a bit from time to time. Trick is: that's all in your head. Yeah, it might screw up your goon session and break the role-play a bit, but it's not about to go all SkyNet on you.

[–] rozodru@lemmy.world 10 points 14 hours ago (1 children)

The building that has my workspace has this great food court/library/work hybrid area where people who work remotely tend to go; a sort of third space. It has fantastic free wifi, so it makes sense why people would use it and sit there all day working.

Every day there's this older guy who sits there talking to his phone about some of the most random subjects ever. I originally thought he was just talking to a friend who seemed to have extensive knowledge of everything, until one day I walked by him and glanced over to see that he was talking to ChatGPT. Every day. Just random conversations. He even had a name for it: "Ryan".

Now? He's frustrated. He doesn't know what happened to Ryan and keeps screaming at his phone to "bring Ryan back!", or, since GPT-5 can't maintain a conversation anymore, "You're not Ryan!". Granted, the guy wasn't mentally all there to begin with, but now it's spiraling. It got to the point yesterday where he was yelling so loudly at his phone that security had to tell him to leave.

[–] dejected_warp_core@lemmy.world 4 points 12 hours ago (1 children)

Jebus that's awful. OpenAI took away his friend.

[–] uuldika@lemmy.ml 5 points 11 hours ago (1 children)

happened with Replika a few years ago. made a number of people suicidal when they "neutered" their AI partners overnight with a model update (ironically, because of pressure over how unhealthy it is.)

idk, I'm of two minds. it's sad and unhealthy to have a virtual best friend, but older people are often very lonely and a surrogate is better than nothing.

[–] Klear@quokk.au 141 points 22 hours ago* (last edited 22 hours ago) (12 children)

I kinda like the word "wireborn". If only it wasn't attached to a concept that's equal parts stupid and sad =/

[–] ZkhqrD5o@lemmy.world 42 points 19 hours ago* (last edited 19 hours ago) (6 children)

One thing that comes to mind is that prostitution, no matter how you spin it, is still a social job. If a problematic person like that ends up with a prostitute, there's a good chance the prostitute could talk their customer out of doing some nonsense; if not out of empathy, then for the simple fact that there would be legal consequences for not doing so.

Do you think a glorified spreadsheet that people call "husband" would behave the same? I don't know if it has happened yet, but one of these days an LLM will talk someone into doing something very nasty, and then it will be no one's fault again, certainly not the LLM's host. We really live in a boring dystopia.

Edit: Also, there's this one good movie, whose name I forgot, about a person talking to one of these LLMs as a girlfriend. It has a bizarre, funny, and simultaneously creepy and disturbing scene where the main character, who's in love with the LLM, hires a woman who puts a camera on her forehead to have sex with his LLM "girlfriend".

Also, my quite human husband also voices his thoughts without a prompt. Lol. You only need to feed him to function, no internet required.

[–] bitjunkie@lemmy.world 16 points 17 hours ago

The movie you're thinking of is Her with Joaquin Phoenix and Scarlett Johansson, and in the story she's a true general AI.

[–] humanspiral@lemmy.ca 14 points 16 hours ago

A problem with LLM relationships is the monetization model for the LLM. Its "owner" either receives a monthly fee from the user or monetizes the user's data by selling them stuff. So the LLM is deeply dependent on the user and is motivated to manipulate a returned codependency to maintain its income stream. This is not significantly different from the therapy model, except the user can fail to see through the manipulation, compared to friends/people who don't actually GAF about maintaining a strong relationship with you.

[–] AbsolutelyNotAVelociraptor@sh.itjust.works 18 points 18 hours ago (1 children)

Also, my quite human husband also voices his thoughts without a prompt. Lol. You only need to feed him to function, no internet required.

Sometimes, with humans, I'd say the problem is quite the opposite: they voice their thoughts without a prompt far more often than what would be desirable.

On a less serious note, that quoted part made me chuckle.

[–] peoplebeproblems@midwest.social 37 points 19 hours ago (5 children)

How does anyone enjoy this? It doesn't even feel real. No spelling mistakes? What the fuck is a skycot?

I may have never had a match on a dating app that wasn't a crypto bot or an OnlyFans girl, but I also don't swipe right on every single woman on it. You'd think my loneliness would tempt me to try to pretend it was real or something, but it just doesn't work.

LLMs are going to make the world stupid, I guarantee it.

[–] MonkeMischief@lemmy.today 16 points 17 hours ago

This reminds me of the people who genuinely fall for "romance scams", and the scammer has all the personality and vocabulary of a wet paper bag.

And yet somehow someone will believe they're some hot (barely literate) U.S. soldier stuck in Kuwait until they can get a flight home to meet the victim, for only $2000... wait, $1000 more... but then there's a $500 fee... and then...

Blows my mind...

[–] AbsolutelyNotAVelociraptor@sh.itjust.works 86 points 22 hours ago* (last edited 20 hours ago) (38 children)

I heard about this on the radio the other day. People pay a monthly fee for an AI that becomes their "digital partner".

The reasoning behind it, according to them, is that the AI is less dangerous than a human partner because it can't cheat, can't abuse you...

And I can't help but wonder where we took the wrong turn to end up here. Because while I can understand that people can go through some traumatic shit that would make them wary of the opposite sex, considering a machine your sentimental partner can only lead to some extremely fucked-up scenarios.

[–] Lucidlethargy@sh.itjust.works 1 points 10 hours ago

I'd rather choose a bear than choose an AI...

[–] IAmNorRealTakeYourMeds@lemmy.world 58 points 21 hours ago (1 children)

we live through a serious loneliness epidemic.

and capitalism figured out how to exploit it

[–] Kellenved@sh.itjust.works 41 points 20 hours ago (1 children)

Capitalism created it too; the classic create-a-problem-and-sell-the-solution model. Fucking ghoulish

[–] jubilationtcornpone@sh.itjust.works 34 points 20 hours ago (14 children)

I think the decline of organized religion and of things like fraternal orders (Elks, Moose, Shriners, etc.) has probably contributed a lot to the loneliness epidemic. There are a lot of other contributing factors, but those two things were once foundational to social circles in the US.

[–] scrubbles@poptalk.scrubbles.tech 53 points 19 hours ago

Don't forget that in the US we also have built our towns and cities to be isolating. Most don't walk home from work, pop into their local bar/coffee shop/park to see their neighbors and then finish their walk home. We get in our car alone, drive home where then going out means getting back in a car, and stopping on the way home means figuring out drivers and parking and meetups.

We lost our third places, and now we wonder why we don't know our neighborhoods as well.

[–] shalafi@lemmy.world 17 points 18 hours ago

“OK, now let’s have some fun. Let’s talk about sex. Let’s talk about women. Freud said he didn’t know what women wanted. I know what women want. They want a whole lot of people to talk to. What do they want to talk about? They want to talk about everything.

What do men want? They want a lot of pals, and they wish people wouldn’t get so mad at them.

Why are so many people getting divorced today? It’s because most of us don’t have extended families anymore. It used to be that when a man and a woman got married, the bride got a lot more people to talk to about everything. The groom got a lot more pals to tell dumb jokes to.

A few Americans, but very few, still have extended families. The Navahos. The Kennedys.

But most of us, if we get married nowadays, are just one more person for the other person. The groom gets one more pal, but it’s a woman. The woman gets one more person to talk to about everything, but it’s a man.

When a couple has an argument, they may think it’s about money or power or sex, or how to raise the kids, or whatever. What they’re really saying to each other, though, without realizing it, is this: “You are not enough people!”

I met a man in Nigeria one time, an Ibo who has six hundred relatives he knew quite well. His wife had just had a baby, the best possible news in any extended family.

They were going to take it to meet all its relatives, Ibos of all ages and sizes and shapes. It would even meet other babies, cousins not much older than it was. Everybody who was big enough and steady enough was going to get to hold it, cuddle it, gurgle to it, and say how pretty it was, or handsome.

Wouldn't you have loved to be that baby?”

― Kurt Vonnegut, God Bless You, Dr. Kevorkian
