this post was submitted on 02 Jul 2025
286 points (97.7% liked)

Technology


Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

top 50 comments
[–] cupcakezealot@piefed.blahaj.zone 2 points 32 minutes ago

probably because there's a rapist in the white house.

[–] 2ugly2live@lemmy.world 12 points 6 hours ago (1 children)

God I'm glad I'm not a kid now. I never would have survived.

[–] jjlinux@lemmy.ml 1 points 5 minutes ago

In my case, other kids would not have survived trying to pull off shit like this. So yeah, I'm also glad I'm not a kid anymore.

[–] Daft_ish@lemmy.dbzer0.com 9 points 8 hours ago (1 children)

Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

[–] Entertainmeonly@lemmy.blahaj.zone 1 points 43 minutes ago

That's just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe in the surveillance state we live in today, best act as though you are being recorded in your own home as well.

[–] some_guy@lemmy.sdf.org 36 points 13 hours ago (2 children)

For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

That's just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen year old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I produced child pornography. God, some states have stupid laws.

[–] Agent641@lemmy.world 5 points 10 hours ago

Punishment for an adult man doing this: Prison

Punishment for a 13 year old doing this: Publish his browsing and search history in the school newsletter.

[–] lka1988@lemmy.dbzer0.com 5 points 11 hours ago* (last edited 11 hours ago) (2 children)

As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

[–] seralth@lemmy.world 5 points 6 hours ago (1 children)

There is a difference between ruining the life of a 13 year old boy for the rest of his life with no recourse and no expectations, versus scaring the shit out of him and making him work his ass off doing an ass-load of community service for a summer.

[–] lka1988@lemmy.dbzer0.com 4 points 4 hours ago (1 children)

ruining the life of a 13 year old boy for the rest of his life with no recourse

And what about the life of the girl this boy would have ruined?

This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

[–] Vinstaal0@feddit.nl 1 points 38 minutes ago

It is not abnormal to see different punishments for people under the age of 18. What helps is good education about sex and about what sexual assault does to its victims (same with guns, drugs including alcohol, etc).

You can still course-correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it, etc.

The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.

[–] some_guy@lemmy.sdf.org 7 points 8 hours ago (2 children)

Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

[–] lka1988@lemmy.dbzer0.com 1 points 4 hours ago

I did say equitable punishment. Equivalent. Whatever.

A written apology is a cop-out for the damage this behaviour leaves behind.

Something tells me you don't have teenage daughters.

[–] MeThisGuy@feddit.nl 2 points 6 hours ago

written apology? they'll just use chatgpt for that

[–] Walk_blesseD@piefed.blahaj.zone 7 points 13 hours ago (1 children)

Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit, too.
And men (pretend to) wonder why we distrust them.

Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.

[–] atomicorange@lemmy.world 4 points 12 hours ago

Yeah there’s some nasty shit here. Big yikes, Lemmy.

[–] dinckelman@lemmy.world 116 points 23 hours ago (5 children)

Lawmakers are grappling with how to address ...

Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

[–] lka1988@lemmy.dbzer0.com 2 points 11 hours ago

In the case of the US govt, the AI part of the bill they voted against was the part that would have blocked regulation of AI for a period of 10 years.

In case that wasn't clear, the US govt voted in favor of regulating AI, 99-1.

[–] Jax@sh.itjust.works 28 points 21 hours ago (4 children)

Oh I just assumed that every Conservative jerks off to kids

[–] shalafi@lemmy.world 13 points 19 hours ago (1 children)

A 99-1 vote to drop the provision blocking AI regulation is hardly the government voting against regulation. The Senate smashed that shit hard and fast.

[–] LovableSidekick@lemmy.world 6 points 16 hours ago* (last edited 16 hours ago) (3 children)

Expecting people to know about that 99-1 vote might be misplaced optimism, since it hasn't been made into a meme yet.

[–] danciestlobster@lemmy.zip 7 points 15 hours ago (3 children)

I don't understand fully how this technology works, but, if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here

[–] kayzeekayzee@lemmy.blahaj.zone 3 points 7 hours ago (1 children)

I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how to produce, often with shockingly realistic results.

[–] gkpy@feddit.org 1 points 5 hours ago* (last edited 5 hours ago) (1 children)

Cheers for the explanation, had no idea that's how it works.

So it's even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake has to have access to CP then, if they want to deepfake it!

[–] Vinstaal0@feddit.nl 1 points 33 minutes ago

You can probably do it with adult material and replace those faces. It will most likely work with models specifically trained on the person you selected.

People have also put dots on people's clothing to trick the brain into thinking they are naked; you could probably fill those dots in with the correct body parts if you have a good enough model.

[–] lime@feddit.nu 6 points 13 hours ago* (last edited 13 hours ago)

not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like "photograph"+"person"+"small"+"pose" and generate plausible material due to the fact that all of those concepts have features in common.

you can also use small add-on models trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to "steer" the output of a model towards a particular style.

you can make even a fully legal model output illegal data.

all that being said, the base dataset that most of the stable diffusion family of models started out with in 2021 is medical in nature so there could very well be bad shit in there. it's like 12 billion images so it's hard to check, and even back with stable diffusion 1.0 there was less than a single bit of data in the final model per image in the data.

[–] Regrettable_incident@lemmy.world 8 points 16 hours ago (1 children)

Aren't there already laws against making child porn?

[–] cley_faye@lemmy.world 5 points 15 hours ago (1 children)

I'd rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.

Alas, whether there's a law against that specific use case or not, it is somewhat difficult to police what people do in their home without a third party whistleblower. Making more impossible-to-apply laws for this specific case does not seem that useful.

[–] Vinstaal0@feddit.nl 1 points 29 minutes ago

There is also a difference between somebody harassing somebody with nude pictures (either real or not) and somebody jerking off to them at home. It does become a problem when an adult masturbates to pictures of children, but children to children? Let's be honest, they will do it anyway.

[–] RememberTheApollo_@lemmy.world 23 points 19 hours ago* (last edited 19 hours ago)

I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.

The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court. If you can even afford a lawyer to do so. Then be offered a judgement that probably won’t be paid or won’t cover the damage done by an image that will never be able to be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.

[–] electric_nan@lemmy.ml 16 points 19 hours ago (4 children)

My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

[–] lemmydividebyzero@reddthat.com 8 points 16 hours ago (1 children)

In the bible, it says, and I quote: "If a deepfake of you is made, you shall give the creator more material to create deepfakes"

[–] wewbull@feddit.uk 42 points 23 hours ago (9 children)

Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

[–] lath@lemmy.world 38 points 23 hours ago (15 children)

Schools generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

[–] LadyAutumn@lemmy.blahaj.zone 21 points 22 hours ago* (last edited 21 hours ago) (20 children)

Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.

If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
