this post was submitted on 02 Jul 2025
378 points (97.5% liked)

Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[–] wewbull@feddit.uk 54 points 1 week ago (69 children)

Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

[–] LadyAutumn@lemmy.blahaj.zone 24 points 1 week ago* (last edited 1 week ago) (44 children)

Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to how women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly generate photorealistic images of.

If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

Women and girls have faced degrees of this kind of sexual exploitation by men and boys since at least the latter half of the 20th century. But this is a severe escalation of that behavior. It should be illegal, and it should be prosecuted when and where it is found to occur.

[–] General_Effort@lemmy.world 7 points 1 week ago (4 children)

Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particularly horrible form of harassment reinforces that view.

If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?

The implicit message here is simply harmful to girls and women.

That doesn't mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.

[–] lka1988@lemmy.dbzer0.com 1 points 1 week ago (1 children)

Spoken like someone who hasn't been around women.

[–] General_Effort@lemmy.world 1 points 1 week ago

You mean like a nerd who reads too much?
