Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
laughterlaughter@lemmy.world 1 year ago
Nobody is saying they’re real, and I now see what you’re saying.
Based on your answers, your question is more “at face value” than people assume:
You are asking:
“Did violence occur in real life in order to produce this violent picture?”
The answer is, of course, no.
But people are interpreting it as:
“This is a picture of a man being stoned to death. Is this picture violent, if no violence took place in real life?”
To which the answer is: yes.
Daxtron2@startrek.website 1 year ago
It can be abhorrent and unlikable; it’s still not abuse
laughterlaughter@lemmy.world 1 year ago
We’re not disagreeing.
The question was:
“Is this an abuse image if it was generated?”
Yes, it is an abuse image.
Is it actual abuse? Of course not.
Daxtron2@startrek.website 1 year ago
And yet it’s being treated as though it is
laughterlaughter@lemmy.world 1 year ago
Well, that’s another story. I just answered your question. “Are these images about abuse even if they’re generated?” Yup, they are.
“Should people be prosecuted because of them?” Welp, someone with more expertise should answer this. Not me.
PoliticalAgitator@lemmy.world 1 year ago
Images of children being raped are being treated as images of children being raped. Nobody has ever been caught with child pornography and charged as if they abused the children themselves. Everything is being treated as it always has been, but you’re here arguing that it’s moral and harmless as long as an AI does it, using every semantic trick and shifted goalpost you possibly can.
h3mlocke@lemm.ee 1 year ago
No, genius, it’s just promoting abuse. Have a good day.
Daxtron2@startrek.website 1 year ago
Just like violent video games produce school shooters
PoliticalAgitator@lemmy.world 1 year ago
You’ve already fucked up your own argument. You’re supposed to be insisting there’s no such thing as a “violent video game”, because representations of violence don’t count, only violence done to actual people.