Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
Bananigans@lemmy.dbzer0.com 5 months ago
Having never used an AI generator, generic generated images wouldn’t be an actual match to the dataset images, right? It would just be generating features it understands to be associated with the concept of a child, which would make the claim that the dataset children are the abuse targets a stretch, unless there’s some other direct or indirect harm to them. An immediate exception being a person writing a prompt attempting to create a specific facsimile of an individual.
zaph@sh.itjust.works 5 months ago
That’s nice, still illegal.
Bananigans@lemmy.dbzer0.com 5 months ago
While true, that wasn’t really what my post was addressing. Thanks though.
zaph@sh.itjust.works 5 months ago
The thread is discussing why it’s considered abuse if you can’t point to a victim. The answer turned out to be “because the law says so.”
Bananigans@lemmy.dbzer0.com 5 months ago
If you read the law you posted, it doesn’t actually address the question of victimhood. Also, I don’t really get why you’re still trying to force an unrelated point into this part of the discussion. Maybe find another place in the thread where someone thinks it’s legal and go talk to them.