Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
NewDark@hexbear.net 5 months ago
While I can't say for 100% certain, it would almost assuredly be trained on many abuse images in order to generate more.
Daxtron2@startrek.website 5 months ago
No, that’s not necessarily true at all. You can take a stock Stable Diffusion model and output these images right now without additional training. The whole point of diffusion models is that you don’t need 100% training data coverage to create new images outside of the original dataset. Having learned the concept of a child from any normal image set of children, and the concept of nudity/porn from legal adult images, is more than enough for the model to create a blended concept of the two.
zaph@sh.itjust.works 5 months ago
Those children are the abuse victims.
Bananigans@lemmy.dbzer0.com 5 months ago
Having never used an AI generator myself: generic generated images wouldn’t be an actual match to the dataset images, right? The model would just be generating features it understands to be associated with the concept of a child, which would make the claim that the dataset children are the abuse victims a stretch, unless there’s some other direct or indirect harm to them. An immediate exception would be a person writing a prompt that attempts to create a specific facsimile of an individual.
zaph@sh.itjust.works 5 months ago
That’s nice, still illegal.
Daxtron2@startrek.website 5 months ago
The children don’t exist. The concept of a child is learned from a series of 1s and 0s.
zaph@sh.itjust.works 5 months ago
Then it’s abuse because the law says so, and how the images are generated is irrelevant.