Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
Daxtron2@startrek.website 5 months ago
No, that's not necessarily true at all. You can take a stock Stable Diffusion model and output these images right now without additional training. The whole point of diffusion models is that you don't need 100% training-data coverage to create new images outside the original dataset. Having learned the concept of a child from any ordinary image set of children, and the concept of nudity/porn from legal adult images, is more than enough to create a blended concept of the two.
zaph@sh.itjust.works 5 months ago
Those children are the abuse victims.
Bananigans@lemmy.dbzer0.com 5 months ago
I've never used an AI generator, but generic generated images wouldn't be an actual match to the dataset images, right? The model would just be generating features it understands to be associated with the concept of a child, which makes the claim that the dataset children are the abuse targets a stretch, unless there's some other direct or indirect harm to them. An immediate exception would be a person writing a prompt that attempts to create a specific facsimile of an individual.
zaph@sh.itjust.works 5 months ago
That’s nice, still illegal.
Bananigans@lemmy.dbzer0.com 5 months ago
While true, it also wasn’t really what my post was responding to. Thanks though.
Daxtron2@startrek.website 5 months ago
The children don't exist. The concept of a child is learned from a series of 1s and 0s.
zaph@sh.itjust.works 5 months ago
Then it's abuse because the law says so, and how the images are generated is irrelevant.
Daxtron2@startrek.website 5 months ago
Believe it or not, US law isn't the moral center of the universe, not to mention that it's disproportionately used against groups it doesn't like.