Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges

Daxtron2@startrek.website 1 month ago

No, that’s not necessarily true at all. You can take a stock Stable Diffusion model and output these images right now, without any additional training. The whole point of diffusion models is that you don’t need 100% training-data coverage to create new images outside the original dataset. Having learned the concept of a child from ordinary images of children, and the concept of nudity/porn from legal adult images, is more than enough for the model to blend the two concepts.
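
That compositional behavior is easy to demonstrate with benign concepts. Here is a minimal sketch using Hugging Face’s diffusers library, composing two separately learned concepts that almost certainly never co-occur in the training set; the model checkpoint and prompt are illustrative assumptions, not anything specific to this case:

```python
# Minimal sketch of concept composition with a stock, unmodified
# diffusion model via Hugging Face diffusers. The checkpoint ID and
# prompt are assumptions for illustration only.
import torch
from diffusers import StableDiffusionPipeline

# Load a stock pretrained checkpoint -- no fine-tuning, no extra training.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed off-the-shelf checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# A prompt that blends two concepts ("astronaut", "riding a horse")
# learned from separate parts of the dataset; the model composes them
# into a novel image despite never having seen the combination.
image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("composed_concepts.png")
```

The same mechanism applies to any pair of learned concepts, which is the crux of the argument: novel combinations don’t require those combinations to exist in the training data.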
