Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges

Jimmyeatsausage@lemmy.world ⁨1⁩ ⁨month⁩ ago

That's irrelevant; any realistic depiction of children engaged in sexual activity meets the legal definition of CSAM. Even using filters on images of consenting adults could qualify as CSAM if the intent was to make the actors appear underage.
