Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
Jimmyeatsausage@lemmy.world 6 months ago
That's irrelevant; any realistic depiction of children engaged in sexual activity meets the legal definition of CSAM. Even using filters on images of consenting adults could qualify as CSAM if the intent was to make the actors appear underage.
ASeriesOfPoorChoices@lemmy.world 6 months ago
doesn’t even have to be that realistic.
…com.au/…/bizarre-australian-criminal-cases-the-s…