Jimmyeatsausage
@Jimmyeatsausage@lemmy.world
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 6 months ago:
That's irrelevant; any realistic depiction of children engaged in sexual activity meets the legal definition of CSAM. Even using filters on images of consenting adults could qualify as CSAM if the intent was to make the actors appear underage.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 6 months ago:
It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM covers any visual depiction, including computer or computer-generated images, of sexually explicit conduct, where […]— (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.