In this instance, no human children or minors of any kind were involved.
Comment on "US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges"
possiblylinux127@lemmy.zip 6 months ago
Yes it is. CSAM is child pornography.
ASeriesOfPoorChoices@lemmy.world 6 months ago
possiblylinux127@lemmy.zip 6 months ago
I think the court looked at the psychological aspects of it. When you look at that kind of material, you are training your brain and body to be attracted to that kind of thing in real life.
ASeriesOfPoorChoices@lemmy.world 6 months ago
Prove that any "training" is involved, please.
JackGreenEarth@lemm.ee 6 months ago
Do you not know that CSAM is an acronym that stands for child sexual abuse material?
possiblylinux127@lemmy.zip 6 months ago
True, but CSAM is anything of that nature involving minors. It's really up to the court to decide a lot of it, but in the case above I'd imagine the images were quite disturbing.