Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
possiblylinux127@lemmy.zip 6 months ago
CSAM is illegal all around
We’re discussing the underpinnings and philosophy of the legality and your comment is simply “it is illegal”
I can only draw from this that your morality is based on laws instead of vice versa.
I’m in the camp that there is no reason you should have that kind of imagery, especially AI-generated imagery. Think about what people often do with pornography. You do not want them doing that with children, regardless of whether it is AI generated.
What does want have to do with it? I’d rather trust science and psychologists to determine whether this, which is objectively harmless, helps them control their feelings and gives them a harmless outlet.
They aren’t banning porn in general. They just don’t want to create any more sexual desires toward children. The CSAM laws came from child protection experts. Admittedly some of these people want to “ban” encryption but that’s irrelevant in this case.
There was no C.
There was no SA.
The entire point of saying “CSAM” was to distinguish evidence of child rape from depictions of imaginary events.
FluorideMind@lemmy.world 6 months ago
It isn’t csam if there was no abuse.
Jimmyeatsausage@lemmy.world 6 months ago
It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM is any visual depiction, including computer or computer-generated images of sexually explicit conduct, where […]— (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.
possiblylinux127@lemmy.zip 6 months ago
Yes it is
CSAM is child pornography
JackGreenEarth@lemm.ee 6 months ago
Do you not know that CSAM is an acronym that stands for child sexual abuse material?
possiblylinux127@lemmy.zip 6 months ago
True, but CSAM is anything that involves minors. It’s really up to the court to decide a lot of it, but in the case above I’d imagine that the images were quite disturbing.
ASeriesOfPoorChoices@lemmy.world 6 months ago
In this instance, no human children or minors of any kind were involved.
possiblylinux127@lemmy.zip 6 months ago
I think the court looked at the psychological aspects of it. When you look at that kind of material, you are training your brain and body to be attracted to that stuff in real life.