Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
Daxtron2@startrek.website 6 months ago
There is no child being abused by a generated image or a drawing.
PoliticalAgitator@lemmy.world 6 months ago
If Paedophile Hill is the hill you want to die on, it’s no loss to me.
Daxtron2@startrek.website 6 months ago
And yet you still engaged with it
h3mlocke@lemm.ee 6 months ago
Seems a lot like you're just promoting CSAM at that point.
I'm sure that will have absolutely no effect on the pedophiles who are attracted to children acting on their desires. /s
Daxtron2@startrek.website 6 months ago
Do you have a reading disability?
PoliticalAgitator@lemmy.world 6 months ago
I’m not engaging for your benefit, which is why I’ve got no interest in repeating the same point in 500 ways in the hope it sinks in. But the reality is that a lot of people get their opinions from social media and they sure as fuck shouldn’t imitate your views on CSAM so it’s important that nobody mistakes contrarianism and apologism for actual wisdom.
But yes, it is hard to stand by while you lie your little heart out in a way that helps paedophiles. I'm not ashamed or embarrassed about that.
So here’s how it will play out: Your bullshit apologism and enabling will result in the creation of platforms for circulating child pornography. These platforms will immediately be flooded with pictures and videos of children being raped that are indistinguishable from “genuine” child pornography, thanks to models being trained on paedophiles’ back catalogues.
As the amount of content grows, more and more videos of actual children being raped will enter circulation, with moderators and paedophiles wriggling out of it by claiming “I thought it was AI generated”.
New videos featuring the rape of actual children will be created and posted to these communities as child pornography normalises the abuse of children for the members. Detection and prosecution of the people responsible will be functionally impossible, because the evidence will have been buried and obfuscated by the AI-generated content you insist doesn’t count.
But hey, at least your bullshit semantic sensibilities haven’t been offended right? That seems way more important to you than the abuse of children anyway.
We’re not talking about “drawings of children being raped that make people uncomfortable”. We’re talking about pictures and videos that are indistinguishable from reality, featuring children being coerced or forced into performing every act and fetish known to pornography.
And you fucking know it.
Daxtron2@startrek.website 6 months ago
There are no pictures or videos generated by AI that are indistinguishable from real CSAM because real CSAM requires a child to be abused to create.
All of those things you mention already happen all over social media.