PoliticalAgitator
@PoliticalAgitator@lemmy.world
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
Take your pills.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
You’re getting even less plausible, but you’re still desperately clinging to flawed rhetoric that only benefits paedophiles.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
I understood it just fine.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
Images of children being raped are being treated as images of children being raped. Nobody has ever been caught with child pornography and charged as if they abused the children themselves. Everything is being treated as it always has been, but you’re here arguing that it’s moral and harmless as long as an AI does it, using every semantic trick and shifted goalpost you possibly can.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
You’ve already fucked up your own argument. You’re supposed to be insisting there’s no such thing as a “violent video game”, because representations of violence don’t count; only violence done to actual people does.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
Oh yeah, opposition to videos showing the graphic rape of children is all just a big tech conspiracy. Fuck off, scumbag.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
I’m not engaging for your benefit, which is why I’ve got no interest in repeating the same point in 500 ways in the hope it sinks in. But the reality is that a lot of people get their opinions from social media, and they sure as fuck shouldn’t imitate your views on CSAM, so it’s important that nobody mistakes contrarianism and apologism for actual wisdom.
But yes, it is hard to stand by while you lie your little heart out in a way that helps paedophiles. I’m not ashamed or embarrassed about that.
So here’s how it will play out: Your bullshit apologism and enabling will result in the creation of platforms for circulating child pornography. These platforms will immediately be flooded with pictures and videos of children being raped that are indistinguishable from “genuine” child pornography, thanks to models being trained on paedophiles’ back catalogues.
As the amount of content grows, more and more videos of actual children being raped will enter circulation, with moderators and paedophiles wriggling out of it by claiming “I thought it was AI-generated”.
New videos featuring the rape of actual children will be created and posted to these communities as child pornography normalises the abuse of children for the members. Detection and prosecution of the people responsible will be functionally impossible, because they’ve been buried and obfuscated by the AI-generated content you insist doesn’t count.
But hey, at least your bullshit semantic sensibilities haven’t been offended, right? That seems way more important to you than the abuse of children anyway.
We’re not talking about “drawings of children being raped that make people uncomfortable”. We’re talking about pictures and videos that are indistinguishable from reality, featuring children being coerced or forced into performing every act and fetish known to pornography.
And you fucking know it.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
If Paedophile Hill is the hill you want to die on, it’s no loss to me.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
Bullshit and you know it.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
Material showing a child being sexually abused is child sexual abuse material.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
The thread is about “how are they abuse images if no abuse took place” and the answer is “because they’re images of abuse”. I haven’t claimed they’re real at any point.
It’s not a thought crime because it’s not a thought. Nobody is being charged for thinking about raping children, they’re being charged for creating images of children being raped.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
It’s a picture of a hallucination of a tree.
So yes, it’s a tree. It’s a tree that might not exist, but it’s still a picture of a tree.
You can’t have an image of a child being raped that is not CSAM, regardless of whether that child exists, because it’s an image of a child being sexually abused.
Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.
Okay, so who are you volunteering to go through an endless stream of images and videos of children being raped to verify that each one has been generated by an AI and not a scumbag with a camera? Paedos?
Why are neckbeards so enthusiastic about dying on this hill? They seem more upset that there’s something they’re not allowed to jerk off to than by the actual abuse of children.
Functionally, legalising AI-generated CSAM means legalising “genuine” CSAM, because it will be impossible to distinguish the two, especially as paedophiles dump their pre-AI collections or feed them in as training data.
People who do this are reprehensible, no matter what hair-splitting and semantic gymnastics they employ.
- Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges 5 months ago:
Because they are images of children being graphically raped, a form of abuse.
- Comment on Spotify just hid song lyrics behind its subscription 6 months ago:
Spotify should be happy they get a song for fractions of a cent. Without the artists, they’d be nothing.