Comment on "US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges"

PoliticalAgitator@lemmy.world 4 months ago

> It’s a picture of a hallucination of a tree

So yes, it’s a tree. It’s a tree that might not exist, but it’s still a picture of a tree.

You can’t have an image of a child being raped – whether or not that child exists – that is not CSAM, because it is, by definition, an image of a child being sexually abused.

> Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.

Okay, so who are you volunteering to go through an endless stream of images and videos of children being raped to verify that each one was generated by an AI and not by a scumbag with a camera? Paedos?

Why are neckbeards so enthusiastic about dying on this hill? They seem more upset that there’s something they’re not allowed to jerk off to than they are by the actual abuse of children.

Functionally, legalising AI-generated CSAM means legalising “genuine” CSAM, because it will be impossible to distinguish the two, especially as paedophiles dump their pre-AI collections or feed them in as training data.

People who do this are reprehensible, no matter what hair-splitting and semantic gymnastics they employ.
