Comment on US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges
Daxtron2@startrek.website 5 months ago
How are they abuse images if no abuse took place to create them?
laughterlaughter@lemmy.world 5 months ago
I mean… regardless of your moral point of view, you should be able to answer that yourself. Here’s an analogy: suppose I draw a picture of a man murdering a dog. It’s an animal abuse image, even though no actual animal abuse took place.
Daxtron2@startrek.website 5 months ago
It’s not though, it’s just a drawing.
laughterlaughter@lemmy.world 5 months ago
Except that it is an animal abuse image, drawing, painting, fiddle, whatever you want to call it. It’s still the depiction of animal abuse.
Same with child abuse, rape, torture, killing or beating.
Now, I know what you mean by your question. You’re trying to establish that the image/drawing/painting/scribble is harmless because no actual living being suffered. But that doesn’t mean they don’t depict it.
Again, I’m seeing this from a very practical point of view. However you see these images through the lens of your own morals or points of view, that’s a totally different thing.
Daxtron2@startrek.website 5 months ago
And when characters are killed on screen in movies, are those snuff films?
PoliticalAgitator@lemmy.world 5 months ago
Because they are images of children being graphically raped, a form of abuse.
Daxtron2@startrek.website 5 months ago
No it isn’t, no more than a drawing of a car is a real car, or drawings of money are real money.
PoliticalAgitator@lemmy.world 5 months ago
Material showing a child being sexually abused is child sexual abuse material.
Daxtron2@startrek.website 5 months ago
And an AI-generated image does not show a child being abused.
h3mlocke@lemm.ee 5 months ago
Oops, you forgot to use logic. As per the comment you’re replying to, the more apt analogy would be: is an AI-generated picture of a car still a picture of a car?
Daxtron2@startrek.website 5 months ago
That has nothing to do with logic? It’s pointing out that both drawings and AI generations are not really the things they might depict.
laughterlaughter@lemmy.world 5 months ago
Nobody is saying they’re real, and I now see what you’re saying.
Judging by your answers, your question is more “at-face-value” than people assume:
You are asking:
“Did violence occur in real life in order to produce this violent picture?”
The answer is, of course, no.
But people are interpreting it as:
“This is a picture of a man being stoned to death. Is this picture violent, if no violence took place in real life?”
To which the answer is: yes.
Daxtron2@startrek.website 5 months ago
It can be abhorrent and unlikeable, but it’s still not abuse.
Leg@lemmy.world 5 months ago
It’s a picture of a hallucination of a tree. Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.
PoliticalAgitator@lemmy.world 5 months ago
It’s a picture of a hallucination of a tree
So yes, it’s a tree. It’s a tree that might not exist, but it’s still a picture of a tree.
You can’t have an image of a child being raped – regardless of whether that child exists or not – that is not CSAM, because it’s an image of a child being sexually abused.
Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.
Okay, so who are you volunteering to go through an endless stream of images and videos of children being raped to verify that each one has been generated by an AI and not by a scumbag with a camera? Paedophiles?
Why are neckbeards so enthusiastic about dying on this hill? They seem more upset that there’s something they’re not allowed to jerk off to than they are about the actual abuse of children.
Functionally, legalising AI-generated CSAM means legalising “genuine” CSAM, because it will be impossible to distinguish the two, especially as paedophiles dump their pre-AI collections or feed them in as training data.
People who do this are reprehensible, no matter what hair splitting and semantic gymnastics they employ.
Leg@lemmy.world 5 months ago
Hey man, I’m not the one. I’m literally just saying that the images that AI creates are not real. If you’re going to argue that they are, you’re simply wrong. Should these ones be generated? Obviously I’d prefer that they not be. But they’re still effectively fabrications that I’m better off simply not knowing about.
If you want to get into the weeds and discuss the logistics of enforcing what is essentially thought crime, that is a different discussion I’m frankly not savvy enough to have here. I have no control over the ultimate outcome, but for what it’s worth, my money says thought crime will in fact become a punishable offense within our lifetimes, and this may well be an easy catalyst to use to that end. This should put your mind at ease.
NewDark@hexbear.net 5 months ago
While I can’t say for 100% certain, it would almost assuredly be trained on many abuse images in order to generate more.
Daxtron2@startrek.website 5 months ago
No, that’s not necessarily true at all. You can take a stock Stable Diffusion model and output these images right now without additional training. The whole point of diffusion models is that you don’t need 100% training data coverage to create new images outside of the original dataset. Having learned the concept of a child from any normal image set of children, and the concept of nudity/porn from legal adult images, is more than enough to create a blended concept of the two.
zaph@sh.itjust.works 5 months ago
Having learned the concept of child from any normal image set of children
Those children are the abuse victims.
Bananigans@lemmy.dbzer0.com 5 months ago
I’ve never used an AI generator, but generic generated images wouldn’t be an actual match to the dataset images, right? It would just be generating features it understands to be associated with the concept of a child, which would make the claim that the dataset children are the abuse targets a stretch, unless there’s some other direct or indirect harm to them. An immediate exception would be a person writing a prompt attempting to create a specific facsimile of an individual.
Daxtron2@startrek.website 5 months ago
The children don’t exist. The concept of a child is learned from a series of 1s and 0s.
possiblylinux127@lemmy.zip 5 months ago
CSAM is illegal all around
FluorideMind@lemmy.world 5 months ago
It isn’t CSAM if there was no abuse.
Jimmyeatsausage@lemmy.world 5 months ago
It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM is any visual depiction, including computer or computer-generated images, of sexually explicit conduct, where […] (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.
possiblylinux127@lemmy.zip 5 months ago
Yes it is
CSAM is child pornography
JackGreenEarth@lemm.ee 5 months ago
Do you not know that CSAM is an acronym that stands for child sexual abuse material?
ASeriesOfPoorChoices@lemmy.world 5 months ago
In this instance, no human children or minors of any kind were involved.
Reddfugee42@lemmy.world 5 months ago
We’re discussing the underpinnings and philosophy of the legality and your comment is simply “it is illegal”
I can only draw from this that your morality is based on laws instead of vice versa.
possiblylinux127@lemmy.zip 5 months ago
I’m in the camp that there is no reason you should have that kind of imagery, especially AI-generated imagery. Think about what people often do with pornography. You do not want them doing that with children, regardless of whether it is AI generated.
Reddfugee42@lemmy.world 5 months ago
What does want have to do with it? I’d rather trust science and psychologists to determine if this, which is objectively harmless, helps them control their feelings and gives them a harmless outlet.
mindbleach@sh.itjust.works 5 months ago
There was no C.
There was no SA.
The entire point of saying “CSAM” was to distinguish evidence of child rape from depictions of imaginary events.
mindbleach@sh.itjust.works 5 months ago
All the lemmy.world commenters came out to insist “that painting is a pipe, though.”
Yeah? Smoke it.
Daxtron2@startrek.website 5 months ago
Lemmy.world and bandwagoning on a sensitive topic that they know nothing about? Classic combo.
mindbleach@sh.itjust.works 5 months ago
You’d figure “CSAM” was clear enough. You’d really figure. But apparently we could specify “PECR” for “photographic evidence of child rape” and people would still insist “he drew PECR!” Nope. Can’t. Try again.
Daxtron2@startrek.website 5 months ago
Ever-moving goalposts. Ever notice how the ones who cry “for the children” the most seemingly have the most to hide?
sxt@lemmy.world 5 months ago
If the model was trained on CSAM, then it is dependent on abuse.
Darrell_Winfield@lemmy.world 5 months ago
That’s a heck of a slippery slope I just fell down.
If AI-generated responses can make someone criminally liable for the crimes in their training data, we can all be held liable for all text responses from GPT, since it’s being trained on Reddit data and likely has access to multiple instances of brigading, swatting, manhunts, etc.
laughterlaughter@lemmy.world 5 months ago
You just summarized the ongoing ethical concerns experts and common folk alike have been talking about in the past few years.
Daxtron2@startrek.website 5 months ago
As I said in my other comment, the model does not have to be trained on CSAM to create images like this.
Jimmyeatsausage@lemmy.world 5 months ago
That’s irrelevant; any realistic depiction of children engaged in sexual activity meets the legal definition of CSAM. Even using filters on images of consenting adults could qualify as CSAM if the intent was to make the actors appear underage.
ASeriesOfPoorChoices@lemmy.world 5 months ago
Doesn’t even have to be that realistic.
…com.au/…/bizarre-australian-criminal-cases-the-s…