cross-posted from: lemmy.zip/post/15863526
Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison
Submitted 5 months ago by BrikoX@lemmy.zip to technology@lemmy.zip
Sensitive topic - obviously.
However, these guard-rail laws and "won't someone think about the children" cases are a reeeeally easy way for the government to remove more power from the people.
However, I believe that, if handled correctly, banning this sort of thing is absolutely necessary to combat the mental illness that is pedophilia.
I don’t condone child sexual abuse, and I’m definitely not a pedo (gosh, I can’t believe I have to state this.)
But how does banning AI-generated material help combat a mental illness? The mental illness will still be there, with or without images…
There’s something to be said about making it as difficult as possible to enable the behavior. Though that does run the risk of a particularly frustrated individual doing something despicable to an actual child. I don’t exactly have the data on how all this plays out, and frankly I don’t want to be the one to look into it. Society isn’t particularly equipped to handle an issue like this though, focusing on stigma alone to kinda try to shove it under the rug.
Mainly it's a problem of enabling the behavior, as others have mentioned.
It's not a solution, per se. It doesn't solve anything outright, but it doesn't have to. It's about making the material less accessible, attaching harsher consequences, and so on, to put more pressure on not continuing to participate in the activity. Ultimately it boils down to mental health and trauma. Pedophilia is a paraphilic disorder at the end of the day.
13,000 images can be generated relatively fast. My PC needs about 5 seconds per picture with SD (depending on settings, of course), so 13,000 images is roughly 18 hours. Not even a day.
Also, if pedos would only create their own shit to fap to, I would consider this a win.
The only good pedo is a pedo permanently separated from society. Let's start with the Catholic Church.
Also, if pedos would only create their own shit to fap to, I would consider this a win.
Vape logic.
Could you explain why?
It seems weird that the AI companies aren't being held responsible too.
It’s open source code that someone ran on their own computer, it’s not like he used paid OpenAI credits to generate the image.
It also would set a bad precedent - it would be like charging Solomons & Fryhle because someone used their (absolutely ubiquitous) organic chemistry textbook to create methamphetamine
Well, the American way is not to hold the company accountable, e.g. school shootings, so yeah.
I’m pretty sure you can’t hold a school liable for a school shooting
Just to be clear, you guys think that any company that produces anything that ends up used in a crime should have criminal charges for making the product? Yeah, makes about as much sense as anything these days.
Was Kodak ever held responsible for original CSAM?
I think stable diffusion is an open source AI you can run on your own computer, so I don’t see how the developers should be held responsible for that.
70 years for… Generating AI CSAM? So that’s apparently worse than actually raping multiple children?
He did more than generate it, he also sent some of it to a minor on Instagram, probably intending to get some real CSAM, or worse. For that, spending the next 70 years away from both children and computers seems appropriate to me.
Punishment over rehabilitation has always been a great solution 👍
The basis of making CSAM illegal was that minors are harmed during the production of the material. Prior to CG, the only way to produce pornographic images involving minors was to use real, flesh-and-blood minors. But if no minors are harmed to create CSAM, then what is the basis for making that CSAM illegal?
Think of it this way: if I make a pencil drawing of a minor being sexually abused, should that be treated as though it is a criminal act? What if it’s just stick figures, and I’ve labeled one as being a minor, and the others as being adults? What if I produce real pornography using real adults, but used actors that appear to be underage, and I tell everyone that the actors were all underage so that people believe it’s CSAM?
It seems to me that, rationally, things like this should only be illegal when real people are being harmed, and that when there is no harm, it should not be illegal. You can make an entirely reasonable argument that pornographic images created using a real person as the basis does cause harm to the person being so depicted. But if it’s not any real person?
This seems like a very bad path to head down.
There was the Simpsons CSAM case in Australia in 2008.
Of course he did. That’s the world we live in.
Daxtron2@startrek.website 5 months ago
How are they abuse images if no abuse took place to create them?
sxt@lemmy.world 5 months ago
If the model was trained on CSAM, then it is dependent on abuse.
Darrell_Winfield@lemmy.world 5 months ago
That’s a heck of a slippery slope I just fell down.
If responses generated from AI can be held criminally liable for their training data's crimes, then we can all be held liable for all text responses from GPT, since it's trained on Reddit data and likely has access to multiple instances of brigading, swatting, manhunts, etc.
Daxtron2@startrek.website 5 months ago
As I said in my other comment, the model does not have to be trained on CSAM to create images like this.
Jimmyeatsausage@lemmy.world 5 months ago
That's irrelevant; any realistic depiction of children engaged in sexual activity meets the legal definition of CSAM. Even using filters on images of consenting adults could qualify as CSAM if the intent was to make the actors appear underage.
laughterlaughter@lemmy.world 5 months ago
I mean… regardless of your moral point of view, you should be able to answer that yourself. Here’s an analogy: suppose I draw a picture of a man murdering a dog. It’s an animal abuse image, even though no actual animal abuse took place.
Daxtron2@startrek.website 5 months ago
It's not though, it's just a drawing.
PoliticalAgitator@lemmy.world 5 months ago
Because they are images of children being graphically raped, a form of abuse.
Daxtron2@startrek.website 5 months ago
No it isn't, no more than a drawing of a car is a real car, or drawings of money are real money.
Leg@lemmy.world 5 months ago
It’s a picture of a hallucination of a tree. Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.
NewDark@hexbear.net 5 months ago
While I can't say for 100% certain, it would almost assuredly be trained on many abuse images in order to generate more.
Daxtron2@startrek.website 5 months ago
No, that's not necessarily true at all. You can take a stock Stable Diffusion model and output these images right now without additional training. The whole point of diffusion models is that you don't need 100% training-data coverage to create new images outside of the original dataset. Having learned the concept of a child from any normal image set of children, and the concept of nudity/porn from legal adult images, is more than enough to create a blended concept of the two.
possiblylinux127@lemmy.zip 5 months ago
CSAM is illegal all around
FluorideMind@lemmy.world 5 months ago
It isn't CSAM if there was no abuse.
Reddfugee42@lemmy.world 5 months ago
We’re discussing the underpinnings and philosophy of the legality and your comment is simply “it is illegal”
I can only draw from this that your morality is based on laws instead of vice versa.
mindbleach@sh.itjust.works 5 months ago
There was no C.
There was no SA.
The entire point of saying “CSAM” was to distinguish evidence of child rape from depictions of imaginary events.
mindbleach@sh.itjust.works 5 months ago
All the lemmy.world commenters came out to insist “that painting is a pipe, though.”
Yeah? Smoke it.
Daxtron2@startrek.website 5 months ago
Lemmy.world and bandwagoning on a sensitive topic that they know nothing about? Classic combo.