There will be a lot of medical literature with photos of children’s bodies to demonstrate conditions, illnesses, etc.
smeg@infosec.pub 23 hours ago
All of the AI tools know how to make CP somehow - probably because their creators fed it to them.
phoenixz@lemmy.ca 18 hours ago
Yeah, press X to doubt that AI is generating child pornography from medical literature.
These fuckers have fed AI anything and everything to train them. They’ve stolen everything they could without repercussions; I wouldn’t be surprised if some of them fed their AIs child porn because “data is data” or something like that.
vaultdweller013@sh.itjust.works 16 hours ago
Depending on how they scraped data, they may have just let their crawlers run wild. Eventually they would’ve run into child porn, which is yet another reason why this tech is utterly shit. If you can’t control your tech, you shouldn’t have it, and frankly speaking, curation is a major portion of any data processing.
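[Editor’s note: “curation” here means an explicit filtering stage between crawling and training, not something a crawler does automatically. A minimal sketch of what that stage looks like, with hypothetical names throughout (`BLOCKLIST`, `curate`); real pipelines match perceptual hashes against vetted blocklists from clearinghouses rather than plain SHA-256, and layer on classifiers and deduplication besides.]

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-bad content hashes. In practice these
# are perceptual hashes distributed by vetted external sources, not
# SHA-256 digests computed locally.
BLOCKLIST: set[str] = set()

def sha256_of(path: Path) -> str:
    """Hash a file's bytes; stands in for a real perceptual-hash step."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def curate(scraped_dir: Path) -> list[Path]:
    """Keep only scraped files whose hash is absent from the blocklist.

    A real curation stage would also run content classifiers, dedupe,
    and licensing checks. The point is that it's a deliberate pipeline
    step, not a side effect of scraping.
    """
    return [
        f for f in scraped_dir.glob("**/*.jpg")
        if sha256_of(f) not in BLOCKLIST
    ]
```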
Grimy@lemmy.world 21 hours ago
If it knows what children look like and knows what sex looks like, it can extrapolate. That being said, I think all photos of children should be removed from the datasets, regardless of the sexual content.
Rooster326@programming.dev 21 hours ago
Obligatory it doesn’t “know” what anything looks like.
Grimy@lemmy.world 21 hours ago
Thank you, I almost forgot. I was busy explaining to someone else how their phone isn’t actually smart.