The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material in 2025, with “the vast majority” stemming from Amazon, which isn’t saying where the material came from.
Isn’t that already grounds for legal punishment? This shit really shouldn’t fly
Prove_your_argument@piefed.social 22 hours ago
Amazon Photos syncing, if I had to guess. It was marketed as free unlimited backup for Amazon Prime users.
AmbitiousProcess@piefed.social 22 hours ago
Yep. They’re allowed to use your photos to “improve the service,” which AI training would almost certainly qualify as legally. No notice to you required if they rip through your entire album of family photos so an AI model can get 0.00000000001% better at generating fake family photos.
ImgurRefugee114@reddthat.com 22 hours ago
Unlikely IMO. Maybe some… But if they scraped social media sites like blogs, Facebook, or Twitter, they would end up with dump trucks full. Ask anyone who has to deal with UGC: it pollutes damn near every corner of the net. The proliferation of local models capable of generating photorealistic material has only made the situation worse. Actionable cases were rare to uncover before, but the signal-to-noise ratio is garbage now.
ZoteTheMighty@lemmy.zip 20 hours ago
But if they’re uniquely good at producing CSAM, odds are it’s due to a proprietary dataset.
ColeSloth@discuss.tchncs.de 18 hours ago
They wouldn’t bother trying to hide it if the images were pulled from those public services.
They 100% know that if they revealed they used everyone’s private photos backed up to Amazon’s cloud as fodder for their AI, it would piss people off and they’d lose some business out of the deal.
captainlezbian@lemmy.world 17 hours ago
Yeah, my bet is Facebook and maybe some less reputable sites. Surely they didn’t scrape 8chan, right?