Amazon has reported "hundreds of thousands" of pictures of child sexual abuse material found in shared AI training data... but is refusing to tell regulators which data sets.
If you're using generative AI tools, there's a pretty good chance you're generating imagery with child porn training data behind the scenes.
https://www.bloomberg.com/news/features/2026-01-29/amazon-found-child-sex-abuse-in-ai-training-data
-
@GossiTheDog Local models are getting good enough now (and uncensored) to make this trivial even for the inept pervert. Pandora's personal paedophilia producers' box is already open, sadly.
-
@scottgal that doesn't mean using child sexual abuse material to train AI is okay.
-
@GossiTheDog this sounds pretty unbelievable, tbh. LAION having "thousands" was a big public thing that forced a re-release of the dataset. Others just piling on after this was discovered, with no detection algorithms having been used??
Amazon should really publish this information.
-
@GossiTheDog reminder that recursive pollution remains a HUGE open problem with ML models.
https://berryvilleiml.com/2026/01/10/recursive-pollution-and-model-collapse-are-not-the-same/
-
@GossiTheDog @scottgal they say they're not training on it; it was detected before training. But that's not the point. Amazon got the stuff from somewhere, and a decent person would report where it came from so that the rozzers can trace it back upstream. I flat out don't believe Amazon's claim not to know where it came from. They must know, because they must have got copyright clearance for making a derivative work from all that content.
-
@GossiTheDog Can't read the article, so this is speculation: Amazon admitted having lots of CSAM but refuses to say where they downloaded it from? I thought holding on to CSAM is a crime in itself, but as usual the rules do not apply to big tech. And where did the material come from? Secret access to customer data they refuse to disclose?