“The school shooter played Doom on his IBM Personal Computer” vibe all over again.
Is it abuse if no one is being abused?
As described in the article, there is a zero-tolerance policy because of the danger of switching from generated material to the real thing. Children are not sexual objects. Anyone who feels differently needs help.
The initial training pictures may come from abused children. And we can theorize that they will need more to keep training the AI.
You’re thinking OpenAI (EDIT: Runway, sorry, got mixed up with ChatGPT) put CSEM in their data set? Or maybe it was an accident? I may be mistaken, but it looks more like someone took a pretrained image generator and retrained it on child pics. The following site, for example, teaches you to build and train an image generator; you just have to change the kind of pictures it is trained on to make it malevolent. https://www.assemblyai.com/blog/minimagen-build-your-own-imagen-text-to-image-model/
Installed this. Made no changes to it. Ran a few queries. Most of it was nightmare fuel of severed limbs and crazy teeth etc., since I have no clue what I’m doing. But still, with enough tries… it generated it. So, confirmed: you don’t need to introduce CP to get naked AI kids.
What’s even legal in britbong land? I think they tried to outlaw anything as kinky as or kinkier than “girl on top” for a hot minute before there was a big backlash. I have no idea what their “obscenity” laws are now.
I’ve seen these demons: they’re over on Pawoo constantly sharing their fucking CP. I guess it’s easier to go after artists than to catch criminals harming actual children. Had more to say, but it turned into a rant.
Couldn’t AI porn be the thing that makes actual CP go extinct? I have the feeling this is not at all about protecting children but simply about hating on the “perverts”.