- cross-posted to:
- [email protected]
- [email protected]
shared via https://feddit.de/post/2805371
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.
Sure they do, but if they're going to consume it anyway, would you rather a real child had to suffer for it, or just an AI-generated one?
Neither. I would have mental health support that is accessible to them.
Of course neither is ideal, but it comes across as if you're dismissing a possible direction toward a solution in favour of the one that is definitely worse (real-life suffering), out of a purely emotional knee-jerk reaction.
Mental health support is available and real CSAM is still being produced. I'd suggest we look into both options: advancing the ways therapists can help, and at least having an open discussion about these sensitive solutions that might feel counter-intuitive at first.
It’s (rightfully) currently illegal, but that doesn’t stop people. Keep it illegal, increase punishment drastically, make AI-created material a grey area.
It's already the worst crime around and people still do it. Maybe it's not the punishment we need to focus on.
I'm not sure increasing punishment is actually an effective means of combating this. The social consequences of being a child predator likely have a stronger deterrent effect than the penal system, imo (I don't have data to back that).
I, personally, am an advocate for making treatment for pedophiles freely, easily, and safely accessible. I’d much rather help people be productive, non-violent members of society than lock them up, if given a choice.