A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world grapple with the use of new technologies to create abusive sexual content.
If we take it as given that pedophilia is a disorder and ultimately a sickness, wouldn't it be better for these people to get their fix from AI-created media than from the real thing?
IMO no harm was done to any kid in the creation of this, and it would be better to give these people the fix they need, or at least desperately desire, in this way before they escalate to more harmful measures.
One thing I have to ask of those who say pedos should seek psychological/psychiatric treatment: do you even know a professional who won't immediately call the cops if you say "I have sexual desires for kids"?
I wholly agree that this is something that should receive some form of treatment, but first those afflicted would have to know that they won't be judged, labeled, and exposed when they seek it.
Child porn is one of those things that won't go away if you prohibit it, like alcohol. It'll just go underground and cause harm to real children.
AI child pornography images, as disturbing as they might be, would serve a "need", if you will, while not actually harming children. Since child pornography doesn't appear to be one of those "try it and you'll get addicted" things, I'm genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.
So this does bring up an interesting point that I haven't thought about: is it the depiction that matters, or the actual potential for victims?
Consider the Catholic schoolgirl trope: if someone of legal age is depicted as being much younger, should that be treated the same way as this case? This ruling argues that the depiction is what matters, rather than who is actually harmed.
Considering every other aspect of this is being argued to exhaustion in this thread, I just want to say it's wild they caught him, since the article says he didn't distribute it.
(Apologies if I use the wrong terminology here; I'm not an AI expert, just have a fact to share.)
The really fucked part is that Google, at least, has scraped a whole lot of CSAM, as well as things like ISIS execution vids, and they have all this stuff stored and use it for things like training AI algorithms. They refuse to delete this material, claiming they just find the stuff and aren't responsible for what it is.
Getting an AI image generator to produce CSAM means the model knows what to show, which means it learned that from somewhere. So why is the individual in jail and not the tech bros?
My god, there are way too many comments in here trying to normalize pedophilia. Disgusting. Pathetic.
These are people who need serious psychiatric care, not acceptance or inclusion in the LGBTQ+ community. There is absolutely nothing to compare between them and any group within the LGBTQ+ community. Nothing.
Combatting CP is a hard enough task for the poor bastards who have to do it. AI-produced images do not need to be added to the mix.
And by the way, kudos to fediverse instances, you do a crazy job. That's the only good thing about this AI tech: detecting such crap and obliterating it. I don't care about false positives; if there's a false positive, the OP can still try to defend their case if necessary.