That mascot is a child. Please don't make porn of fictional children, even if you disagree with the politics or religion of the fictional child you're making porn of. Child porn is a bad thing.
Googling “Luce porn” immediately surfaced sites where people are posting what appear to be handmade images of exactly that (click at your own risk), but searching Civitai specifically, a site which makes AI-generating porn of anyone and anything extremely easy, turned up the real motherlode.
I gotta say, the choice to write this article at all is questionable. It's obvious to anyone familiar with the internet that this stuff would exist, but describing the images in lurid detail and explaining exactly how to get them, complete with direct links, is kinda sus.
Remember, kids! Jesus was best friends with a prostitute. This is in fact something God and Jesus would approve of... probably. If she's the kind of lady who's down with Jesus and only does stuff with other ladies when a guy is also involved, then I think Jesus and God would be cool with her.
While none of these models were explicitly tagged NSFW that I noticed, and none bill themselves with NSFW example images, Pony Diffusion is trained on NSFW images and thus carries that knowledge into derived models. It understands Danbooru tags, which include a fair number of sexual acts, sexual poses, fetish clothing, sex toys, and the like.
I'd also assume that if Luce catches on, general models not specifically intended for generating images of her will learn what she looks like as images of Luce make their way into training corpora; that would presumably include general NSFW models.
EDIT: No, I take it back. This Pony-Diffusion-based model, this one, and this one do include NSFW images in their example image lists if one cycles through the whole list. The NSFW content may have been hidden before, as I was browsing the site anonymously, and I believe the Civitai default is to hide it. Also, this model has some suspiciously well-endowed Luce images in its example image list, and this model doesn't have nudity in its example images, but it does have Luce flipping the viewer the bird, which I imagine is probably sacrilegious.
So is the image in the article the porn version? Because that's hella tame. Or is it just something they commissioned that sort of gets the point across?
Now I want the Rule 34 version of the story of John Henry. Like, the next Pokémon trailer comes out, and one NSFW artist has to put out porn of the new professor before the AI can generate it.
I falsely assumed you want to ban AI-generated cartoon CSAM.
I agree with some of the things you commented on. Some I don't.
One thing I hope we can both agree on is that more research should be done. We have to understand the nuances before we start calling someone a bad or disgusting person.