In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
This was just a matter of time - and there isn't really much that those affected can do (and in some cases, should do). Shutting down that service is the right move - but it will only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.
So in the long term we'll see this shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most AI image-generation tools.
At least in Germany, the laws dealing with child/youth pornography were badly botched by incompetent populists in the government - laws which would send any of those parents to jail for at least a year if they take possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get a prosecution started against them.
There's one blessing coming out of that mess, though: for girls who did take real pictures and had them leaked, saying "they're AI-generated" is becoming a plausible way out.
Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.
Ironically, in a sense we will revert back to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.
Politics is about to get WILD
This is not going to work. Just because images and videos become less reliable doesn't mean we will forget that eyewitness testimony is very unreliable.
This just isn't true. They will still be used to sexualise people, mostly girls and women, against their consent. It's no different from AI-generated child pornography. It does harm even if no 'real' people appear in the images.
Fucking horrible world we're forced to live in. Where's the fucking exit?
A bit off topic, but I wonder if the entertainment industry as a whole is going to be completely destroyed by AI when it gets good enough.
I can totally see myself prompting “a movie about love in the style of Star Wars, with Ryan Gosling and Audrey Hepburn as the leads, directed by Alfred Hitchcock, written by Victor Hugo.” And then what? It’s game over for any content creation.
Curious if I’ll see that kind of power at home (using open source tools) in my lifetime.
Holy shit, I never thought of the whole witness testimony aspect. For some reason my mind was just like “well, nothing we see in videos or pictures is real anymore, guess everyone is just gonna devolve into believing whatever confirms their bias and argue endlessly about which pictures are fake and which are real.”
Witness testimony and live political interactions are going to become incredibly important for how our society views “the truth” in world events in the near future. I don’t know if I love or hate that.
Not necessarily - solutions can be implemented. For example, footage from private security cameras can be sent in real time to a trusted establishment (trusted by the court, at least), where it can be timestamped and stored (perhaps not even stored there; encryption with a timestamp may be enough). If the source camera and the network are secure, the footage is also secure.
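The scheme described above can be sketched in a few lines. Everything here is hypothetical: the service name, the API, and the use of an HMAC with a shared key standing in for the service's real signing key (a production system would use asymmetric signatures, e.g. per RFC 3161 timestamping). Note the camera owner only needs to send a *hash* of the footage, not the footage itself:

```python
import hashlib
import hmac
import json
import time

# Stand-in for the private key held only by the timestamping service.
SERVICE_KEY = b"secret-held-only-by-the-timestamp-service"

def timestamp_footage(footage: bytes, now: float) -> dict:
    """Service side: bind a hash of the footage to a point in time."""
    digest = hashlib.sha256(footage).hexdigest()
    payload = json.dumps({"sha256": digest, "time": now}, sort_keys=True)
    tag = hmac.new(SERVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_token(footage: bytes, token: dict) -> bool:
    """Court side: check the footage matches the timestamped hash."""
    expected = hmac.new(SERVICE_KEY, token["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False  # token itself was forged or altered
    claimed = json.loads(token["payload"])
    return claimed["sha256"] == hashlib.sha256(footage).hexdigest()

clip = b"\x00\x01 raw camera frames ..."
token = timestamp_footage(clip, time.time())
print(verify_token(clip, token))              # True: unmodified clip verifies
print(verify_token(clip + b"edited", token))  # False: any edit breaks the hash
```

This doesn't prove the footage is *real*, only that it existed unmodified at the recorded time - which is exactly the property a court would care about once generation becomes cheap.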
FTFY. Witness testimony has never been that good a means of verifying that something is real.
Maybe there will be cameras as well that sign the pictures they take?
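Such provenance schemes do exist (the C2PA standard takes roughly this approach). A minimal sketch, assuming a key embedded in the camera at manufacture; HMAC with a shared key stands in here for the asymmetric signature real hardware would use, so the example stays self-contained:

```python
import hashlib
import hmac

# Hypothetical key burned into the camera's sensor hardware.
DEVICE_KEY = b"key-embedded-in-camera-hardware"

def capture(pixels: bytes) -> tuple:
    """Camera side: emit the image together with a signature over its hash."""
    sig = hmac.new(DEVICE_KEY, hashlib.sha256(pixels).digest(),
                   hashlib.sha256).hexdigest()
    return pixels, sig

def is_authentic(pixels: bytes, sig: str) -> bool:
    """Viewer side: any change to the pixels invalidates the signature."""
    expected = hmac.new(DEVICE_KEY, hashlib.sha256(pixels).digest(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

photo, sig = capture(b"\x89PNG raw sensor data ...")
print(is_authentic(photo, sig))            # True: straight off the sensor
print(is_authentic(photo + b"edit", sig))  # False: retouched image fails
```

The hard part isn't the cryptography but the trust chain: the key has to stay inside tamper-resistant hardware, or anyone who extracts it can sign fakes.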
That's why we need Blockchain Technology...
Check Blockchain Camera for example: https://github.com/sv1sjp/Blockchain_Camera
Same goes for any deepfake. People are losing their shit because we won't know what's real and what's not!
We should have been teaching critical thinking a generation ago. Sagan was pleading for reform in the 90s. We can start teaching the next generation how to navigate the Information Age. What we can't do is make the world childproof.
Yeah, what I see happening is people end up not caring as much because there's going to be so much plausible AI generated crap that any real stuff will be lost in the noise.
Please give a source for the law you mentioned. I want to read it in detail.
Start with the relatively recent case below; from there you should have enough to search for yourself what has happened over the last few years. It is exactly what was warned about back then - but anyone who confronts the hysterical lunatics, who want to hit everything remotely connected to "teenagers discovering sexuality" with criminal law, with well-reasoned arguments is immediately branded a pedophile himself.
https://www.swr.de/swraktuell/rheinland-pfalz/koblenz/lehrerin-kinderpornografischer-inhalte-konfisziert-deswegen-angeklagt-100.html