‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it onto a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!
I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.
These are terrible, but I'm honestly curious what it thinks I look like naked. Like, I'm slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?
Are they just head-swapping onto model bodies, or does it actually approximate? I am legit curious, but I would never trust one of these apps not to keep the photos/privacy concerns.
Possibly a good thing. Oversaturation. Fill the internet with billions upon billions of AI nudes. Have a million different nudes for every celebrity.
Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top. Ending this once and for all.
It tells me we're less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress' nudity is real or simulated.
Not sure how this will be enforceable any more than we can stop malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren't measured by the media exposure of their bodies or history of lovers. Maybe in another age or two.
In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin texture maps from media, ultimately to render any coupling one could desire with a robust physics engine and photography effects to make it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and by doing so wreck her career.
Porn doesn't bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.
I use an ad blocker and haven't seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?
Reminds me of Arthur C Clarke's The Light of Other Days. There's a technology in the book that allows anyone to see anything, anywhere, which eliminates all privacy. Society collectively adjusts, e.g. people masturbate on park benches because who gives a shit, people can tune in to watch me shower anyway.
Although not to the same extreme, I wonder if this could similarly desensitize people: even if it's fake, if you can effectively see anyone naked... what does that do to our collective beliefs and feelings about nakedness?
It was inevitable. And it says more about those who use them.
I wonder how we'd adapt to these tools being that available, especially with blackmail, revenge porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos will no longer be seen as a trusted source of information; they won't hold any unique worth, nothing to hunt for or be worried about.
Our perception of human bodies was long distorted by movies, porn, Photoshop, and the subsequent 'filter apps', but we still kinda trusted there was something there before the effects were applied. What comes next if everything is imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since the stimuli that upgrade it in our early years would be long gone?
There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend in clothing choices could start its way. Who knows?
I see the bad sides of it right now, how it can be abused, but if these models are here to stay, what are the long-term consequences for us?
Back in the day, cereal boxes contained "X-ray glasses". I feel like if those had actually worked as intended, we would have already figured this issue out.
It would be interesting to know how many people are using it on themselves. I'd think it would open up next-level catfishing: here's an actual pic of me, and here's a pic of what I might look like naked. I'm sure some people with Photoshop skills were already doing that to a certain extent, but now it's accessible to everyone.
That this thread is full of people defending this is disgusting. This is different from cutting someone's face out of a photo and pasting it on a magazine nude, or imagining a person naked. Deepfakes can be difficult to tell apart from real media. This enables the spread of nonconsensual pornography that an arbitrary person cannot necessarily tell is fake. And even if it were easy to tell, it's still an invasion of privacy to use someone's likeness against their will, without their consent, for these purposes.
The fediverse’s high expectations for privacy seem to go right out the window when violating it gets their dick hard. We should be better.