His record should be expunged when he turns 18 because it was a crime he committed as a child. I understand their frustrations, but they're asking to jail a child over some photoshopped images.
Making a deepfake is definitely not a crime serious enough to warrant jail time or a permanent record unless an adult did it.
“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.
There's a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.
“It’s still so scary — these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”
This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deepfake pornography within two days of getting a report.
“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim's family asks for it,” Cruz said. “Elliston's Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”
BS
It's been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.
Real or fake pornography depicting unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it's extra illegal.
Legal aspects aside, the content described in the article (which may exaggerate the actual content) clearly violates Snapchat's rules and would have been taken down:
We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
I apologize for the inappropriate behavior and bans by @TheAnonymouseJoker@lemmy.ml in this thread. I've removed them as a mod here, banned them, and unbanned the people they inappropriately banned.
Note: if they get unbanned in the near future, it's because of our consensus procedure, which requires us admins to take a vote.
Odd that there is no mention of the parents contacting the police and working through them to get the images taken down. Technically and legally, the photos would be considered child porn. Since it's over the Internet, it would bring federal charges, even though there may be state charges as well.
Something was handled wrong if all the kid is getting is probation.
The teen’s phone was flooded with calls and texts telling her that someone had shared fake nude images of her on Snapchat and other social media platforms.
Berry, now 15, is calling on lawmakers to write criminal penalties into law for perpetrators to protect future victims of deepfake images.
The mom and daughter say legislation is essential to protecting future victims, and could have meant more serious consequences for the classmate who shared the deep-fakes.
Society has become so used to girls and women being considered lesser that there is a scary amount of rationalization as to why it's fine, actually, to completely annihilate what little bodily autonomy they have left. This is an explosion in suicides of young girls and adult women alike begging to happen. Wake the fuck up.
Such actions should be judged not as CSAM but as defamation and libel. Anyone going around harping about AI CSAM does not care about empowering politicians and elites and will bootlick them forever happily. A drawing or AI generated media cannot be CSAM, because nobody is physically abused.
That's all well and good, but removing them solves nothing. At this point, every easily accessible AI I'm aware of is kicking back any prompts containing the names of real-life people; they're already anticipating real laws. Preventing the images from being made in the first place isn't impossible.