Twitter Acts Fast on Nonconsensual Nudity If It Thinks It’s a Copyright Violation
Researchers posted AI-generated nude images to Twitter to see how the company responds to reports of copyright violation versus reports of nonconsensual nudity.
![Twitter Acts Fast on Nonconsensual Nudity If It Thinks It’s a Copyright Violation](https://lemmy.world/pictrs/image/f1cb7b0a-ec63-4230-9f56-78c17462e3ec.jpeg?format=webp)
Twitter will remove nonconsensual nude images within hours as long as that media is reported as a copyright violation. If the same content is reported only as nonconsensual intimate media, Twitter takes weeks to remove it, if it removes it at all, according to a pre-print study from researchers at the University of Michigan.