There have been users spamming CSAM content in !lemmyshitpost@lemmy.world, causing it to federate to other instances. If your instance is subscribed to this community, you should take action immediately. I recommend performing a hard delete via the command line on the server.
I personally deleted every image from the past 24 hours using the following command: sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred -u {} \; (the -u flag tells shred to remove each file after overwriting it; without it the files are only overwritten in place).
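If you want to sanity-check the scope before destroying anything, the same find expression with a non-destructive action will list or count the files that would be hit (assuming the same pictrs path as above; adjust it to your own install):

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -print

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 | wc -l

Keep in mind that -ctime -1 matches anything whose inode changed in the last 24 hours, so thumbnails and legitimate uploads from that window will be caught as well.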
Note: Your local jurisdiction may impose a duty to report or other obligations. Check what applies to you, but always prioritize ensuring that the content is no longer being served.
Update
Apparently the Lemmy Shitpost community has been shut down for now.
I see where you're going with this, but no, people really are just absolutely horrible. The fact is that other social media platforms are already well set up to manage this, so we never see it. Lemmy wants to be open, and this is the flip side of that openness.
It's generally easy to crap on what's 'bad' about big players, while underestimating or undervaluing what they are doing right for product market fit.
A company like Meta puts hundreds of people in foreign nations through PTSD-inducing hell in order to moderate its networks and keep them clean.
While I hope that's not the solution a community-driven effort ends up with, it shows the breadth of the problems that can crop up with the product as it grows.
I think the community will overcome these issues and grow beyond it, but jerks trying to ruin things for everyone will always exist, and will always need to be protected against.
To say nothing of the far worse sorts behind the production and more typical distribution of such material, whom Lemmy will also likely need to deal with more and more as the platform grows.
It's going to take time, and I wouldn't be surprised if the only way a federated social network can eventually exist is over onion routing or something similar. At a certain point, the gap in resources for defending against content litigation between a Meta and someone hosting a Lemmy server is impossible to close, and the privacy of hosts may need to be front and center.
The solution in this case is absolutely AI filters. Unfortunately, you won't find many people willing to build a robust model for that, because they'd be the ones getting the PTSD you mention.
IIRC, PTSD is something only certain personality types develop. We should probably focus on finding people who genuinely have no problem viewing rough content. I have PTSD, so I'm probably not the right person for the job.
I don't want to try. I have a pretty low threshold. I set up the NSFW filter on Lemmy because I found the furry content that was common some time ago disturbing... I don't even want to try anything worse than that.