Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they just post from another instance now that we have changed our registration policy.

We keep working on a solution; we have a few things in the works, but that won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @[email protected], the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn’t his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. It came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff has made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

  • newIdentity
    19 months ago
    You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You should be smarter than this.

    Sorry, let me word this correctly: social media wouldn’t exist.

    My solution is to turn off communities while the CSAM issue is cleaned up. What about that solution do you disagree with?

    No, your solution is to permanently shut down Lemmy because there is a possibility of CSAM appearing on an instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM, and the mods can’t do anything about it except shut down the instance or community, unless there are better tools to moderate with. That’s basically what everyone wants: better tools and more automation so the job gets easier. It’s better to remove a picture that was wrongly flagged as CSAM than to leave up one that actually is CSAM.

    The problem is that it won’t stop and that it will happen again.

    I’d bet money that the following will happen:

    1. community gets turned off
    2. CSAM gets deleted, posters are identified, information turned over to law enforcement
    3. community gets turned back on.

    You’re wrong at step 2. The posters might’ve used Tor, which makes it basically impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something about it). We can’t rely on LE alone to do their job. We need better moderation tools.
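    To give an idea of what I mean by better tools, here is a minimal sketch of the simplest automated approach: matching uploads against a blocklist of known-bad image hashes. Everything here is hypothetical; real systems (PhotoDNA, Meta’s PDQ) use perceptual hashes from a vetted source such as NCMEC so that altered copies still match, while plain SHA-256 only catches byte-identical files.

    ```python
    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known-bad images.
    # A real deployment would load vetted perceptual hashes instead,
    # so re-encoded or cropped copies still match.
    KNOWN_BAD_HASHES: set[str] = set()

    def should_block(image_bytes: bytes) -> bool:
        """Return True if the upload exactly matches the blocklist."""
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

    def handle_upload(image_bytes: bytes) -> str:
        # Reject on a match; a false positive just means one image
        # waits for human review, which is the trade-off argued above.
        return "rejected" if should_block(image_bytes) else "accepted"
    ```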

    Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

    It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

      • newIdentity
        19 months ago

        I don’t want to write a long text, so here is the short version: these automated tools are not perfect, but they don’t have to be. They just have to be good enough to block most of it. The rest can be done through manual labor, which people have also done voluntarily on Reddit. Reporting needs to get easier, and you can slow spammers down by rate-limiting them (see the sketch below).
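        As a rough sketch of the rate-limiting idea (the limits and names here are made up; a real instance would tune them and persist the state):

        ```python
        import time
        from collections import defaultdict

        # Made-up limits: at most 5 image uploads per account per 10 minutes.
        MAX_UPLOADS = 5
        WINDOW_SECONDS = 600

        _recent_uploads: defaultdict[str, list[float]] = defaultdict(list)

        def allow_upload(account: str) -> bool:
            """Sliding-window rate limit on uploads per account.

            Throwaway accounts can still be created, but each one only
            gets a handful of uploads before it is throttled, keeping
            the volume low enough for human moderators to keep up.
            """
            now = time.monotonic()
            recent = [t for t in _recent_uploads[account] if now - t < WINDOW_SECONDS]
            if len(recent) >= MAX_UPLOADS:
                _recent_uploads[account] = recent
                return False
            recent.append(now)
            _recent_uploads[account] = recent
            return True
        ```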

        To be clear, I don’t have anything against temporarily shutting down a community filled with CP until everything is cleared up. But we need better solutions so that in the future it doesn’t have to go this far and things stay more manageable.

        I’m sorry for the grammatical mistakes. I’m really tired right now and should probably go to bed.