Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • sugar_in_your_tea
    1 year ago

    In which case, admins should err on the side of caution and remove something that might be illegal.

    I personally would prefer to have nothing remotely close to CSAM, but as long as children aren’t being harmed in any conceivable way, I don’t think it would be illegal to post art containing children. But communities should absolutely manage things however they think is best for their community.

    In other words, I don’t think #1 is a problem at all; imo, things should only be illegal if there’s a clear victim.