Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • @[email protected]
    11 months ago

    CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up, you’ll see “cartoons, paintings, sculptures, …” in the wording of the PROTECT Act

    They don’t actually need a victim to be defined as such

    • @priapus
      11 months ago

      That Wikipedia article is about CP, a broader topic. Practically zero authorities include illustrated and simulated forms of CP in their definitions of CSAM

      • @[email protected]
        11 months ago

        I assumed it was the same thing, but they could be different now that you mention it.

        So child porn then. I don’t think there’s a debate on whether the fediverse should defederate from child porn either…

        • @priapus
          11 months ago

          That’s not what I was debating. I was debating whether or not it should be reported to authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.

          • @[email protected]
            11 months ago

            Ah. It depends on the jurisdiction the instance is in

            Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason

            Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries