I posted the other day that you can use my AI-based tool to scrub CSAM from your object storage. Many people expressed the wish to use it on a pict-rs install backed by local file storage, so I’ve just extended its functionality to allow exactly that.

The new lemmy_safety_local_storage.py will go through your pict-rs volume on the filesystem, scan each image for CSAM, and delete any positive matches. The requirements are:

  • A Linux account with read-write access to the volume files
  • Private-key authentication for that account

As my main instance is using object storage, my testing is limited to my dev instance, where it all looks OK to me. But do run it with --dry_run first if you’re worried. You can then delete lemmy_safety.db and rerun to perform the deletions afterwards (a method to act on the --dry_run results directly is coming soon).
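The scan-then-delete loop can be sketched roughly like this. This is a minimal illustration only, assuming a hypothetical `looks_like_csam()` function standing in for the tool’s AI classifier; the real script also records its results in lemmy_safety.db so a rerun skips already-checked files:

```python
import os

def looks_like_csam(path):
    # Hypothetical stand-in for the tool's AI classifier;
    # the real tool runs an ML model against the image.
    return False

def scan_volume(root, dry_run=True):
    """Walk a pict-rs volume; return flagged paths, deleting them unless dry_run."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if looks_like_csam(path):
                flagged.append(path)
                if not dry_run:
                    os.remove(path)
    return flagged
```

With `dry_run=True` nothing is touched on disk, which is why rerunning after the dry run (once the result database is cleared) performs the actual deletions.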

PS: if you were using the object storage cleanup, that script has been renamed to lemmy_safety_object_storage.py

  • BitOneZero @ .world@lemmy.world · 1 year ago

    I hope people share the positive hits of CSAM and see how widespread the problem is…

    DRAMATIC EDIT: the records lemmy_safety_local_storage.py identifies, not the images! @[email protected] seems to think it “sounds like” I am ACTIVELY encouraging the spreading of child pornography images… NO! I mean audit files, such as timestamps, the account that uploaded, etc. Once you have the timestamp, the nginx logs from a lemmy server should help identify the IP address.

        • Kuvwert@lemm.ee · 1 year ago

          It’s cool, most everybody knows what you mean lol. Glad you clarified so there wouldn’t be future misunderstandings

          • Earthwormjim91@lemmy.world · 1 year ago

            It’s probably projection. Nobody reasonable would have jumped to the same conclusion. It doesn’t even remotely read like that.

            • Franzia@lemmy.blahaj.zone · 1 year ago

              It sounds to me like an NT desire for perfectly crafted arguments, without ambiguity. I do this, and feel fortunate that I didn’t call for a correction, myself. See how vicious you all are about it.

              • Earthwormjim91@lemmy.world · 1 year ago

                Oh come on. Being ND doesn’t mean your mind jumps to sharing child porn. That’s a fuckin cop out.

                • Franzia@lemmy.blahaj.zone · 1 year ago

                  You are saying the mind jumps, but that is the topic. I meant to say that being ND can create a desire for clarity in communication. A direct or terse argument.

    • bamboo@lemmy.blahaj.zone · 1 year ago

      I hope people share the positive hits of CSAM and see how widespread the problem is…

      It sounds like you’re encouraging people to share CSAM images found, which is obviously not the intent of this tool. There’s probably a better way to phrase what you were trying to say.

        • BitOneZero @ .world@lemmy.world · 1 year ago

          Yes. Odd how people think sharing CSAM is why people would post here, instead of actually tracking down and prosecuting those sharing CSAM. Details about the users who shared CSAM content, such as timestamps, would help identify the offenders for prosecution.

          • bamboo@lemmy.blahaj.zone · 1 year ago

            I assumed as much after reading it several times, but just wanted to let you know and point out that the statement could be misconstrued. Thanks for clarifying!

        • andrew@lemmy.stuart.fun · 1 year ago

          Because of the way pict-rs organizes photos, which I believe is by hash (it could be a random ID, but I suspect not), you should be able to share filenames for cleanup by neighbors without having to share the contents.

          Even if it’s not organized that way automatically, though, you can pretty easily use sha256sum to get a shareable hash before deleting the content.
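The hash-before-delete idea suggested here can be sketched in a few lines of Python as well; `sha256sum <file>` on the command line produces the same digest. This is just an illustration of the technique, not part of the tool:

```python
import hashlib

def file_sha256(path):
    """Return the hex SHA-256 digest of a file, read in 64 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

The digest is safe to publish: other admins can compare it against hashes of their own files without anyone ever exchanging image content.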

          • BitOneZero @ .world@lemmy.world · 1 year ago

            I think file timestamps would be one of the easier things: use them to track back to the postings and comments that reference the upload, and ideally to the logged-in account (in a standard lemmy install, only logged-in users can upload to pict-rs).
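Pulling a file’s timestamp is a one-liner with the standard library; correlating it with nginx logs or the lemmy database is then up to the operator. A small sketch:

```python
import os
from datetime import datetime, timezone

def file_mtime_utc(path):
    """Return a file's modification time as an ISO-8601 UTC string."""
    st = os.stat(path)
    return datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat()
```

The resulting string (e.g. `2023-08-28T12:34:56+00:00`) can be grepped for directly in access logs that use UTC timestamps.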

      • BitOneZero @ .world@lemmy.world · 1 year ago

        It sounds like you’re encouraging people to share CSAM images found, which is obviously not the intent of this tool.

        Yes, that is in fact the context: “which is obviously not the intent of this tool.” It is not my intent to share the images, nor is it the purpose of the tool; sharing details about the users, such as timestamps, is the obvious context.

      • Earthwormjim91@lemmy.world · 1 year ago

        Why are you such a weirdo that that’s where your mind goes?

        Sharing positive hits isn’t saying share the images. It’s saying share the data on who what when where how the hit showed up positive.

        Who shared it.

        What was it (this is obviously going to be some kind of CSAM given that’s the tool).

        When did they share it (time stamps).

        Where did they share it (was the same image hit on other runs and what instances did it hit on with the tool).

        How did they do it (local sharing, an image hosting service, etc).
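A shareable audit record covering those who/what/when/where/how fields might look like the following. The field names here are illustrative assumptions, not anything the tool actually emits:

```python
from dataclasses import dataclass, asdict

@dataclass
class CsamHit:
    """Shareable metadata about a positive hit -- never the image itself."""
    sha256: str       # what: hash identifying the flagged file
    uploader: str     # who: account that uploaded it
    uploaded_at: str  # when: upload timestamp
    instance: str     # where: instance the hit occurred on
    via: str          # how: direct upload, federation, image host, ...
```

Serializing with `asdict()` gives a plain dictionary that can be posted or sent to other admins as JSON.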

        • bamboo@lemmy.blahaj.zone · 1 year ago

          You list great points of information that should be shared with admins of other instances. Thanks for clarifying.

          • SharkEatingBreakfast · 1 year ago

            In order to get the answers you’re looking for, ask “what exactly does this statement mean?” instead of saying “sounds like this means ___” and waiting for confirmation or rejection of your assumption.