Israel has deployed a mass facial recognition program in the Gaza Strip, creating a database of Palestinians without their knowledge or consent, The New York Times reports. The program, which was created after the October 7th attacks, uses technology from Google Photos as well as a custom tool built by the Tel Aviv-based company Corsight to identify people affiliated with Hamas.

  • Sanctus@lemmy.world · 9 months ago

    Facial tattoos of drop table commands. Embed computer worms into your iris. We can get insane and fuck all this shit up too. I bet there's a way to embed a computer virus on your own face.

    • GrymEdm@lemmy.world · 9 months ago

      I guess I’ll adjust my life goals to “hot cyberpunk partner in technological dystopia”, because that sounds like some Blade Runner/Cyberpunk 2077 stuff.

      • Sanctus@lemmy.world · 9 months ago

        It’s not that far off. We’ll see exactly what I said soon enough. You can put a virus or worm inside an image in an email. You can do the same thing with a tattoo. It’s unfortunate that it will be here so far ahead of the superhuman cybernetics.

        • rottingleaf@lemmy.zip · 9 months ago

          You can put a virus or worm inside an image in an email.

          I’d much prefer that people who haven’t done this wouldn’t talk.

          • Sanctus@lemmy.world · 9 months ago

            Are you implying you can’t use steganography techniques on real objects and images? You act like I stated it would be easy.

            • rottingleaf@lemmy.zip · 9 months ago

              OK, so who’ll decode your “virus” from those real objects? Or it’s a case of “I’m a poor Nigerian virus, please kindly run me with root privileges on a system with such and such”?

              EDIT: I mean, steganography is too a word a person should know the meaning of before using.

              • Sanctus@lemmy.world · 9 months ago

                Just because you said this wouldn’t work like SQL injection doesn’t mean it won’t. You don’t know either. Have you worked on facial recognition databases? How do they store their data? It’s most likely just a database. Then I would start by looking at steganography techniques to see how those can be applied. Obviously I’m not hiding an executable in there, but I don’t see why you couldn’t try for unsanitized input; you never know. Now, if you want to continue into realism, you would just wear a full face mask outside. You also never answered my question about steganography.
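
                For what it’s worth, the injection the thread keeps joking about looks like this in miniature. This is a toy sqlite3 sketch; the `faces` table, its columns, and the use of `executescript` (standing in for a driver that allows multiple statements per call) are all made up for illustration and say nothing about how any real system stores data:

```python
import sqlite3

# Toy schema -- entirely hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE faces (id INTEGER, label TEXT)")
conn.execute("INSERT INTO faces VALUES (1, 'alice')")

payload = "x'; DROP TABLE faces; --"

# Unsanitized: the payload is spliced into the SQL text, so the
# injected DROP statement runs along with the query. executescript
# stands in for a driver that permits multiple statements per call
# (Python's execute() would refuse the second statement).
conn.executescript(f"SELECT id FROM faces WHERE label = '{payload}'")

try:
    conn.execute("SELECT count(*) FROM faces")
    survived = True
except sqlite3.OperationalError:   # "no such table: faces"
    survived = False
print(survived)  # False -- the injected DROP ran

# Parameterized: the exact same payload is treated purely as data.
conn.execute("CREATE TABLE faces (id INTEGER, label TEXT)")
rows = conn.execute(
    "SELECT id FROM faces WHERE label = ?", (payload,)
).fetchall()
print(rows)  # [] -- no match, and the table is intact
```

                Which is also why any competently written ingestion pipeline (parameterized queries, binary blobs rather than spliced strings) shrugs this off, as the reply below argues.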

                • rottingleaf@lemmy.zip · 9 months ago

                  Your question doesn’t make any fucking sense in the context of attacking anything; steganography is encoding your message inside redundant encoding for something else.

                  So, about that word.

                  A “virus in an image” situation is for cases when a program which will open that image has some vulnerability the attacker knows about, so the image is formed specifically to execute some shellcode in this situation.

                  Same with “a virus in an MP3”, some MP3 decoder has a known vulnerability allowing a shellcode.

                  Same with PDFs and anything else.

                  There are more high-level situations where programs with their own complex formats (say, DOCX which is a ZIP archive with some crap inside) execute stuff.

                  All this is not steganography.

                  Steganography is when, a dumb example, you have an image and you hide your message in lower bits of pixel color values. Or something like that with an MP3 file.

                  Obviously I’m not hiding an executable in there, but I don’t see why you couldn’t try for unsanitized input, you never know.

                  Attacks are a matter of probabilities, and “you never know” doesn’t suffice.

                  • Sanctus@lemmy.world · 9 months ago

                    So they’re just storing all this facial data unencoded somewhere? There’s no way to figure that out? There’s no sort of encoding/decoding going on with the facial data at all? It’s impossible, chief? Pack it up, the bots won? I don’t think so, man. People are gonna find all sorts of ways to fuck with this. Now you can join in the speculation or keep expectorating all over this post. The choice is yours.

        • Lath@kbin.earth · 9 months ago

          Sounds like a great time to start a costume & mask making company named “The Ministry of Silly Walks”.

      • wanderingmagus@lemm.ee (OP) · 9 months ago

        Honestly, with enshittification, “technological dystopia” sounds like exactly where we already are. Now, if only implants weren’t being R&D’d by Muskrat and there were some open source non-invasive version…

    • fruitycoder · 9 months ago

      Attempts at adversarial AI tattoos, face masks, and clothing have been made before. Basically, you exploit the model not having a deeper understanding of the world, so you can trick it with specific visual artifacts.
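
      The core of that trick, in a toy numpy sketch: a linear “recognizer” (weights entirely made up) whose decision flips when each input value is nudged against the gradient of its score. Real adversarial patches do the same against deep models, with far subtler perturbations:

```python
import numpy as np

# Toy linear "recognizer": scores a 64-value patch and calls it a
# match when the score is positive. Weights are made up.
w = np.linspace(-1.0, 1.0, 64)

def matches(patch):
    return float(w @ patch) > 0

x = w / np.linalg.norm(w)      # a patch the model scores as a match
print(matches(x))              # True

# Adversarial tweak: push every value against the gradient of the
# score (for a linear model the gradient is just w). The decision
# flips even though the patch is only shifted, not replaced.
x_adv = x - 0.3 * np.sign(w)
print(matches(x_adv))          # False
```

      The model has no concept of “a face”, only of which directions in pixel space raise its score, and that is exactly what the adversarial artifact pushes against.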

    • rottingleaf@lemmy.zip · 9 months ago

      Which might even work, with maybe a 0.001% probability that the recognized string isn’t screened.

      There’s a difference between SQL injections on thematic web forums and the same in such a system.

      That “we can … too” is lazy complacency. “They” will get even stronger while “we” talk like this.

      • Sanctus@lemmy.world · 9 months ago

        Nothing is casual about this. Be pessimistic if you want. But we will not stop jabbing the eye that watches. This is an arms race.

        • rottingleaf@lemmy.zip · 9 months ago

          What I’m saying is that you personally haven’t done any of this and look stupid.

          Yep, people do use vulnerabilities in software and hardware to do things. Just not you, so that “we” seems weird.

          Neither have I; I just played with crackmes and shellcodes a bit, but I’m not the person writing pretentious posts with that “we”.

          • Sanctus@lemmy.world · 9 months ago

            The original commenter I replied to was speculating about this becoming commonplace. You came into a post about speculation with your statements about people having to do things before they can talk about them.