Israel has deployed a mass facial recognition program in the Gaza Strip, creating a database of Palestinians without their knowledge or consent, The New York Times reports. The program, which was created after the October 7th attacks, uses technology from Google Photos as well as a custom tool built by the Tel Aviv-based company Corsight to identify people affiliated with Hamas.

  • GrymEdm@lemmy.world · 9 months ago

    Israel is the type of control-heavy far-right state other dictators wish they could govern, and it’s made possible by Western money and technology (I was going to name just the US but my country of Canada, among others, is not blameless either). This news also sucks because there’s no way that tech is staying in Israel only. Citizens of the world better brace for convictions via AI facial recognition.

    “Our computer model was able to reconstruct this image of the defendant nearly perfectly. It got the hands wrong and one eye is off-center, but otherwise that’s clearly them committing the crime.”

    • wanderingmagus@lemm.ee (OP) · 9 months ago

      From what I remember, AI facial recognition tech was already being used by police and agencies worldwide, like the FBI, PRC police, etc. Or am I misinformed? I remember something about Chinese and American facial recognition software.

      • GrymEdm@lemmy.world · 9 months ago

        I had not read anything like that, but a quick search pulled up this story from last September by Wired that supports your post: FBI Agents Are Using Face Recognition Without Proper Training. “Yet only 5 percent of the 200 agents with access to the technology have taken the bureau’s three-day training course on how to use it, a report from the Government Accountability Office (GAO) this month reveals.” So it sounds like you’re right, and also that even agents who complete all three days are probably inadequately trained to identify people in situations that carry legal ramifications.

        • wanderingmagus@lemm.ee (OP) · 9 months ago

          And I wonder how many of those 95% have already used misapplied AI facial recognition to justify FISA court warrants for ~~stalking~~ investigating ~~random people~~ suspected terrorists?

    • Sanctus@lemmy.world · 9 months ago

      Facial tattoos of drop-table commands. Embed computer worms into your iris. We can get insane to fuck all this shit up, too. I bet there’s a way to embed a computer virus on your own face.

      • GrymEdm@lemmy.world · 9 months ago

        I guess I’ll adjust my life goals to “hot cyberpunk partner in technological dystopia”, because that sounds like some Blade Runner/Cyberpunk 2077 stuff.

        • Sanctus@lemmy.world · 9 months ago

          It’s not that far off. We’ll see exactly what I said soon enough. You can put a virus or worm inside an image in an email. You can do the same thing with a tattoo. It’s unfortunate it will be here so far before the superhuman cybernetics.

          • rottingleaf@lemmy.zip · 9 months ago

            “You can put a virus or worm inside an image in an email.”

            I’d much prefer that people who haven’t done this wouldn’t talk.

            • Sanctus@lemmy.world · 9 months ago

              Are you implying you can’t use steganography techniques on real objects and images? You act like I stated it would be easy.

              • rottingleaf@lemmy.zip · 9 months ago

                OK, so who’ll decode your “virus” from those real objects? Or it’s a case of “I’m a poor Nigerian virus, please kindly run me with root privileges on a system with such and such”?

                EDIT: I mean, steganography, too, is a word a person should know the meaning of before using it.

                • Sanctus@lemmy.world · 9 months ago

                  Just because you said this wouldn’t work like SQL injection does not mean it won’t. You don’t know either. Have you worked on facial recognition databases? How do they store their data? It’s most likely just a database. Then I would start by looking at steganography techniques to see how those can be applied. Obviously I’m not hiding an executable in there, but I don’t see why you couldn’t try for unsanitized input; you never know. Now, if you want to continue into realism, you would just wear a full face mask outside. You also never answered my question about steganography.
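                  For what it’s worth, the injection scenario being argued about here is easy to sketch. Below is a toy Python example using sqlite3 with a made-up `sightings` table (nothing here reflects any real recognition system), showing why splicing “recognized text” into a SQL string is dangerous and how a parameterized query treats the same text harmlessly as data:

```python
import sqlite3

# Hypothetical schema for illustration only: a table of recognized faces.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sightings (label TEXT)")

# Pretend the recognition/OCR stage returned attacker-controlled text.
recognized = "x'); DROP TABLE sightings; --"

# Unsafe: splicing the recognized text into the SQL string. executescript()
# runs the smuggled second statement, and the table is gone.
db.executescript(f"INSERT INTO sightings (label) VALUES ('{recognized}')")
assert db.execute(
    "SELECT name FROM sqlite_master WHERE type='table'").fetchall() == []

# Safe: recreate the table and use a parameterized query, which treats the
# recognized text purely as data, never as SQL.
db.execute("CREATE TABLE sightings (label TEXT)")
db.execute("INSERT INTO sightings (label) VALUES (?)", (recognized,))
print(db.execute("SELECT label FROM sightings").fetchone()[0])
```

                  Whether any real pipeline builds queries this naively is, of course, exactly what the rest of the thread argues about.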

                  • rottingleaf@lemmy.zip · 9 months ago

                    Your question doesn’t make any fucking sense in the context of attacking anything; steganography is encoding your message inside redundant encoding for something else.

                    So, about that word.

                    A “virus in an image” situation is for cases where the program that will open that image has some vulnerability the attacker knows about, so the image is crafted specifically to execute some shellcode through it.

                    Same with “a virus in an MP3”, some MP3 decoder has a known vulnerability allowing a shellcode.

                    Same with PDFs and anything else.

                    There are more high-level situations where programs with their own complex formats (say, DOCX which is a ZIP archive with some crap inside) execute stuff.

                    All this is not steganography.

                    Steganography is when, a dumb example, you have an image and you hide your message in lower bits of pixel color values. Or something like that with an MP3 file.

                    Obviously I’m not hiding an executable in there, but I don’t see why you couldn’t try for unsanitized input, you never know.

                    Attacks are a matter of probabilities, and “you never know” doesn’t suffice.
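                    The lower-bits idea described above can be shown in a few lines of plain Python, with a byte string standing in for raw pixel channel values (no image library involved; purely illustrative):

```python
def hide(pixels: bytes, message: bytes) -> bytes:
    """Hide message bits in the least significant bit of each cover byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover is too small for the message")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the LSB, write the message bit
    return bytes(out)

def reveal(pixels: bytes, length: int) -> bytes:
    """Read `length` hidden bytes back out of the LSBs."""
    out = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = bytes(range(200))            # stand-in for raw pixel channel values
stego = hide(cover, b"hi")
assert reveal(stego, 2) == b"hi"
# No byte changes by more than 1, which is imperceptible in real pixel data.
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

                    Note that this only hides data; it executes nothing on its own, which is the point about needing a cooperating decoder on the other end.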

          • Lath@kbin.earth · 9 months ago

            Sounds like a great time to start a costume & mask making company named “The ministry of silly walks”.

        • wanderingmagus@lemm.ee (OP) · 9 months ago

          Honestly with enshittification “technological dystopia” sounds like exactly where we already are. Now, if only implants weren’t being R&D’d by Muskrat and there were some open source non-invasive version…

      • fruitycoder · 9 months ago

        Attempts at adversarial AI tattoos, face masks, and clothing have been made before. Basically, they exploit the model’s lack of a deeper understanding of the world, so you can trick it with specific visual artifacts.
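        The adversarial-example idea is usually demonstrated on toy models. Below is a purely illustrative Python sketch with a hand-made linear scorer (the weights and features are invented; a real face model is nothing this simple): nudging each input feature against the weights, FGSM-style, flips the decision with only a small perturbation.

```python
# Toy linear "matcher" for illustration: score = w·x + b, positive = match.
w = [0.9, -0.4, 0.7, -0.2]
b = -0.5

def score(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

x = [0.8, 0.1, 0.6, 0.3]   # stand-in "features" of a face
assert score(x) > 0         # the model currently says: match

# FGSM-style step: move each feature a little against the score's gradient
# (for a linear model the gradient is just w, so only its sign matters).
epsilon = 0.3
x_adv = [xi - epsilon * (1 if wi > 0 else -1) for xi, wi in zip(x, w)]

assert score(x_adv) < 0     # same "face", small nudge, decision flipped
assert max(abs(a - o) for a, o in zip(x_adv, x)) <= epsilon + 1e-9
```

        Adversarial patches and printed patterns work on the same principle, just optimized against far larger models and constrained to survive cameras and lighting.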

      • rottingleaf@lemmy.zip · 9 months ago

        Which may even work, with a 0.001% probability that the recognized string isn’t screened.

        There’s a difference between SQL injections on thematic web forums and the same in such a system.

        That “we can … too” is lazy complacency. “They” will get even stronger while “we” talk like this.

        • Sanctus@lemmy.world · 9 months ago

          Nothing is casual about this. Be pessimistic if you want. But we will not stop jabbing the eye that watches. This is an arms race.

          • rottingleaf@lemmy.zip · 9 months ago

            What I’m saying is that you personally haven’t done any of this and look stupid.

            Yep, people do use vulnerabilities in software and hardware to do things. Just not you, so that “we” seems weird.

            Neither did I, I just played with crackmes and shellcodes a bit, but I’m not the person writing pretentious posts with that “we”.

            • Sanctus@lemmy.world · 9 months ago

              The original commenter I replied to was speculating about this becoming commonplace. You came in with your statements about people having to have done things before they can talk about them, in a post about speculation.