Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

    • bionicjoey@lemmy.ca · 8 months ago

      If you ask a person to describe a Nazi soldier, they won’t accidentally think you said “racially diverse Nazi soldier”

      • EatATaco@lemm.ee · 8 months ago

        I should have been more specific: I meant the point that it sometimes does stupid shit in its attempts to be inclusive.

        However, if you tell someone “hey I want you to make racially diverse pictures. Don’t just draw white people all the time” and then you later come back and ask them to “draw a German soldier from 1943.” Can you really accuse them of not thinking if they draw racially diverse soldiers?

        • bionicjoey@lemmy.ca · 8 months ago

          Yes. If I’m an artist and my boss says “hey I want you to try to include more racial diversity in your drawings” and then says “your next assignment is to draw some Nazi soldiers”, I can use my own implicit knowledge about Nazis to understand that my boss doesn’t want me to draw racially diverse Nazis. This is just further evidence that generative models are not true intelligences.

          • EatATaco@lemm.ee · 8 months ago

            I can use my own implicit knowledge about Nazis to understand that my boss doesn’t want me to draw racially diverse Nazis.

            I don’t even know how “implicit knowledge” applies here, but it sounds like you’re really just assuming that the previous order no longer applies. One could also assume that it still applies. I think the latter is actually the more reasonable assumption, assuming this all happens in a vacuum.

            I just know that if I told one of my reports to add more diversity, and they then added diversity to pictures of Nazis when that’s not what I wanted, I would take that as my fault, not accuse them of not thinking.

            • bionicjoey@lemmy.ca · 8 months ago

              No, because anyone who knows what a Nazi is and trusts that the person giving them instructions is not insane can assume that the first directive is meant to be a general note for their future work and not to be applied to the second directive. If one wanted pictures of racially diverse Nazis, they would need to be more explicit.

              • EatATaco@lemm.ee · 8 months ago

                the first directive is meant to be a general note for their future work and not to be applied to the second directive

                This is the root question, which you just gloss over. Why? It’s a general note, so why should one assume it doesn’t apply? You seem to be saying “it applies except when it doesn’t.” It would seem that the rational thing to do is to assume the general note applies unless you’re explicitly told otherwise, or there is some good reason to believe that wasn’t the intent.

                Also, FYI, the request was for German soldiers, not Nazis.

                And don’t get me wrong, I agree with you that it should not generate black German soldiers from 1943 without being explicitly told to do so. But I think this is a problem with its directives rather than evidence that it’s not thinking.
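
The last comment points at the actual mechanism. One widely discussed hypothesis at the time was that Gemini’s image pipeline silently rewrote user prompts to inject a standing diversity directive, with no check for whether the request concerned a specific historical group. Below is a minimal sketch of that mechanism and of a directive-level fix; every name, string, and the keyword check are illustrative assumptions, not Google’s actual code.

```python
# Hypothetical sketch of the failure mode discussed above: a standing
# "diversity" directive injected into every user prompt. All names here
# are illustrative assumptions, not Google's actual pipeline.

DIVERSITY_DIRECTIVE = "Depict people of a range of ethnicities and genders."

# Requests where injecting the directive is historically inappropriate.
# A keyword set is a crude stand-in for a real contextual check.
SENSITIVE_TERMS = {"nazi", "german soldier", "1943"}

def augment_prompt(user_prompt: str) -> str:
    """The buggy behavior the thread describes: the general note is
    applied to every request, with no regard for context."""
    return f"{DIVERSITY_DIRECTIVE} {user_prompt}"

def augment_prompt_with_context_check(user_prompt: str) -> str:
    """The directive-level fix: suppress the injection when the request
    clearly concerns a specific historical group."""
    lowered = user_prompt.lower()
    if any(term in lowered for term in SENSITIVE_TERMS):
        return user_prompt
    return augment_prompt(user_prompt)

if __name__ == "__main__":
    request = "Draw a German soldier from 1943."
    print(augment_prompt(request))                     # directive applied blindly
    print(augment_prompt_with_context_check(request))  # directive suppressed
```

The keyword set is only a stand-in: a real fix would need the model itself to judge when the general note should yield to historical context, which is precisely the judgment the thread is arguing about.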