• TAG@lemmy.world · 1 year ago

    We don’t need more discrimination in loan approval. A few years ago, Amazon built an AI to read resumes and rate how likely a candidate was to be hired. The AI trained itself to recognize female-sounding resumes (attended a women’s-only college, involved in women’s organizations, didn’t use “manly” enough language) and flag them as undesirable.

    https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

    • LdyMeow · 1 year ago

      We don’t need it, but we’re going to get it!!!

        • kase@lemmy.world · 1 year ago

          Ah ok. I don’t know much about it, but I’ve heard that AI can sometimes be biased against commonly discriminated-against groups because the data it’s trained on is. (Side note: is that true? Someone pls correct me if it’s not.) I jumped to the conclusion that this was the same thing. My bad

          • adrian783@lemmy.world · 1 year ago

            What it did is expose just how much inherent bias there is in hiring, even from name and gender alone.

          • SCB@lemmy.world · 1 year ago

            That is both true and pivotal to this story

            It’s a major hurdle in some uses of AI

          • TAG@lemmy.world · 1 year ago

            An AI is only as good as its training data. If the data is biased, the AI will have the same bias. The fact that attending a women’s college was treated as a negative (and not simply marked down as an education of unknown quality) is evidence against the idea, which many in STEM hold (myself included), that the problem is a lack of qualified female candidates rather than an active bias against them.
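
            A toy illustration of that point (my own sketch, not Amazon’s system): train a plain logistic regression on synthetic hiring decisions where past recruiters penalized a gendered signal, and the model learns the same penalty. The feature name attended_womens_college is a hypothetical stand-in for the resume signals described in the article.

            ```python
            import numpy as np

            rng = np.random.default_rng(0)
            n = 5000

            # Synthetic resumes: one genuine qualification score plus one gendered signal.
            qualification = rng.normal(size=n)
            attended_womens_college = rng.integers(0, 2, size=n)

            # Biased historical labels: hiring driven by qualification,
            # but with an extra penalty on the gendered signal.
            logits = 1.5 * qualification - 1.0 * attended_womens_college
            hired = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

            # Fit a logistic regression to those labels with plain gradient descent.
            X = np.column_stack([np.ones(n), qualification, attended_womens_college])
            w = np.zeros(3)
            for _ in range(2000):
                p = 1 / (1 + np.exp(-X @ w))
                w -= 0.1 * X.T @ (p - hired) / n

            print("learned weights [intercept, qualification, womens_college]:", w.round(2))
            # The womens_college weight comes out negative: the model reproduces
            # the discrimination baked into its training data.
            ```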

        • matter@lemmy.world · 1 year ago

          When buggy software is used by unreasonably powerful entities to practise (and defend) discrimination, that’s dystopian…

          • SCB@lemmy.world · 1 year ago

            Except it wasn’t actually launched, and they didn’t defend its discrimination but rather ended the project.