I’m not super happy with the lack of supporting data (just colorful graphs and a lot of emotional anecdotes), but there is a clear disparity that I think warrants greater investigation. I love the credit union movement and was shocked to read this article. I would love to hear your opinions, lemmings!

  • @[email protected]
    51 · 7 months ago

    With respect to data, there does seem to be a damning amount of it in the CFPB dataset they analyzed for the article. The fact that approvals were this disproportionate even when accounting for “income, debt-to-income ratio, property value, downpayment percentage, and neighborhood characteristics” is alarming. Specifically with respect to income, the approval rate for lowest-quartile whites exceeded that of highest-quartile blacks. Yes, credit score was not available in the dataset, but we know it doesn’t fully explain the gap because of its frequency as a cited reason for denial, and reliance on credit score doesn’t really do much to dig NFCU out of this hole IMO.

    I’m tempted to agree with the author’s assessment that the use of automated tools by the underwriters is a likely contributor. Use a tool trained on historically racist data and practices, and that’s what you’ll get more of.

    • @[email protected]
      10 · 7 months ago

      We have a saying at work regarding data: “garbage in, garbage out”. If they’re using clearly racist historical data, they deserve this shitstorm. They really should have known better.
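A toy sketch of that “garbage in, garbage out” failure mode, with entirely made-up data and nothing to do with NFCU’s actual system: a naive model that just replays historical per-neighborhood outcomes will reproduce whatever bias those historical decisions contained.

```python
from collections import defaultdict

# Hypothetical historical decisions: neighborhood "B" was effectively
# redlined, so similar applicants were mostly denied there.
history = [
    ("A", "approve"), ("A", "approve"), ("A", "approve"), ("A", "deny"),
    ("B", "deny"), ("B", "deny"), ("B", "deny"), ("B", "approve"),
]

# "Training": record outcome counts per neighborhood.
counts = defaultdict(lambda: {"approve": 0, "deny": 0})
for hood, outcome in history:
    counts[hood][outcome] += 1

def predict(hood):
    # The "model" just replays the historical majority outcome.
    c = counts[hood]
    return "approve" if c["approve"] >= c["deny"] else "deny"

# Two otherwise identical applicants, different neighborhoods:
print(predict("A"))  # approve
print(predict("B"))  # deny
```

No discriminatory intent is coded anywhere; the bias lives entirely in the training data.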

          • @[email protected]
            1 · 7 months ago

            As long as you’re not a software engineer with a decade of experience, I ain’t shook.

            If you are, then I am not mad at you- just whoever was supposed to teach or mentor you.

  • @[email protected]
    9 · 7 months ago

    How would a bank or credit union even know what racial background a loan applicant comes from? I have a mortgage, and I’ve had auto loans and personal loans in the past. Not once did I ever see a bank employee face-to-face, even for my mortgage.

    I suppose the sound of a person’s voice or their name could give some clues on certain occasions.

    • @lurch
      17 · 7 months ago

      In the US, lots of forms have race or ethnicity input fields. It always baffles me as a European. Like, how is that relevant, except for discrimination?

        • @[email protected]
          3 · 7 months ago

          Very much for this. In schools in decent states, the students self-identify. That is then used to look for overrepresentation in suspensions or expulsions.

          How else can you mathematically prove bias or discrimination?

          Individually, ethnicity is known or presumed by staff, but without it being recorded systemically, it can’t be addressed systemically.
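For the curious, here’s roughly how that systemic check can work once groups self-identify: a two-proportion z-test comparing outcome rates between groups. All counts below are invented for illustration.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z-statistic for H0: both groups share the same underlying rate."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled rate under the null hypothesis
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented counts: 60 of 400 students in group 1 suspended,
# 30 of 600 in group 2. |z| > 1.96 means the gap is unlikely to be chance.
z = two_proportion_z(60, 400, 30, 600)
print(round(z, 2))  # 5.41
```

Without the self-identified category on record, there is no denominator to run this kind of test against.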

        • @ryathal
          1 · 7 months ago

          It’s easier to ask for it upfront and ban companies from using it than to try to reconstruct the data after the fact. The scale of discrimination in housing was so severe that the government forces this information to be collected and reported for all applications, because it’s easier to detect discrimination that way. There are also penalties for not submitting race or ethnicity on enough applications.

    • @[email protected]
      14 · 7 months ago

      Creditors have access to your entire life in the background. So even if your loan application doesn’t have race/ethnicity on it, your credit file sure as hell does.

    • netburnr
      11 · 7 months ago

      Equifax knows a whole lot about you, a lot more than just your race.

    • @[email protected]
      10 · 7 months ago

      They run your information against a number of services when you apply for a loan. That data is 100% available to them. Source: I work with this data daily.

    • @[email protected]
      7 · 7 months ago

      Like the author says, this is probably due to how the automated systems were trained. They weren’t made to be racist, but they’re more likely to reject based on certain traits, and that ends up making them discriminate against black applicants. I’m thinking: neighborhood, street, schools attended, stuff like that.
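A minimal sketch of that proxy effect, with entirely made-up applicants and zip codes: the decision rule never sees race, but because zip code correlates with race, denial rates still end up differing by race.

```python
# Made-up applicants: (race, zip). The decision rule below never sees race.
applicants = [
    ("black", "30310"), ("black", "30310"), ("black", "30305"),
    ("white", "30305"), ("white", "30305"), ("white", "30310"),
]

def decide(zip_code):
    # Rule "learned" from redlined history: deny zip 30310 (hypothetical).
    return "deny" if zip_code == "30310" else "approve"

def denial_rate(race):
    group = [z for r, z in applicants if r == race]
    return sum(decide(z) == "deny" for z in group) / len(group)

print(round(denial_rate("black"), 2))  # 0.67
print(round(denial_rate("white"), 2))  # 0.33
```

Dropping race from the inputs doesn’t help when other features encode it.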

  • @[email protected]
    3 · 7 months ago

    They tried to make their decisions too clever. The analysis of personal background beyond basic measures is a problem. That additional analysis is where biases breed.