• Deceptichum · 23 points · 5 months ago

    People are biased against resumes that imply a disability. ChatGPT is just picking up on that fact and unknowingly copying it.

• boredtortoise@lemm.ee · 10 points · 5 months ago

We’ve always lived in a world where resume evaluation is unjust; that’s the real problem. A resume shouldn’t be able to imply anything that can be used against you.

• Lvxferre@mander.xyz · 9 points · 5 months ago

    > studies how generative AI can replicate and *amplify* real-world biases

    Emphasis mine. That’s a damn important factor, because deep “learning” models are prone to making human biases worse.

    I’m not sure, but I think this is caused by two things:

    1. The model spams the most typical value unless explicitly prompted otherwise, even when that value isn’t actually all that common.
    2. It may treat co-dependent variables as if they were orthogonal when weighting the output.
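    A minimal sketch of point 1 (all data here is made up for illustration): a degenerate “model” that always emits the most frequent value in its training data turns a mild skew into a total one, which is the amplification effect described above.

    ```python
    from collections import Counter

    # Hypothetical training data: 70% of "hire" resumes lack any
    # disability mention, 30% contain one -- a mild real-world skew.
    training_labels = ["no_mention"] * 70 + ["mention"] * 30

    # A model that collapses onto the typical value (point 1):
    # it always outputs the mode of its training distribution.
    mode = Counter(training_labels).most_common(1)[0][0]
    generated = [mode for _ in range(100)]

    # The 70/30 skew in the training data becomes 100/0 in the output.
    print(Counter(training_labels))
    print(Counter(generated))
    ```

    A real language model is nothing like this crude, but sampling strategies that favor high-probability continuations push in the same direction: the majority pattern gets over-represented relative to its true frequency.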
• SuperCub · 4 points · 5 months ago

    I’m curious what companies were using to screen applications/resumes before ChatGPT. Seems like they already had shitty software.

• kata1yst · 4 points · 5 months ago

    Yet again, sanitization and preparation of training inputs proves to be a much harder problem to solve than techbros think.

• andrew_bidlaw · 2 points · 5 months ago

    Let the underwhelming brain in a jar decide if your disability would make you less efficient at your work.