Rite Aid has agreed to a five-year ban from using facial recognition technology after the Federal Trade Commission found that the chain falsely accused customers of crimes and unfairly targeted people of color.

The FTC and Rite Aid reached a settlement Tuesday after a complaint accused the chain of using artificial intelligence-based software in hundreds of stores to identify people Rite Aid “deemed likely to engage in shoplifting or other criminal behavior” and kick them out of stores – or prevent them from coming inside.

But the imperfect technology led employees to act on false-positive alerts that wrongly identified customers as criminals. In some cases, Rite Aid employees publicly accused people of criminal activity in front of friends, family and strangers, the FTC said. Some customers were wrongly detained and subjected to searches.

  • kboy101222 · 11 months ago

    Agreed. The term my computer science professor used was “racial bias in, racial bias out”. It’s the exact same problem today as it was when Google’s image-identification AI was labeling Black people as monkeys, and when Microsoft’s Twitter bot was made so racist that David Duke would’ve blushed.