cross-posted from: https://kbin.social/m/[email protected]/t/401046

Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.

  • pimento64@sopuli.xyz
    1 year ago

    Lies about what this is created for:

    • monitoring customer responses
    • scanning for ‘distressed’ women in danger

    Comprehensive list of all things this is created for:

    • ads
    • demoralization
    • identifying, stalking, and punishing malcontents
    • stalking prospective rape victims
  • j4k3@lemmy.world
    1 year ago

The thought police are about to swat your front door because your gait was off from a bad night’s rest and flagged you as a problem. Authoritarian criminal technology.

    • Teali0@kbin.social
      1 year ago

I worked at a restaurant part-time for some extra “fun” money before COVID. Once we went to take-out only during the pandemic, everything turned toxic, and I was not happy with how management was handling things. They told us, “Be happy, because you’re basically making $20 an hour with how well everyone is tipping,” but morale was absolutely abysmal.

I was letting the teenagers working there know that they were being treated poorly, and management came to me and basically said, “They look up to you; you can’t tell them those things. They’ll all be mad.” So I quit. Management was mad at me for telling those teenagers that they didn’t have to devote their lives to a restaurant that had begun making record profits during one of the worst and most uncertain times for many of these kids’ families.

Basically, what I’m getting at is this: even if they do ask the workers, they don’t want to hear how the workers actually feel.