• LanternEverywhere@kbin.social · 1 year ago

    I use ChatGPT for any topic I’m curious about, and like half the time when I double-check the answers it turns out they’re wrong.

    For example, I asked for a list of phones with screens that don’t use PWM, and when I looked up the specs of the phones it recommended, it turned out they all had PWM, even though the ChatGPT answer explicitly stated that each of these phones doesn’t use PWM. Why does it straight up lie?!

    • Barbarian · 1 year ago

      It’s not lying. It has no concept of context or truth. Autocomplete on steroids.
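
      Rough toy sketch of what “autocomplete on steroids” means (the phone name and the probability numbers below are made up, nothing like a real model’s internals): the only thing computed is which continuation is statistically likely for the prompt, so a fluent but false claim comes out just as easily as a true one.

          import random

          # Pretend these are learned continuation frequencies for the prompt
          # "Phone X's screen ...". They reflect common phrasing in training
          # text, not any spec sheet -- the counts are purely illustrative.
          next_phrase_probs = {
              "doesn't use PWM": 0.6,
              "uses PWM dimming": 0.3,
              "is an OLED panel": 0.1,
          }

          def sample(probs):
              # Weighted random pick over continuations -- there is no
              # fact-checking step anywhere in this process.
              phrases, weights = zip(*probs.items())
              return random.choices(phrases, weights=weights, k=1)[0]

          print("Phone X's screen", sample(next_phrase_probs))
          # Whatever comes out sounds confident; truth never entered the computation.

      So when it tells you a phone “doesn’t use PWM”, that’s just the continuation that scored highest, not a fact it looked up.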