• jonne@infosec.pub · 5 days ago

    Yeah, and the way it will confidently give you a wrong answer instead of either asking for more information or saying it just doesn’t know is equally annoying.

    • Jesus_666@lemmy.world · 4 days ago

      Because giving answers is not an LLM’s job. An LLM’s job is to generate text that looks like an answer. And then we try to coax that framework into generating correct answers as often as possible, with mixed results.