• @[email protected]
    cake
    link
    fedilink
    English
    421 month ago

    hallucination refers to a specific bug (AI confidently BSing) rather than all bugs as a whole

    • @[email protected]
      link
      fedilink
      English
      161 month ago

      Honestly, it’s the most human you’ll ever see it act.

      It’s got upper management written all over it.

    • @[email protected]
      link
      fedilink
      English
      -4
      edit-2
      1 month ago

      (AI confidently BSing)

      Isn’t it more accurate to say it’s outputting incorrect information from a poorly processed prompt/query?

      • @[email protected]
        link
        fedilink
        English
        311 month ago

        No, because it’s not poorly processing anything. It’s not even really a bug. It’s doing exactly what it’s supposed to do: spit out words in the “shape” of an appropriate response to whatever was just said.
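[Editor’s note: a minimal sketch of the point above — the vocabulary, bigram table, and function names here are entirely hypothetical, not any real model. A toy next-word sampler only ever asks “what word plausibly follows?”, never “is this true?”, so a fluent-but-wrong sentence is normal output, not a malfunction.]

```python
import random

# Toy "language model": for each word, plausible next words learned
# purely from co-occurrence. There is no notion of truth here, only shape.
BIGRAMS = {
    "the": ["capital", "study"],
    "capital": ["of"],
    "of": ["france", "australia"],
    "france": ["is"],
    "australia": ["is"],
    "is": ["paris", "sydney"],  # either "fits" after "is" — nothing checks facts
}

def generate(start, max_words=6, seed=0):
    """Sample a plausibly shaped sentence; correctness is never evaluated."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(max_words):
        choices = BIGRAMS.get(words[-1])
        if not choices:
            break
        words.append(rng.choice(choices))
    return " ".join(words)

print(generate("the", seed=1))
```

Depending on the seed, this can emit “the capital of australia is paris”: every word transition is perfectly valid by the model’s own rules, which is exactly the sense in which a confident wrong answer is the system working as designed.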

        • @[email protected]
          link
          fedilink
          English
          2
          edit-2
          1 month ago

          When I wrote “processing”, I meant it in the sense of getting to that “shape” of an appropriate response you describe. If I’d meant it in a conscious sense, I would have written “poorly understood prompt/query”, for what it’s worth, but I see where you were coming from.