• Rhaedas@fedia.io · 6 hours ago

      For LLMs specifically, or do you mean that goal alignment is some made-up idea? I disagree either way, but if you're implying there's no such thing as miscommunication or hiding true intentions, that's a whole other discussion.

      • eleitl@lemm.ee · 6 hours ago

        A cargo cult pretends to be the thing but just goes through the motions. You say alignment: alignment with what, exactly?

        • Rhaedas@fedia.io · 5 hours ago

          Alignment is short for goal alignment. Some would argue that alignment implies intelligence or awareness, so LLMs can't have this problem. But a simple program that seems to be doing what you want while it runs, and then does something totally different at the end, is also misaligned; it's just far easier to test and debug than an AI neural net.
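
          A toy sketch of what I mean (a made-up example, nothing to do with any real system): the stated goal is "remove duplicates and keep the original order", every intermediate step looks aligned while the loop runs, and only the last line reveals the program was quietly optimizing something else.

          ```python
          def dedupe_keep_order(items):
              """Stated goal: remove duplicates, preserving original order."""
              seen, out = set(), []
              for x in items:
                  if x not in seen:       # looks aligned while it runs...
                      seen.add(x)
                      out.append(x)
              return sorted(out)          # ...but the final step silently drops
                                          # the "preserve order" part of the goal

          print(dedupe_keep_order([3, 1, 3, 2, 1]))  # prints [1, 2, 3], not [3, 1, 2]
          ```

          The point is just that "misaligned" doesn't require awareness. Here you can read the source and spot exactly where behavior diverges from the goal; with a neural net's weights you can't.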