• BluesF@lemmy.world
      4 days ago

      AI doesn’t have a mind to make mental leaps; it only knows syntax. Just a form of syntax so, so advanced that it sometimes accidentally gets things factually correct. Sometimes.

      • Archpawn@lemmy.world
        3 days ago

        It’s more advanced than just syntax. It should be able to understand the double meanings behind riddles. Or at the very least, that books don’t have scales, even if it doesn’t understand that the scales that a piano has aren’t the same as the ones a fish has.

        • BluesF@lemmy.world
          3 days ago

          It doesn’t understand anything. It predicts a word based on the previous words - this is why I called it syntax. If you imagine a huge and vastly complicated set of rules about how likely one word is to follow the previous, say, 1000 words… That’s an LLM.
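
          To make the “rules about which word follows which” idea concrete, here’s a minimal sketch using a toy bigram model on a made-up corpus (the corpus and function names are illustrative, not from any real LLM). It only counts which word tends to come next - no meaning involved. A real LLM conditions on far longer contexts with a neural network, but the training objective is the same kind of next-word prediction:

          ```python
          from collections import Counter, defaultdict

          # Toy corpus, chosen to echo the riddle example above.
          corpus = "the piano has scales the fish has scales the book has pages".split()

          # For each word, count which words were seen immediately after it.
          follower_counts = defaultdict(Counter)
          for prev, nxt in zip(corpus, corpus[1:]):
              follower_counts[prev][nxt] += 1

          def predict_next(word):
              """Return the most frequently observed word after `word`, or None."""
              counts = follower_counts[word]
              return counts.most_common(1)[0][0] if counts else None

          # "scales" followed "has" twice, "pages" only once -> it predicts "scales",
          # purely from frequency, with no idea what a scale is.
          print(predict_next("has"))  # prints "scales"
          ```

          The model will happily say a book “has scales” if you skew the counts - which is roughly how an LLM can be fluent and confidently wrong at the same time.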