• bionicjoey@lemmy.ca
    2 years ago

    ChatGPT doesn’t understand the things it says, so it shouldn’t be treated as a source of truth. It can be tripped up by nuance, or by questions that require an understanding of syntax itself. For example, if you ask it what the longest 5-letter word is, it will confidently give you an answer.

    • femboy_link.mp4@beehaw.org
      2 years ago

      I wish more people understood this. So many people out there are taking what ChatGPT spits out as gospel when it’s actually very susceptible to manipulation and misinformation.

    • ddh@lemmy.sdf.org
      2 years ago

      Agreed. However, often when you’re trying to fix something, you’ve run out of options and just want new ideas. In my experience, LLMs are very good for that.

    • Eheran@lemmy.fmhy.ml
      2 years ago

      Here is what GPT-4 says:

      Me: What is the longest 5-letter word?

      ChatGPT:

      The concept of the “longest” 5-letter word can be interpreted in two ways:

      1. If “longest” refers to the physical length of the word (how it is pronounced), then it can vary based on different accents and languages. However, in English, five-letter words with a large number of syllables such as “queue” or “eerie” might be considered.

      2. If “longest” refers to the word with the most definitions or uses, it’s hard to definitively state which 5-letter word that would be, but “round” and “sound” are examples of five-letter words with a large number of definitions and uses.

      It’s also important to note that in English, all 5-letter words contain the same number of letters and would, therefore, have the same length if written in the same font and size.