• CameronDev@programming.dev · 45 up · 1 year ago

    It hasn’t been publicly announced yet, but Enya has bought the country formerly known as Kenya and has removed the K as marketing for their new album. The AI just worked it out before the announcement.

  • You999 · 19 up, 1 down · 1 year ago

    Kenya give me a hint on what the joke is?

    • dfyx@lemmy.helios42.de · 15 up, 1 down · edited · 1 year ago

      No joke here. Large language models (which people keep calling AI) have no way of checking whether what they’re saying is correct. They are essentially just fancy text-completion machines that answer the question “what word comes next?” over and over. The result looks like natural language but tends to have logical and factual problems. The screenshot shows an extreme example of this.

      In general, never rely on any information an LLM gives you. It can’t look up external information that wasn’t in its training set. It can’t solve logic problems. It can’t even reliably count. It was made to give you a plausible answer, not a correct one. It’s not a librarian or a teacher; it’s an improv actor who will “yes, and” everything. LLMs will often make up information rather than admit that they don’t know. As an easy demonstration, ask ChatGPT for a list of restaurants in your home town that offer both vegan and meat-based options. More often than not, it will happily make you a list with plausible names and descriptions, but when you google them, none of the restaurants actually exist.
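      To make the “fancy text completion” point concrete, here is a minimal, purely illustrative Python sketch of that loop: a hand-made next-word table stands in for a real model (the words and probabilities below are invented for the example), and the program just keeps appending whichever word looks plausible, with no step that checks the result against reality.

      ```python
      import random

      # Toy stand-in for an LLM: for each word, a list of (next word, probability).
      # These entries are made up; a real model learns billions of such statistics.
      NEXT_WORD = {
          "the":      [("country", 0.5), ("album", 0.5)],
          "country":  [("formerly", 1.0)],
          "formerly": [("known", 1.0)],
          "known":    [("as", 1.0)],
          "as":       [("Kenya", 0.6), ("Enya", 0.4)],  # plausible, never fact-checked
      }

      def generate(prompt: str, max_words: int = 8) -> str:
          """Answer 'what word comes next?' over and over and append the guess."""
          words = prompt.split()
          for _ in range(max_words):
              candidates = NEXT_WORD.get(words[-1])
              if not candidates:
                  break  # the toy table has no continuation; a real model always does
              choices, weights = zip(*candidates)
              words.append(random.choices(choices, weights=weights)[0])
          return " ".join(words)

      print(generate("the"))  # e.g. "the country formerly known as Kenya" (or "... Enya")
      ```

      Nothing in that loop knows or cares whether the sentence it produces is true; it only knows what tends to follow what, which is why the output can sound confident and still be wrong.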

      • PM_ME_FAT_ENBIES@lib.lgbt · 1 up · 1 year ago

        Large language models (which people keep calling AI)

        As my AI teacher used to say, it’s AI until someone builds it.