• Kecessa · 4 hours ago

    Pigeon = edible bird

    Cleaning a bird = preparing a bird after killing it (hunting term)

    The AI figured the “rescued” part was either a mistake or that the person wanted to eat a bird they had rescued

    If you search for “how to clean a dirty bird” you give it better context and it comes up with a better reply

    • DannyBoy · 3 hours ago

      The context is clear to a human. If an LLM is giving advice to everybody who asks a question on Google, it needs to do a much better job of giving responses.

        • bluewing@lemm.ee · 38 minutes ago

          Honestly, perhaps more people ask about how to clean and prep a squab than about rescuing a dirty pigeon. There are a LOT of hungry people and a LOT of pigeons.

        • HereIAm@lemmy.world · 1 hour ago

          Ah yes. I always forget to remove the label from my hunted bird. Cleaning “the top bone” is such a chore as well.

      • bluewing@lemm.ee · 40 minutes ago

        Pigeons bought in a grocery store (sold as squab) are usually cleaned and prepped for cooking. So while the de-boning instructions were not good, the AI wasn’t technically wrong.

        But while a human can make the same mistake, and many here just assume the question was about how to wash a rescued pigeon (maybe that’s not the original intent), what a human can do that AI cannot is ask for clarification about the original question and its intent. We do this kind of thing every day.

        At the very best, AI can only supply multiple different answers when a question is poorly worded or it misunderstands something in the original question (and they seem to be very bad at even that, or simply can’t do it at all). And we would need to be able to choose the correct answer from the several provided.

    • FlorianSimon · 3 hours ago

      I like how you’re making excuses for something that is very clear in context. I thought AI was great at picking up context?

      • lunarul@lemmy.world · 1 hour ago

        I thought AI was great at picking up context?

        I don’t know why you thought that. An LLM splits your question into separate words and assigns scores to those words, then looks up answers relevant to those words. It has no idea how those words relate to each other. That’s why LLMs couldn’t answer how many "r"s are in “strawberry”: they assigned the word “strawberry” a lower relevancy score in that question. The word “rescue” is probably treated the same way here.
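
        One concrete way to look at the “strawberry” failure is through subword tokenization: the model sees token pieces rather than individual letters. A minimal sketch, assuming the tiktoken library and its cl100k_base encoding purely for illustration (other tokenizers split the word differently):

```python
# Sketch: show the subword pieces a GPT-style tokenizer produces for "strawberry".
# Assumption: tiktoken with the cl100k_base encoding; this is illustrative only.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
token_ids = enc.encode("strawberry")
pieces = [enc.decode([t]) for t in token_ids]

# The chunks the model actually "sees"; none of them is a single letter.
print(pieces)

# Counting the letter "r" requires dropping back to the character level,
# a view the model never gets directly.
print(sum(piece.count("r") for piece in pieces))
```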

      • iAmTheTot · 2 hours ago

        I don’t think they are really “making excuses”, just explaining how the search came up with those steps, which is what the OP is so confused about.

    • huginn@feddit.it · 3 hours ago

      Let me take the tag off my bird, then snap its wings back together