• mindbleach · 8 days ago

    LLMs are fundamentally not designed to be reliable. That is not how they work. That is not the function they optimize. That is not the problem they solve.

    You’re making a fish climb trees.

  • zib@lemmy.world · 8 days ago

    Or, and hear me out, perhaps we could dump this mess and move on to the next tech fad?