• mindbleach · 8 days ago

    LLMs are fundamentally not designed to be reliable. That is not how they work. That is not the function they optimize. That is not the problem they solve.

    You’re making a fish climb trees.