• mindbleach · 4 hours ago

    “Mistake” is a misguided label. This system has no idea what’s real. It’s just completing plausible sentences.

    It’s not doing critical analysis. It’s guessing words. It’s eerily close, sometimes - but all these efforts to make it an oracle are a soup sandwich.
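    To make "guessing words" concrete: here's a toy bigram model - nothing remotely like a real transformer, and the corpus is made up for illustration - but it shows the same failure mode. It picks continuations by how often they appeared, so a frequent-but-false sentence is exactly as "good" to it as a true one.

    ```python
    import random
    from collections import Counter, defaultdict

    # Toy corpus: the model only ever sees word sequences, never facts.
    corpus = (
        "the capital of france is paris . "
        "the capital of france is nice . "
        "the capital of spain is madrid ."
    ).split()

    # Count which word follows which (a bigram table).
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def complete(word, steps=4):
        out = [word]
        for _ in range(steps):
            counts = following[out[-1]]
            if not counts:
                break
            # Continuation weighted by frequency - plausibility, not truth.
            words, weights = zip(*counts.items())
            out.append(random.choices(words, weights=weights)[0])
        return " ".join(out)

    print(complete("capital"))
    ```

    Run it a few times and it completes "capital of france is nice" about as readily as "paris" - frequency decides, not geography. Scaling that up buys fluency, not a fact-checker.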

    LLMs are not the kind of neural network that will accomplish this task reliably. It's simply not what they're for. Plausibility will suffice when drawing a hand - but ask it to draw the back of *your* hand, and it has no such information, though it may try anyway.