• Tar_Alcaran

    Because 20Q is basically a big decision tree, and LLMs don’t make decisions; they generate output based on what other output looks like.
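
    To make the decision-tree point concrete, here’s a minimal sketch (my own illustration, not the game’s actual implementation): each yes/no answer follows one branch, so 20 binary questions can separate up to 2^20, roughly a million, candidates. The `Node` class and the tiny example tree are hypothetical.

    ```python
    # Minimal 20Q-style decision tree: internal nodes hold a yes/no
    # question, leaves hold a final guess.

    class Node:
        def __init__(self, question=None, yes=None, no=None, guess=None):
            self.question = question  # internal node: a yes/no question
            self.yes = yes            # subtree taken on a "yes" answer
            self.no = no              # subtree taken on a "no" answer
            self.guess = guess        # leaf node: the final guess

    # Tiny hypothetical tree; a real 20Q tree is far larger.
    tree = Node(
        question="Is it alive?",
        yes=Node(question="Is it a plant?",
                 yes=Node(guess="an oak tree"),
                 no=Node(guess="a dog")),
        no=Node(question="Is it bigger than a breadbox?",
                yes=Node(guess="a car"),
                no=Node(guess="a coin")),
    )

    def play(node):
        """Walk the tree, asking questions until a leaf (a guess) is reached."""
        while node.guess is None:
            answer = input(node.question + " (y/n) ").strip().lower()
            node = node.yes if answer.startswith("y") else node.no
        print("Is it", node.guess + "?")

    if __name__ == "__main__":
        play(tree)
    ```

    Every answer commits the game to one branch and discards the rest, which is exactly the kind of hard decision an LLM’s next-token sampling doesn’t make.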