• Tar_Alcaran · 14 days ago

    Because 20 Questions is basically a big decision tree. And LLMs don’t make decisions; they generate output based on what other output looks like. A sketch of what I mean is below.
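    To make the decision-tree point concrete, here’s a minimal sketch (the questions and answers are my own made-up illustration, not how any real 20Q product works): every yes/no answer commits to a branch and discards half the candidates until one guess is left, which is a discrete, stateful choice rather than next-token prediction.

    ```python
    # Minimal illustration of 20 Questions as a decision tree (hypothetical
    # questions/answers): each yes/no answer follows one branch until a leaf
    # (the final guess) is reached.

    tree = {
        "question": "Is it alive?",
        "yes": {
            "question": "Is it an animal?",
            "yes": "dog",
            "no": "oak tree",
        },
        "no": {
            "question": "Is it electronic?",
            "yes": "laptop",
            "no": "rock",
        },
    }

    def play(node, answer_fn):
        """Walk the tree, asking questions until a leaf (the guess) remains."""
        while isinstance(node, dict):
            node = node["yes"] if answer_fn(node["question"]) else node["no"]
        return node

    # Example run: answering "no" then "yes" narrows it down to "laptop".
    answers = iter([False, True])
    print(play(tree, lambda q: next(answers)))  # -> laptop
    ```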