@Sixth0795 to Science [email protected] • English • 4 months ago
Huh
minus-square@[email protected]linkfedilinkEnglish75•4 months agoIt all comes down to the fact that LLMs are not AGI - they have no clue what they’re saying or why or to whom. They have no concept of “context” and as a result have no ability to “know” if they’re giving right info or just hallucinating.
minus-square@[email protected]linkfedilinkEnglish1•4 months agoHey, but if Sam says it might be AGI he might get a trillion dollars so shut it /s