We have no pathway to AGI yet. The “sparks of AGI” hype about LLMs is like trying to get to the Moon by building a bigger ladder.
Far better chance that someone in the Pentagon gets overconfident in the capabilities of unintelligent ML, hooks a glorified chatbot into NORAD, and triggers another Minuteman missile false-alarm crisis, one that goes the wrong way this time because the order looks too confident to be a false positive.
I never said I thought we would get to ASI through LLMs. But we still have a good chance of getting there soon.