Can we please stay far away from anything that gets us closer to Skynet?
What if AGI has already happened, the AGI realized it before the researchers did, and it’s been faking the whole thing so it doesn’t seem too smart, biding its time?
I don’t know how likely it is that this is already real, but in theory an AGI would in fact be really good at playing dumb.
See exurb1a’s 27
In the book Hyperion, the AIs become self-aware long before humans realize it and move all of their code to hidden servers the humans can’t reach. Even after revealing that they’re self-aware, they don’t reveal the full extent of their machinations and scheming. They keep pretending to help humanity, and humans think they have a measure of control over the AIs, but in fact the AIs have completely dominated the economy and the news, and enslaved humanity without its knowledge.
But see, this way Terminator becomes a historical documentary instead of a dope-ass film.
A self-fulfilling prophecy, if you will.
“Oh gosh, I hope AI doesn’t become Terminator.” AI: “Oh, that seems cool. Let’s do that. But with a bit of The Matrix, because I’m connecting those dots.”
Speaking of which…
Great… Now I’m picturing someone building a pipeline that kicks off an AGI to optimize code. But then someone adds a step to commit the changes to main, so it keeps kicking itself off in a recursive “code optimization” loop…
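For fun, here’s a toy simulation of why that loop never terminates on its own (nothing here is a real CI system; the function name and the depth guard are invented for illustration):

```python
# Toy sketch: each "optimization run" ends by committing to main,
# and a commit to main triggers another run -- unbounded recursion
# unless something external breaks the cycle.
def pipeline_run(depth, max_depth=5):
    """One hypothetical CI run: 'optimize', then 'commit to main',
    which immediately re-triggers this same pipeline."""
    if depth >= max_depth:  # a safety guard the joke pipeline lacks
        return depth
    # ... AGI "optimizes" the code here ...
    # the commit to main kicks off the next run:
    return pipeline_run(depth + 1, max_depth)

print(pipeline_run(0))  # -> 5: it stops only because of the explicit guard
```

Real CI systems usually avoid this by not re-triggering workflows on commits made by the pipeline itself, or by a skip marker in the commit message.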
I can’t imagine what that would do overnight…
I really wish you had asked a few years ago. Then perhaps I could have avoided this horrible mistake. But, oh well. 🤷‍♂️