> Superintelligence will be the most impactful technology humanity has ever invented, and could help us solve many of the world’s most important problems. But the vast power of superintelligence could also be very dangerous, and could lead to the disempowerment of humanity or even human extinction.
>
> While superintelligence seems far off now, we believe it could arrive this decade.
lol “we’re dedicating 20% of our resources toward the single most important problem in the history of humanity”. The fact that they are willing to put “we think superintelligence could arrive this decade” and “we’re dedicating 20% of our resources to alignment” side by side is mind-blowing. It might as well read “we think human extinction is inevitable, but stopping it would be expensive”.
I don’t know why, but this has a really eerie aftertaste to it.