• Paul Edwards of Stanford University considers himself a “man of the apocalypse” and has developed a freshman course on preventing human extinction.
• Together with epidemiologist Steve Luby, he focused on three familiar threats to the species: global pandemics, extreme climate change and nuclear winter.
• The two also view advanced artificial intelligence as a threat, calling it “super-serious”.
• Open Philanthropy awarded Luby and Edwards more than $1.5 million in grants to launch the Stanford Existential Risk Initiative.
• Critics call the AI safety movement unscientific and say its existential risk claims can sound closer to religion than research.
• Edwards still sees climate change as a major threat, but he thinks AI safety deserves serious consideration as well.