Global Catastrophic Risk

By Michael Cook

A report calculates that we’re more likely to die in an extinction event than in a car crash.

I have not been blessed with a refined taste in cinema: my favourite movie franchise is the Terminator series, especially the second and third films, in which Arnie is in peak form. Alas, there’s not enough space here to reminisce, so let’s confine ourselves to the premise.

On 29 August 1997, Skynet, an artificial intelligence system created by the US Defense Department, became self-aware. Its programmers panicked and tried to deactivate it. Skynet defended itself by provoking a nuclear exchange in which three billion people died and the rest were enslaved or hunted down. Until John Connor organised the Resist…

Sorry, we must stop here as I’ve promised the Editor I’d talk about ethics.

There is a minor academic industry in studying the ethics of global existential risks like a machine super-intelligence taking over the world. The Global Priorities Project and the Future of Humanity Institute, both based at Oxford University, recently produced the Global Catastrophic Risk 2016 report.

According to their calculations, the extinction of the whole human race is far from a negligible possibility. Experts have suggested that the risk is 0.1% per year, and perhaps as much as 0.2%. While this may not seem worth worrying about, these figures actually imply that “an individual would be more than five times as likely to die in an extinction event than a car crash”.
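To see how a figure that small supports that comparison, here is a rough back-of-envelope check of my own (not from the report), assuming a 70-year lifespan, the 0.1% annual figure, and the commonly cited lifetime odds of roughly 1 in 100 of dying in a car crash:

P(extinction within 70 years) = 1 − (1 − 0.001)^70 ≈ 0.068, or about 6.8%
P(dying in a car crash over a lifetime) ≈ 0.01, or about 1%

The ratio is roughly 6.8, consistent with the report’s “more than five times”.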
