Oxford Future of Humanity Institute Knows What Will Make Us Extinct
How often do you ponder exactly what will drive humans into extinction?
Well, if you're a researcher at Oxford University's Future of Humanity Institute, probably a lot. The institute's new research paper, entitled "Existential Risk As A Global Priority" (which they could have much more catchily called "Dude, This Might Be It For Us"), calls on world leaders to confront the risk of humanity, um, wiping itself out.
We'll explain. Your high school English teacher might have told you that, within fiction, there are three major conflicts: man against man, man against nature, and man against himself. The institute examines each possibility to figure out which one is most likely to end our time as a species, and here's a hint: it probably won't end like the movie 2012 did. (Also because it's 2013, so we've already beaten that one.)
Meteors? Floods? Earthquakes? Climate change? Well, yes, all valid candidates for extinction, says the institute — but let's not forget that Earth is teeming with humans these days. It's pretty unlikely that any one natural disaster will wipe out every single human, plus that's all a bit been there, done that — we've already survived everything the universe (the literal universe, not the karmic one) has so far thrown at us.
According to the institute, the culprit for wiping us all off the face of the earth will probably be — slightly underwhelmingly — science.
Yup, science: the same thing that's doubled our life expectancy in the last century and a half, wiped out smallpox, and, um, everything else that science has done (there's a Wikipedia page, if you're interested). What will probably happen, says the institute, is some variant of a bad experiment gone awry, a finding getting into the wrong hands, or a ton of robots. The institute's chief scientist likened the field of science to a child playing with a dangerous weapon, maybe forgetting that he himself is a leading scientist.
So what kind of experiments might go awry? Well, synthetic biology, for one — growing human organs, prolonging life, and so on. Researchers are already growing tiny organs from human cells all over the world. In other extinction risks, we have the rise of the machines: robots are becoming increasingly intelligent, much like in every sci-fi movie starring Will Smith. And then we have nanotechnology — engineering at the molecular scale — which could one day yield weapons nastier than today's nuclear and chemical arsenals.
The problem is, thanks to the last century of innovation (see another Wikipedia page), the human race now faces more risks than it ever has before. More worryingly, unlike with natural disasters, humanity has absolutely zero track record of surviving these new, self-made threats. The institute argues that the question of what is most likely to end humanity deserves serious attention: "It's not science fiction, religious doctrine or a late-night conversation in the pub," said its chief. "There is no plausible moral case not to take it seriously."
The BBC certainly takes it seriously. From its article on the research:
So should we be worried about an impending doomsday? This isn't a dystopian fiction. It's not about a cat-stroking villain below a volcano. In fact, the institute in Oxford is in university offices above a gym, where self-preservation is about a treadmill and Lycra. Dr Bostrom says there is a real gap between the speed of technological advance and our understanding of its implications...
A treadmill and Lycra, eh? Now there's the real demise of the human race.
(Image: Flickr/Kevin Dooley)