Huge catastrophes have occurred throughout history, some caused by the human race and others by the unforgiving hand of Mother Nature. But what could possibly wipe out the whole human race? And what is the chance of it happening over the next century?
Unsurprisingly, the chance of humanity being wiped out by 2100 is highly uncertain. Estimates range from a worrying 50 per cent, according to astronomer and former master of Trinity College Cambridge, Martin Rees, in his 2003 book Our Final Hour, to the more conservative, but still concerning, 10 per cent detailed in the 2006 Stern report. Ten per cent! It’s really quite alarming, even if it is incredibly subjective. Leicester City had a smaller chance of winning the Premier League this year, and we all know what happened there. Why is it such a high figure? And what is the biggest threat to the human race?
‘the historical resilience of the human race to environmental catastrophes and biological threats has meant that the estimated risk is incredibly low’
The historical resilience of the human race to environmental catastrophes and biological threats means that the estimated risk from natural events is incredibly low. It would take a great deal of bad luck for a meteor impact or intense volcanic event to occur on a large enough scale to disrupt the climate system so severely that we could not survive the next 100 years. So far, life has always found a way to keep going; however, the increase in extreme climatic events such as droughts and storms, and the melting of the ice sheets, are rarely taken into consideration in these estimates.
As is usually the case, we’re most likely to be the biggest problem. Our own lack of knowledge around technological advances could, ironically, be the cause of our downfall… how poetic. In 2008, the Future of Humanity Institute at the University of Oxford suggested that the development of molecular nanotechnology weapons and of super-intelligent artificial intelligence (AI) were the biggest threats to the existence of mankind, each with a five per cent likelihood of wiping us off the face of the planet. Our own bloody desire to kill each other is the next most likely extinction scenario, although the scale of nuclear war is not believed to be large enough to cause the extinction of human beings. If we’re all going to die before 2100, it is more likely to be at the hands of traditional weapons.
Despite being the most uncertain, the threats of molecular nanotechnology and super-intelligent AI are by far the most interesting. Molecular nanotechnology hypothetically works through a process known as mechanosynthesis: the use of mechanical constraints to guide reactive molecules into bonding at specific sites, effectively controlling reaction outcomes. This technology could provide us with the opportunity to build new, dangerous weaponry.
‘the main hypothetical risk of molecular nanotechnology is the potential for a phenomenon known as the “technological singularity” to develop’
The main hypothetical risk of molecular nanotechnology is the potential for a phenomenon known as the ‘technological singularity’ to develop. The same idea applies to AI. A technological singularity is the point at which technology can self-replicate and improve itself, causing a runaway effect that surpasses any human ability to counteract it. It is believed that, much as cancer does, nanotechnology weapons could grow uncontrollably, becoming the dominant force on the planet. It all sounds very sci-fi, but our lack of hard knowledge about it is what makes it a risk. They say that curiosity killed the cat. Well, it looks like it might kill the entire human race too. Great.