
12 ways the world could end, according to science

It’s the end of the world as we know it, but we don’t feel fine.

Here’s some cheery afternoon reading for you: scientists from the University of Oxford have written up a “scientific assessment about the possibility of oblivion”.

The research, from the Global Challenges Foundation and the Future of Humanity Institute, is designed to weigh up the current front-runners in the race to end humanity. “It is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities,” the paper’s authors say.

As you might imagine, it’s quite a heavy read, so we’ve taken the time to give you an easily digestible summary of how our wonderful planet may come to a grisly end. So sit down, get comfy and consider your mortality as we explore the 12 options.

Current Risks

1. Extreme Climate Change

In like a depressing bullet at number one is Climate Change.

While most politicians pay lip service to climate change, the report’s authors do not see enough of the global cooperation necessary to avert a climate-based disaster. The report claims that the world’s poorest countries are likely to take the brunt of the immediate damage on this one, with famines leading to huge migration trends and instability.

2. Nuclear War

First, the good news: the researchers state that nuclear war is less likely now than it was in the 20th century.

But we’re not out of the woods yet: “the potential for deliberate or accidental nuclear conflict has not been removed.”

3. Ecological Catastrophe

We have to either look after our planet, or – the slight cop-out option – hope that we can get away without: “It seems plausible that some human lifestyles could be sustained in a relatively ecosystem independent way, at relatively low costs.”

“Whether this can be achieved on a large scale in practice, especially during a collapse, will be a technological challenge and whether it is something we want is an ethical question,” says the report, letting us read between the lines as to what the answer to said question might be.

4. Global Pandemic

The good news keeps coming: “There are grounds for suspecting that such a high impact epidemic is more probable than usually assumed.”

Unlike the other points in the report, this one is more out of our collective hands: there could just be an uncontrollable infectious disease out there waiting to strike, which our medicines will not be able to stop.

5. Global System Collapse

We’ve never before had such an interconnected economy, and it’s fine when everything is working as expected. But when it isn’t? That’s right: riots, civil unrest and a breakdown of law and order.

“The world economic and political system is made up of many actors with many objectives and many links between them. Such intricate, interconnected systems are subject to unexpected system-wide failures caused by the structure of the network.”

Exogenic Risks

6. Major Asteroid Impact

Asteroids come in a variety of sizes, but should one larger than five kilometres across hit the Earth, it would destroy an area the size of Holland. How often do these happen? Ooh, every 20 million years or so.

It’s not just the initial impact, either. Clouds of dust from the hit would be projected into the atmosphere, potentially affecting the climate and causing biosphere damage. As you’ll read elsewhere here, these tend to affect food availability and cause a break-down in law and order.

7. Supervolcano

A supervolcano, like the asteroid hit, would cause destruction well beyond its immediate damage: a global volcanic winter caused by dust clouds blocking the sun. How severe? “The effect of these eruptions could be best compared with that of a nuclear war.” That’s bad, isn’t it?

Emerging Risks

8. Synthetic Biology

Scientists are pretty good at their jobs, and the report is concerned that in the wrong hands, an ‘engineered pathogen’ could be created that could wipe out the human race.

But surely any unhinged scientist hoping to do that would be stopped in their tracks? “Attempts at regulation or self-regulation are currently in their infancy, and may not develop as fast as research does.” Oh.

9. Nanotechnology

Nanotechnology may seem pretty innocuous, but everything can be weaponised, and miniaturised technology makes it far easier to amass a dangerous arsenal.

Or as the report puts it, “This could lead to the easy construction of large arsenals of conventional or more novel weapons made possible by atomically precise manufacturing.” They speculate nanotechnology could allow for nuclear bomb construction, which takes us right back to Doomsday Scenario #2 in the list…

10. Artificial Intelligence

First the good news: advanced artificial intelligence “could easily combat most other risks in this report, making extremely intelligent AI into a tool of great potential.” Great! Problems solved, let’s break for lunch!

Oh, wait:

“Such extreme intelligences could not easily be controlled (either by the groups creating them, or by some international regulatory regime), and would probably act to boost their own intelligence and acquire maximal resources for almost all initial AI motivations. And if these motivations do not detail the survival and value of humanity, the intelligence will be driven to construct a world without humans.” 

Right. Back to the drawing board.

11. Unknown Consequences

It may feel like a cop-out, a ‘cover-all’ option for ‘none of the above’, but Unknown Consequences covers, well, the unknown. “They constitute an amalgamation of all the risks that can appear extremely unlikely in isolation, but can combine to represent a not insignificant proportion of the risk exposure.”

Want an example? “One resolution to the Fermi paradox – the apparent absence of alien life in the galaxy – is that intelligent life destroys itself before beginning to expand into the galaxy.” So literally cool your jets, NASA.

Global Policy Risk

12. Future Bad Global Governance

Governments are, in a sense, ‘damned if they do, and damned if they don’t’ in the report: “There are two main divisions in governance disasters: failing to solve major solvable problems, and actively causing worse outcomes.”

Extremely hard to predict, but things might get better: “Technological, political and social change may enable the construction of new forms of governance,” the report explains, before adding that these may be, “either much better or much worse.” Huh.

So there you have it: 12 ways the world might end. Now collectively, as a society, let’s roll the dice, shall we?

Cheer up! You’ve got to the end of this article without the world ending. Probably.
