Asteroids! Solar Storms! Nukes! Climate Calamity! Killer Robots!

A guide to contemporary doomsday scenarios — from the threats you know about to the ones you never think of.

And here we are at the crux of our existential predicament as a species: There are just so many things we don’t want to happen. There are so many potential doomsdays.

This is not the cheeriest topic, to be sure, but it’s endlessly fascinating if you can stomach it. What are our biggest existential risks? Should we feel more threatened by low-probability but high-consequence risks, such as asteroid impacts and runaway artificial intelligence (robot overlords and whatnot), or should we focus on less exotic, here-and-now threats such as climate change, viral pandemics, and weapons of mass destruction? And should we even worry about low-probability risks when hundreds of millions of people right now lack adequate food, water, and shelter and are living on less than $2 a day?

What about a potentially catastrophic misuse of genetic engineering, including the revolutionary CRISPR gene-editing technique? I posed that question to Jennifer Doudna, the Nobel laureate who co-invented CRISPR and who has been outspoken in warning against misuse of the technology. By email, she pointed out that researchers are using the technology to help humanity on multiple fronts, including health, agriculture and climate strategies. As for existential risks, she wrote, “currently there are significant technical as well as knowledge barriers to using genetic engineering in ways that could threaten our society at scale.”
