Tuesday, May 24, 2016

Think Of "Dr. Strangelove" on Steroids


The good news is that your chance of dying in a car accident isn't very high, about one in 120 in the United States.

The bad news is that the risk of an average person dying from an extinction event is five times greater than the car accident risk.

So we demand cars that have the best brakes, stability control systems, crash absorbing zones, air bags, seat belts and more. We spend a fortune to build and maintain our highways and hire police to enforce our traffic laws.

Well then, what are we spending on that extinction-event risk, the far more dangerous threat? When you add it all up, it comes out to just about bugger all.

When it comes to extinction-level risks, there are several. Climate change and nuclear war are 1 and 2. Britain's astronomer royal, Baron Martin Rees, gives us no better than a 50-50 chance that we'll succumb this century to what he calls "bio-terror or bio-error." According to Rees, now that we've privatized scientific research, there's stuff going on in the big corporate labs, unmonitored, that could easily wipe us out if someone goofs up or if it falls into the wrong hands. He gives many examples in his book, Our Final Hour, that - trust me - you probably don't want to read.

I was born at the start of the Cold War and grew up under the constant threat of nuclear Armageddon. When the Soviet Union collapsed, we felt a weight had been lifted off our shoulders. The Dr. Strangelove era was over. Well, that didn't last long. It's back. More states have nuclear arsenals, and some of them are much more likely to use them than anyone who had them back in the 70s. And now we've got all these other extinction-grade threats.

"[N]early all of the most threatening global catastrophic risks were unforeseeable a few decades before they became apparent. Forty years before the discovery of the nuclear bomb, few could have predicted that nuclear weapons would come to be one of the leading global catastrophic risks. Immediately after the Second World War, few could have known that catastrophic climate change, biotechnology, and artificial intelligence would come to pose such a significant threat."

There are no easy answers. Confronting these challenges, reducing the risks to something manageable, something survivable, is going to require a different model of organization and governance. We'll have to redefine society as we've known it, right down to our notions of basic citizenship. That's because we'll never be able to reduce this plethora of risks nearly enough, which means we must also focus on building our resilience, our ability to cope and adapt. You can't do that with social cohesion in tatters, as it is today.

It will take real leadership and vision of a calibre we haven't known for years.

2 comments:

Anonymous said...

When it comes to extinction-level risks there are several. Climate change and nuclear war are 1 and 2.

This is why Noam Chomsky calls the Republican Party "candidates for the most dangerous organization in human history." On climate change, their policies would ensure that our grandchildren have the worst life possible. On nuclear war, they would increase the risk of it through further militarization.

Cap

The Mound of Sound said...


Neoliberalism seems to have metastasized of late, Cap. Unfortunately it's all headed in one direction, straight into a wall.