The Corona pandemic is putting the world on hold. The only visible action seems to be toilet paper hoarding. The way this situation is evolving goes to show one thing: the world was not prepared. The real question, however, is whether we could have been prepared. From a strictly rational perspective the answer would of course be yes, but from a human, evolutionary perspective the answer is undoubtedly no.
Early warnings
There has been no lack of voices warning of the scenario we are now going through. Bill Gates, back in 2015, gave a brilliant TED talk warning of pretty much everything we are now experiencing. Most countries have well-funded authorities staffed with epidemiologists and other experts to predict and inform about pandemics, and even Hollywood contributes to the awareness with movies like Outbreak and Contagion.
Clearly the problem is not lack of information. It boils down to the fact that human beings are trapped in linear thinking, while major shifts, like epidemics, follow a chain-reaction pattern with an exponential trajectory.
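The gap between the two growth patterns can be made concrete with a small sketch. The numbers below (100 starting cases, 100 new cases per day for the linear case, a three-day doubling period for the exponential case) are assumed round figures chosen purely for illustration, not epidemiological estimates:

```python
# Illustrative only: contrasts linear growth with exponential doubling.
# All parameters are made-up round numbers, not real outbreak data.

def linear_growth(start, per_day, days):
    """Case count if a fixed number of new cases is added each day."""
    return start + per_day * days

def exponential_growth(start, doubling_days, days):
    """Case count if the total doubles every `doubling_days` days."""
    return start * 2 ** (days / doubling_days)

for day in (0, 9, 18, 27):
    lin = linear_growth(100, 100, day)
    exp = exponential_growth(100, 3, day)
    print(f"day {day:2d}: linear {lin:7.0f}   exponential {exp:7.0f}")
```

After nine days the two curves are still of the same order; after 27 days the exponential count is fifty times the linear one. Our intuition tracks the first stretch and misses the second, which is exactly why chain reactions feel like they "come out of nowhere".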
The nature of chain reactions
On the evening of November 9th, 1989, thousands of East Berlin citizens crossed the checkpoints to West Berlin and were greeted by their western neighbors with champagne and flowers. It seemed to come out of nowhere, sparked by a confused press conference on a not very radical, and temporary, change in travel regulations for East Germans. The East Berlin party leader Günter Schabowski held what turned out to be the most consequentially botched press conference ever. He was later quoted as saying: "The only thing you need to do in order to hold a successful press conference in the German Democratic Republic is to speak German and read from a note." Not quite, as it turned out.
The note he had received from Egon Krenz, the newly elected successor to the long-time GDR leader Erich Honecker, did not, to Schabowski's misfortune, capture the leadership's intentions as clearly as he was used to.
When asked by a journalist when the travel regulations were to come into effect, Schabowski got a rather confused look on his face, double-checked the note and replied: "As far as I know… effective immediately… ehh… without delay." After forty years of total party control of the country, including control of media and journalists, Schabowski was probably comfortable winging the situation. As history shows, it backfired, and when his statement was picked up by West German radio stations broadcasting into East Berlin, people made a run for the Wall.
In the morning hours of November 9th, not even the brightest analysts or the most well-informed intelligence agencies could have predicted that the world was at the dawn of a new era. The chain-reaction effect of this event is now a historical fact, with the collapse of the Soviet Union two years later and a radically different world order following the decades of Cold War.
Inherent inability
Human psychology has been shaped by millions of years of evolution in which linear thinking served survival best. The ability to connect the shadow of a lion to the urge to flee was highly rewarded by natural selection, and to this day our brains are hardwired to deal with threats of that kind. Large-scale, exponential chain reactions are simply incomprehensible to most humans.
So, if we were this ill prepared for a virus outbreak, the scenario that most experts have for decades claimed to be the most likely trigger of a global breakdown, what is the likelihood that we will cope with the threats around the corner?
The threats ahead
The scientific community, along with other experts on the topic, is deeply divided in the debate on artificial intelligence, or more specifically artificial general intelligence (AGI), the point at which machines, through exponential development, reach a human level of intelligence. Narrow AI is undoubtedly already here in the form of face recognition, digital assistants and self-driving cars.
Some experts argue that AGI, followed by a superintelligence explosion, will never materialize; others claim it will create heaven on earth; and some raise a warning flag that it might pose a substantial existential risk, as the exponential growth continues out of our control after the point of reaching human-level intelligence.
And it is not your average tin-foil-hat Twitter conspiracists making this claim. Max Tegmark, Bill Gates, Nick Bostrom, Elon Musk and Sam Harris are just a few from a large community of experts asking society to take the potential risk of artificial superintelligence seriously. This calls for an article of its own, but in short the risk is that an entity vastly more intelligent than humans might prioritize its own goals at the cost of humanity, much as humans have prioritized our own goals at the cost of other species, ecosystems and the climate.
I do not expect the politicians of today, who all seem to have been taken by surprise by such a well-documented and predictable threat as a virus pandemic, to be prepared for this scenario. And if the threat of artificial superintelligence ever materializes, toilet paper hoarding will definitely not be the rescue.