Humanity, like more than 99% of the species that have ever existed, will eventually become extinct, and that may happen within the lifetime of some people already alive. What are the risks that could destroy our civilization and Humanity? In the most general terms, these risks can be grouped, according to their scope and severity, as global catastrophic or existential (terminal). A "global catastrophic risk" is any risk that is global in scope but endurable: it may kill the vast majority of life on Earth, yet humanity could still potentially recover. An "existential risk", on the other hand, is terminal in intensity and transgenerational in effect: it could destroy all human, non-human and even plant life.
An example of a global catastrophic risk that could destroy the world in a material sense, and at the same time potentially eliminate the human race, is a global nuclear war. It could immediately wipe out most of the material substance of civilization (towns, infrastructure, food crops, etc.) and, in the longer term, kill all remaining people through radiation, lack of food and the onset of a post-nuclear winter. A global pandemic is an example of a risk that could become existential, although the COVID-19 pandemic of early 2020, which was indeed a global natural pandemic, could at worst be categorized as a catastrophic event. Existential pandemics may, however, also be caused by the accidental or intentional release from a laboratory of a deadly virus, which could wipe out the human race within weeks while leaving the infrastructure undamaged, at least for a few years.
Furthermore, both global catastrophic and existential risks can be divided into two broad groups: anthropogenic, which Humanity can to some extent control and mitigate (e.g. global warming), and non-anthropogenic, over which we have no control (e.g. an asteroid impact).
Some catastrophic risks are natural, such as supervolcanoes or the impact of a large asteroid, either of which can cause the extinction of many species; Earth has experienced such mass extinctions in the past. Global warming also belongs here: in the past it was natural, whereas today it is anthropogenic, i.e. man-made. Other anthropogenic risks include pandemics caused by artificially engineered, biologically active viruses, and nuclear wars. Perhaps that is why Nick Bostrom of the Future of Humanity Institute believes that human extinction is more likely to result from anthropogenic causes than from natural ones.
If our civilization is to survive, we need to apply some powerful risk mitigation strategies. We have hardly any control over the non-anthropogenic (natural) risks, but we do have control over political, social, economic and technological risks. The anthropogenic (man-made) existential risks can be split into categories that may require different approaches:
Risks that are immediate and may become existential within days or even hours:
- Global nuclear war
- Weaponized AI or cyber wars
- Engineered pandemics and synthetic biology
- Nanotechnology and experimental-technology accidents
- Unknown risks, mainly technology-oriented
Risks that may become existential progressively:
- Climate Change over a long time (at least over a century)
- Superintelligence in a short time (over a decade)
The only way we can avoid extinction is to evolve into another species, much as some dinosaurs evolved into birds. So how can humans evolve into another species? I believe it can be done in three stages:
- Control the development of AI until it matures into a Superintelligence
- Start a period of coexistence with Superintelligence
- Gradually expand the process of transforming humans into Transhumans until they completely merge with Superintelligence
We have entered the most uncertain period in the existence of humankind. You can judge for yourself whether this is an exaggeration or an understatement by browsing this website.