For many years I have been an active supporter and staff member of the Lifeboat Foundation, from the time it consisted of only a few members. I have always had a strong affinity for technology, and as the potential of technologies like nanotechnology, artificial intelligence, computing, and biotechnology to improve the human condition became more obvious, I came to consider myself an extropian, or transhumanist. An extropian embraces the opposite of entropy, which in physics terms is the progression into useless disorder that any system without intelligent input tends toward; its opposite, extropy, is a continual progression into new patterns, greater order, more information, and general growth and progress.
The coming age of nanotechnology, AI, and biotechnology may prove little different from our current age, or it may prove so profoundly different that it is almost impossible for us even to conceptualize it (advocates refer to this as the ‘singularity’). The end of aging, disease, and possibly even death, and certainly of starvation, war, and poverty, all seem within the realm of the physically possible. Having been introduced to the Extropy Institute by Skeptic Magazine, and long steeped in scientific skepticism, I have always kept in mind that it is all too easy for a secular humanist and technophile to extrapolate possible benefits into certainties on the basis of mere wishful thinking, ending up with an attitude toward life, death, and immortality that carries religious tones disguised in technobabble. Many members of these technophile groups seem guilty of this to me, often ready, like good theists, to sit back and wait for the salvation of the singularity to come and save us all. The collapse of civilization into the Dark Ages demonstrates clearly enough to me that continued technological progress is not guaranteed.
After a long involvement in these organizations, coupled with my skeptical attitude, I became more concerned with the potential threats that some of these technologies could pose. So I became an active early supporter, and later an employee and staff member, of the Lifeboat Foundation. Alongside the great benefits that may come are some obvious and ominous dangers. Could a runaway self-replicating nanotechnological device consume the Earth's biosphere and destroy all life on the planet? Could an extremely deadly virus be genetically engineered to target specific ethnic groups, using equipment found in any university? Many advocates of technological growth are absolute optimists, seeing no possible way any harm could come from any of these technologies. Others sit at the other end of the spectrum: luddites seeking an outright curtailment of all technological growth in these potentially harmful areas. As membership has grown and hostility toward the Lifeboat Foundation from active members of these technophilic groups has declined over the years, I see (in my own small, anecdotal assessment) a pattern of more rational caution emerging. While I remain optimistic, it cannot hurt to have a deep understanding of all the possible implementations of potentially harmful technology; in some cases, obvious and simple mechanisms can be put in place to reduce the chances of any dangerous technology being released, whether intentionally or accidentally.
As a corollary to this understanding of the risks these technologies may pose, one cannot help but come to an understanding of the entirely natural threats that civilization, and indeed all life on Earth, also faces, and of the dire need to identify those threats and work to mitigate them. Consider that the last time a caldera volcano erupted on Earth, it likely brought the entire adult human population of the planet down to about 1,000 individuals, the closest humanity has ever come to complete extinction. Everyone is familiar with asteroid and comet impacts, but little attention is actually paid to identifying these threats and working to mitigate them. Other dangers, like a nearby supernova, a rogue planet, or a black hole, pose very serious threats, as does radical climate change, both natural and unnatural (if you think a few feet of water from global warming would be bad, consider over a mile of ice covering most of the cities on Earth). A recent informal poll of Lifeboat Foundation supporters, who now include over 500 accomplished scientists, authors, futurists, and leading thinkers in their fields, ranked the threat from global warming next to last, just above “Alien Invasion”. Even by the worst estimates made, global warming simply is not a civilization killer. Indeed, as far as I am concerned, you have no business holding an opinion about global warming and what ought to be done about it without a very clear understanding of ALL the existential threats humanity and life on Earth face, and a cohesive prioritization of those threats. The ‘consensus’ among these informed individuals, experienced in the full range of existential threats, was that the greatest danger we face is not global warming but a sudden and catastrophic pandemic that wipes out enough life to collapse industrial civilization, leading us into a new dark age from which we may never recover.
The most vocal advocates of catastrophic climate change from global warming are similarly minded to the group I mentioned above, the luddites, who would essentially seek a curtailment of technological growth, and ultimately of industrial civilization, in order to prevent the threats that might come from new technologies. But such a path of local sustainability and small global populations, while stopping global warming and possibly averting the threat of new technologies (secret organizations will likely still pursue these technologies, but now without oversight), would essentially sentence all life on Earth to certain death. While the threats that new technologies may pose are still unclear, it is VERY clear that the natural environment, from the Earth to the solar system to our local region of the galaxy, poses very serious threats that routinely wipe out huge portions of life on Earth (there have been a handful of mass extinctions, each typically killing more than 60% of all species in a geological instant). Stopping technological and industrial growth would mean we could do essentially nothing in the face of the next great cosmic threat; that giant asteroid won’t give a damn what your carbon footprint is. What we need is rapid, rational industrial and technological growth across the globe, in order to afford and achieve the dispersal of intelligent life throughout the galaxy. Technological optimists, including myself, envision a future in which humanity and intelligent life have spread widely (seriously reducing the threat that any particular risk poses) and the Earth is cultivated as a giant national park, honored and revered as the birthplace of life in the galaxy.
Three principles underlie much of my strong support for the Lifeboat Foundation; you can read how I argue the Fermi Paradox, the Drake Equation, and the Doomsday curve relate to the Lifeboat Foundation in my previous post on existential threats, linked below.
In recent years the Lifeboat Foundation has experienced tremendous growth and strong momentum. Where other organizations tackle specific threats, such as the Singularity Institute (examining the threat that artificial intelligence may pose) or the Foresight Institute (examining the threats that nanotechnology could pose), we work in concert with them; where little attention is being paid to a particular threat, the Lifeboat Foundation seeks to develop an in-depth understanding of it, and in all cases works to mitigate the overall threat these things pose. These mitigation strategies may be as simple as ensuring oversight of particular technologies by a free, representative organization, building dedicated containment facilities in which to do the most dangerous work, or restricting critical information to the scientists and technical staff who have a legitimate need for it; or as complex as creating vast information archives, underground storage facilities, genetic repositories, or ultimately self-contained, self-sustaining space stations housing an ‘emergency’ population. Ultimately, simple dispersal, decentralization of critical life-sustaining systems, and general robustness (such as multiple independent colonies spread throughout the solar system) would create the most durable civilization possible, and should be the ultimate long-term goal of anyone concerned with life. In the short term, rational mitigation strategies need to be identified and implemented in response to the threats we face. The Lifeboat Foundation is the only organization with the explicit goal of identifying all existential threats, natural or artificial, that humanity, civilization, and life on Earth face, and of working to mitigate those threats; as such, it is one of the most important organizations anyone could support in their own long-term rational self-interest.
The Lifeboat Foundation has recently sponsored its first conference, organized with the Institute for Ethics and Emerging Technologies, entitled “GLOBAL CATASTROPHIC RISKS: Building a Resilient Civilization”. It will be hosted in Mountain View, California, on November 14th. We are looking to raise $2,500 to support this conference, which already includes a stellar lineup. Consider supporting the Lifeboat Foundation.
Support the Lifeboat Foundation “Global Catastrophic Risks: Building a Resilient Civilization” conference
The Lifeboat Foundation – Safeguarding Humanity
Official Conference Page
My previous post on Existential Threats
www.matus1976.com - Philosophy, Science, Politics, Art, History
www.ergoslope.com - Ergonomic Add On Desktop