Disaster proof

From apm
Revision as of 15:40, 18 October 2017 by Apm (Talk | contribs) (Related: added link to yet unwritten page desert scenario)

This article is a stub. It needs to be expanded.
The point at which technology becomes advanced enough to reach a self-stabilizing, self-sustaining state in which it is as stable as, or even more stable than, life on Earth is today. - Image by AlSanya (RetroRobosan)

Means for AP manufacturing have a stabilizing effect on advanced civilization, since using them is akin to growing plants from abundant seeds. Reaching this level of capability means reaching something like a "game checkpoint": even if there is no central power source and no infrastructure, basic necessities as well as all sorts of luxury goods can still be made.

In 2013 there was (and still is) a situation in which the electric power supply was highly centralized. Had a catastrophe of some sort stopped all systems from working at once, getting everything up and running again would have been very difficult, since the systems depended on one another mutually. One could say the whole electric system is in a dynamic state of being "alive", always threatening to "die".

APM technology, in contrast, can lie dormant for eons without losing functionality.
The main threatening factor for the continued availability of AP systems (not the size of the human population - that is an entirely different matter) is destruction from within due to a badly over-centralized software architecture. Really bad regressions seem plausible, but total annihilation is rather unlikely.

Warning! You are moving into more speculative areas.
Further remaining threats reside in the far future and are rather exotic, like:

  • global radiation exposure from gamma-ray bursts
  • planetary collisions
  • ? (insert your favourite SciFi doomsday here)

Warning! You are moving into more speculative areas.
There is a fundamentally unavoidable risk, for any information-processing existences, of walking into a long-winded dead end in software development, where it is unclear how far "civilization" needs to trace back before continuing onwards becomes possible. This is strongly tied to:

  • Gödel's incompleteness theorems, or equivalently
  • the halting problem(s), or equivalently
  • Chaitin's construction, or equivalently
  • the presence of a transfinite number of axioms that are true for no reason
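The equivalence of these limits can be made concrete with the classic diagonalization argument behind the halting problem: no program can decide, for every program, whether it halts. A minimal sketch in Python (all names here are illustrative, not from any library) builds, from any claimed halting decider, a program that defeats that decider:

```python
def make_adversary(halts):
    """Given any claimed halting decider, build a program that defeats it."""
    def adversary():
        if halts(adversary):
            while True:  # the decider said we halt, so loop forever
                pass
        # the decider said we loop forever, so just stop
    return adversary

# A (necessarily wrong) candidate decider claiming every program runs forever:
def claims_never_halts(program):
    return False

adv = make_adversary(claims_never_halts)
adv()  # halts immediately, refuting the decider's own prediction
```

Whatever answer a candidate decider gives about its own adversary, the adversary does the opposite, so no such decider can be correct on all inputs. This is the sense in which no amount of tooling can guarantee that a chosen software-development path is not a dead end.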

(TODO: Discuss the importance of avoiding the loss of a bootstrapping path: documentation of the historically followed attainment path and identification of possible shortcuts (difficult), such that bootstrapping could be repeated in the quickest possible way.)

Related