Disaster proof

{{Stub}}
[[File:Sonic checkpoint reached doodle by AlSanya-d5m90pr.png|400px|thumb|right|The point when technology becomes advanced enough to reach a self-stabilizing, self-sustaining state in which it becomes as stable as, or even more stable than, life on Earth is today. - Image by AlSanya (RetroRobosan)]]
  
 
Means for AP manufacturing have a stabilizing effect on advanced civilization, since using them is akin to growing plants from abundant seeds.
Reaching this level of capability means reaching something like a "game checkpoint".
Even if there is no central power source and no infrastructure, basic necessities as well as all sorts of luxury goods can still be made.

In 2013 the electric power supply was (and still is) highly centralized.
If a catastrophe of some sort were to stop all systems from working at once, getting everything up and running again would be a great problem, since the systems depend on each other mutually.
One could say the whole electric system is in a dynamic state of being "alive", always threatening to "die".
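The restart problem can be sketched as a dependency graph (a minimal illustrative example added here; the subsystem names and the <code>restart_order</code> helper are hypothetical, not from this wiki). A cycle of mutual dependencies means no restart order exists until some independently bootable "seed" breaks the cycle:

<syntaxhighlight lang="python">
from graphlib import TopologicalSorter, CycleError

# Hypothetical mutual dependencies in a fully blacked-out grid:
# each subsystem can only restart once all its dependencies are running.
dependencies = {
    "power_plant":  {"grid_control"},  # plants need grid control to resynchronize
    "grid_control": {"power_plant"},   # control centers need electric power
    "fuel_supply":  {"power_plant"},   # pumps and logistics need electric power
}

def restart_order(deps):
    """Return a feasible restart order, or None if mutual dependencies block one."""
    try:
        return list(TopologicalSorter(deps).static_order())
    except CycleError:
        return None

print(restart_order(dependencies))  # None - the "alive" system cannot self-restart

# An independently bootable seed (e.g. a black-start generator) breaks the cycle:
dependencies["power_plant"] = {"black_start_unit"}
dependencies["black_start_unit"] = set()
print(restart_order(dependencies))  # ['black_start_unit', 'power_plant', ...]
</syntaxhighlight>

In this picture, abundant AP manufacturing devices act as exactly such independently bootable seeds, which is why the fallback baseline rises.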
Putting this into perspective a bit: with advanced APM technology a new dynamically "alive" global system on a higher level will very likely emerge,
but the fallback baseline will be much higher, averting humanitarian disaster in case of a system breakdown.
There would be a sharp drop in the standard of living, though.
  
 
APM technology, in contrast, can lie dormant for eons without losing functionality. <br>
The main threatening factor for the continued availability of AP systems (not the [[human overpopulation|size of the human population]] - this is an entirely different matter) is '''destruction from within''' due to bad, overly centralized [[general software issues|software architecture]].
Really bad regressions seem plausible, but total annihilation seems rather unlikely.
{{speculativity warning}}<br>
Further remaining threats reside in the far future and are rather exotic, like:
* global radiation exposure from [https://en.wikipedia.org/wiki/Gamma_burst gamma bursts]
* planetary collisions
* ? (insert your favourite SciFi doomsday here)
  
{{speculativity warning}}<br>
There is a fundamentally unavoidable risk, for any information-processing existence, of walking into a long-winded dead end in software development,
where it is unclear how far "civilization" needs to trace back before continuing onwards becomes possible.
This is strongly tied to (see the sketch after this list):
* Gödel's incompleteness theorems, or equivalently
* the halting problem(s), or equivalently
* Chaitin's construction, or equivalently
* the presence of a transfinite number of axioms that are true for no reason
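As a minimal added illustration of the halting-problem facet (the <code>halts</code> and <code>diagonal</code> names are hypothetical, and no such <code>halts</code> can actually be implemented):

<syntaxhighlight lang="python">
def halts(program, argument):
    """Assumed total decision procedure: True iff program(argument) halts.
    The diagonal construction below shows that it cannot exist."""
    raise NotImplementedError("provably impossible in general")

def diagonal(program):
    # Do the opposite of whatever `halts` predicts for program run on itself.
    if halts(program, program):
        while True:  # predicted to halt -> loop forever instead
            pass
    else:
        return       # predicted to loop forever -> halt immediately

# halts(diagonal, diagonal) can be neither True nor False without contradiction,
# so no general procedure can tell dead-end developments from viable ones.
</syntaxhighlight>

Any of the listed items can stand in for the others here; the halting problem is just the most directly programmable face of the same limit.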
{{todo|Discuss the importance of avoiding the loss of a bootstrapping path. Documentation of the historically followed attainment path and identification of possible shortcuts (difficult) such that bootstrapping could be repeated in the quickest possible way.}}

== Related ==

* [[Desert scenario]]
* pros and cons of extraordinary corrosion resistance - [[Recycling]]
* [[Ultra long term technology stability]] – [[Gem-gum rainforest world]]

----

'''If you are instead looking for a backup of this wiki, see page: [[Support]]'''
  
 
[[Category:Technology level III]]
[[Category:Information]]
[[Category:Philosophy]]
