Data decompression chain

This article is a stub. It needs to be expanded.

This article defines a novel term (that is hopefully sensibly chosen). The term is introduced to make a concept more concrete and to clarify its interrelationship with other topics related to atomically precise manufacturing. For details go to the page: Neologism.

The data decompression chain is the sequence of expansion steps from

  • very compact, highest-level abstract blueprints of technical systems to
  • discrete and simple lowest-level instances that are much larger in data size.

3D modelling

(TODO: add details)

  • high-level language 1: functional, logical, with a connection to a computer algebra system
  • high-level language 2: imperative, functional
  • constructive solid geometry graph (CSG graph), parametric surfaces
  • quadric nets (C1 continuity)
  • triangle meshes (C0 continuity)
  • toolpaths
  • primitive signals: step signals, rail-switch states, clutch states, ...
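
The following minimal Python sketch (all stage names and expansion factors are invented for illustration and are not from the source) shows the defining property of such a chain: every stage turns a few high-level items into many more, but simpler, lower-level ones.

    # Illustrative only: each stage expands few high-level items into many lower-level ones.
    def csg_to_triangles(primitive):
        """Tessellate one CSG primitive into a (fake) triangle mesh."""
        return [("triangle", primitive, i) for i in range(100)]

    def triangle_to_toolpaths(triangle):
        """Derive the tool motions needed to realize one triangle."""
        return [("toolpath_segment", triangle, j) for j in range(10)]

    def toolpath_to_signals(segment):
        """Break one toolpath segment into primitive step signals."""
        return [("step_signal", k) for k in range(50)]

    blueprint = ["cube", "sphere"]  # compact high-level model: 2 items
    triangles = [t for p in blueprint for t in csg_to_triangles(p)]
    segments  = [s for t in triangles for s in triangle_to_toolpaths(t)]
    signals   = [x for s in segments for x in toolpath_to_signals(s)]

    print(len(blueprint), len(triangles), len(segments), len(signals))
    # 2 -> 200 -> 2000 -> 100000: the data volume grows at every decompression step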

Targets

  • physical object
  • virtual simulation

CSG Scenegraphs & functional programming

Modelling static 3D geometry is purely declarative: the model describes what the shape is, not the steps to construct it.

  • example: OpenSCAD
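
As a minimal sketch (plain Python data chosen here rather than actual OpenSCAD syntax, with arbitrarily picked shapes), a CSG scene can be written down as nothing but a nested expression; printing it does not execute any construction steps.

    # Illustrative sketch: a CSG scene as a purely declarative expression tree,
    # in the spirit of OpenSCAD's difference()/union() (this is not OpenSCAD syntax).
    scene = ("difference",
             ("cube",   {"size": 10.0}),
             ("sphere", {"r": 6.0}))

    def describe(node, depth=0):
        """Walk the CSG graph and print it; the tree only states what the shape is."""
        op, *children = node
        print("  " * depth + op)
        for child in children:
            if isinstance(child, tuple):
                describe(child, depth + 1)

    describe(scene)  # prints: difference / cube / sphere

A downstream decompression stage (a mesher, a toolpath generator) is then free to expand the same tree for either of the targets listed above.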

...

Similar situations in today's computer architectures

  • high-level language ->
  • compiler infrastructure (e.g. LLVM) ->
  • assembly language ->
  • actual actions of the target data processing machine
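
As a small concrete illustration chosen here (not from the source), one such lowering step can be observed directly with Python's standard dis module: a one-line high-level expression decompresses into a longer sequence of simpler instructions that the machine (here the interpreter) actually executes.

    import dis

    # A short high-level function ...
    def polynomial(x):
        return 3 * x**2 + 2 * x + 1

    # ... expands into many simpler, lower-level bytecode instructions.
    dis.dis(polynomial)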

Bootstrapping of the decompression chain

One of the many flawed critique points of APM is the claim that all the necessary data cannot be fed to the mechanosynthesis cores and crystolecule assembly robotics (the former are mostly hard-coded and do not need much data anyway).

For example, the size comparison in E. Drexler's TEDx talk (2015) at 13:35 can, if taken too literally, lead to the misjudgment that there is a fundamentally insurmountable data bottleneck. Of course, feeding yottabits per second over those few pins would be ridiculous, but that is not what is planned.

We already know how to avoid such a bottleneck. Although we program computers with our fingers, delivering just a few bits per second, computers now process petabits per second internally.

The goal is reachable by gradually building up a hierarchy of decompression steps. The lowest-level, highest-volume data is generated internally and locally, very near to where it is finally "consumed".
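
A rough sketch of this principle in Python (the command name and expansion factors are made up for illustration): only a tiny compact command crosses the external interface, while the huge stream of primitive signals is produced by local decompression steps right where it is consumed.

    # Illustrative only: a few bytes in, a million locally generated signals out.
    def expand_command(command):          # step 1: one command -> many placement operations
        for i in range(1_000):
            yield ("place_block", command, i)

    def expand_operation(operation):      # step 2: one operation -> many primitive step signals
        for j in range(1_000):
            yield ("step_signal", operation[2], j)

    external_input = "build_gear_v1"      # the only data fed in "over the pins"

    consumed = 0
    for operation in expand_command(external_input):
        for signal in expand_operation(operation):
            consumed += 1                 # signals are consumed here, never transmitted outward

    print(f"bytes fed in: {len(external_input)}, signals generated locally: {consumed:,}")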

Related

  • Control hierarchy
  • merging of GUI-IDE & code-IDE
  • The reverse: while decompressing is a technique, compressing is an art (a vague analogy to differentiation and integration).
    See: the source of new axioms. Warning! You are moving into more speculative areas.
  • In the case of food synthesis, the vastly different decompression chains of biological systems and advanced diamondoid nanofactories mean that nanofactories cannot synthesize exact copies of food down to the placement of every atom. See Food structure irrelevancy gap for a viable alternative.
  • constructive corecursion