# Relations of APM to purely functional programming

As described elsewhere, atomically precise manufacturing (especially in its more advanced forms) will make our physical reality as programmable as the software we have on our computers today. APM in particular has several overlaps with a special kind of programming called purely functional programming (PFP). There are at least three main classes of programming:

• (impure) imperative programming
• (purely) functional programming
• (pure) logic programming

Of these three, PFP is of particular importance for the development and evolution of APM, as the following sections lay out.

Note that being functional is neither a necessary nor a sufficient condition for purity. Examples:

• Microsoft Excel ... is pure but not functional (as is OpenSCAD, less known but more relevant for APM)
• Lisp ... is functional but not pure
• Haskell ... is both pure and functional
• C++ ... is neither pure nor functional (at least not functional in any practically usable sense)

For more details see the main page about purely functional programming.
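The distinction between purity and functional style can be made concrete with a minimal sketch, here in Python for illustration (the function names are made up for this example); a purely functional language would enforce the first style and forbid the second:

```python
# A pure function: its result depends only on its arguments,
# and calling it has no observable side effects.
def scale_pure(lengths, factor):
    return tuple(x * factor for x in lengths)

# An impure variant: it overwrites its input in place,
# a "destructive update" that erases the previous state.
def scale_impure(lengths, factor):
    for i in range(len(lengths)):
        lengths[i] *= factor
    return lengths

original = (1, 2, 3)
scaled = scale_pure(original, 10)

# With the pure version the original value still exists untouched;
# old and new state coexist instead of one overwriting the other.
assert original == (1, 2, 3)
assert scaled == (10, 20, 30)
```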

## Purity from immutability of atoms

Physical things (referring to objects made of atoms here) are immutable. They can't be "deleted" (or "overwritten") like data.
(Excluding exotic and impractical means like annihilation with antimatter. See: Making things from nothing.) They can only be reordered and assembled in a different way. Immutability (the absence of destructive overwriting updates) implies purity. So when modelling the flows of atomic matter inside advanced productive nanosystems,
using purely functional programming (PFP) is the natural way to make the software models match the hardware reality.
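This modelling style can be sketched as follows (Python for illustration; the part names and the multiset-of-atoms representation are illustrative assumptions, not an established model): an "assembly" is a pure function that returns a new arrangement, the inputs survive unchanged, and atom counts are conserved throughout.

```python
from collections import Counter

# A physical part is modelled as an immutable arrangement:
# a name plus the multiset of atoms it is built from.
def make_part(name, atoms):
    return (name, Counter(atoms))

def atom_count(part):
    return part[1]

# Assembly is a pure function: it takes two parts and returns
# a NEW part. Nothing is overwritten, and the total multiset
# of atoms is conserved -- matching physical reality.
def assemble(name, part_a, part_b):
    return (name, atom_count(part_a) + atom_count(part_b))

block = make_part("block", {"C": 8})
handle = make_part("handle", {"C": 4, "H": 6})
tool = assemble("tool", block, handle)

# Conservation of atoms: nothing created, nothing destroyed.
assert atom_count(tool) == Counter({"C": 12, "H": 6})
# The input parts still exist unchanged -- no destructive update.
assert atom_count(block) == Counter({"C": 8})
```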

## Purity from nanoscale reversibility

The deepest relationship between APM and PFP (purely functional programming) is that the basic laws of physics all feature reversibility due to symmetry (Noether's theorem). And the ones that are of practical relevance for the operation of productive nanosystems feature reversibility in time alone. (Side-note: The weak nuclear force requires more than time reversal alone to show reversibility, but this is of no obvious practical relevance for advanced productive nanosystems.)

So on the smallest scales not only can objects made of atoms not be "deleted" or "overwritten", but information itself cannot be "destroyed". It can only be reversibly transformed.

Irreversibility only becomes possible once systems get bigger (more complex). Only then does one get mass transformations that constitute a "dispersion" / "thermalization" of information, which is effectively what we call "destruction" / "irreversible deletion" of information. Irreversible because reversal lies far beyond our capabilities (and for the largest part will forever remain so). (Side-note: "forever" is a strong word. This is one of the rare occurrences where its usage seems justified.)

So PFP is not only the natural choice for modelling the flow of atomic (and thus immutable) matter at all scales, but also
the natural choice for modelling the flow of information at the smallest scales.
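What "reversibly transformed" means can be illustrated with the classic CNOT (controlled-NOT) gate, sketched here in Python: it is a bijection on bit pairs, so the input can always be recovered from the output; in fact applying it twice is the identity, and no information is ever lost.

```python
# CNOT: flips the target bit iff the control bit is 1.
# As a bijection on bit pairs it is reversible -- the output
# uniquely determines the input.
def cnot(control, target):
    return control, target ^ control

# CNOT is its own inverse: applying it twice is the identity,
# for every possible input.
for c in (0, 1):
    for t in (0, 1):
        assert cnot(*cnot(c, t)) == (c, t)
```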

## Purity from the need for efficiency in computation and matter manipulation

High efficiency requires near-reversible operation (including computing), and reversibility implies purity. That holds both for computation and for the manipulation of matter (e.g. advanced mechanosynthesis). Both provide strong motivation to scale the nanoscale reversible "bubbles" far up into the microscale (and eventually even the macroscale).

So PFP is not only the natural choice for modelling the flow of information at the smallest scales but
PFP is also the natural choice for modelling the flow of information at much larger scales.

### Purity from reversible computation

See main articles: Reversible data processing, Reversible computation
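As a small taste of reversible logic: the Toffoli (controlled-controlled-NOT) gate, sketched here in Python, is a reversible bijection on bit triples that can emulate an irreversible AND gate by carrying its inputs along in the output instead of discarding them.

```python
# Toffoli gate: flips the third bit iff both control bits are 1.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# Reversibility: the gate is its own inverse on all 8 inputs.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# Embedding irreversible AND: with the third input fixed to 0,
# the third output bit equals a AND b -- while the inputs a and b
# are preserved in the output rather than "thermalized" away.
assert all(toffoli(a, b, 0)[2] == (a & b) for a in (0, 1) for b in (0, 1))
```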

### Purity from reversible actuation

See main article: Reversible actuation

To make a nanofactory run, one obviously (from common experience, and from the most fundamental physical knowledge, the laws of thermodynamics) has to dissipate/thermalize some minimum amount of energy per time (power) or per operation. If you go to the absolute limits of physically possible efficiency (current 2019 non-atomically-precise technology is still far from that), then the last remaining factor limiting the reachable efficiency levels is the need to prevent the nanofactory's inner workings from moving backwards. You have to dissipate / thermalize just enough energy to sufficiently reliably prevent that. For one possible approach of how to do so see: Dissipation sharing.
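The thermodynamic floor on dissipation per operation is the Landauer limit: irreversibly erasing one bit at temperature T costs at least k_B · T · ln 2 of dissipated energy, while reversible operations have no such per-operation floor. A quick back-of-the-envelope calculation at room temperature (the bit rate of 10^18 erasures per second is an arbitrary illustrative figure):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, SI 2019)
T = 300.0           # room temperature in K

# Landauer limit: minimum dissipation per irreversibly erased bit.
E_bit = k_B * T * math.log(2)  # about 2.87e-21 J

# Even erasing 10^18 bits per second would, at this limit,
# dissipate only on the order of milliwatts ...
P = E_bit * 1e18  # about 2.9e-3 W

# ... while reversible operations can in principle dissipate
# arbitrarily little. This is why keeping a nanofactory's inner
# workings (nearly) reversible pays off at the physical limits.
```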

(Side-note: From a philosophical perspective it is actually about having an arrow of time in a universe, the nanofactory being our universe of concern here.)

Maybe the manipulation of matter can be seen as a form of computation too (especially in case it is done in a reversible fashion).

## Purity as natural property of CSG CAD scene-graphs

• CSG ... constructive solid geometry
• CAD ... computer aided design

Scene-graphs describe how things are, instead of giving step-by-step (imperative) instruction recipes for changing a growing (hidden) state until one arrives at the final result.
So PFP is the natural choice for a coding language for 3D modelling (especially of the CSG CAD type).
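A minimal CSG scene-graph can be sketched as follows (Python for illustration; representing shapes as pure point-membership predicates is one simple choice among several): the scene is an immutable description of what the solid is, and evaluating it is a pure function of the query point, with no hidden state being mutated.

```python
# Primitives and combinators build an immutable scene description.
# Each shape is just a pure predicate: "is this point inside?"
def sphere(cx, cy, cz, r):
    return lambda x, y, z: (x - cx)**2 + (y - cy)**2 + (z - cz)**2 <= r**2

def union(a, b):
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def difference(a, b):
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# "A ball with a spherical bite taken out" -- this states WHAT
# the solid is, not a step-by-step recipe mutating hidden state.
scene = difference(sphere(0, 0, 0, 2), sphere(2, 0, 0, 1))

assert scene(0, 0, 0)        # inside the big sphere
assert not scene(1.8, 0, 0)  # inside the bite, so removed
assert not scene(5, 0, 0)    # outside everything
```

This mirrors how an OpenSCAD `difference()` of two spheres describes the result declaratively rather than imperatively.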

Giving up on purity by changing to a non-pure language is a downgrade to a degree that cannot be overstated. A mistake with negative and long-lasting symptoms. One can expect an explosion of complexity that is not inherent to the problem.

## Concurrency

There is hardly an application imaginable where more concurrency will be necessary than in advanced productive nanosystems. Concurrency is the one area where PFP shines most. Concurrency is PFP's killer app. Pure code poses no barrier to concurrency, parallelism, or GPU programming.
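Why pure code parallelizes so easily can be sketched in a few lines (Python and the Collatz step count are just illustrative choices): a pure function touches no shared mutable state, so it can be mapped over inputs in any order, on any number of workers, with no locks and no race conditions, and the result is guaranteed to match the sequential one.

```python
from concurrent.futures import ThreadPoolExecutor

# A pure function: result depends only on its argument, and
# evaluating it has no side effects on any shared state.
def collatz_steps(n):
    return 0 if n == 1 else 1 + collatz_steps(3 * n + 1 if n % 2 else n // 2)

inputs = list(range(1, 1000))

# Because collatz_steps is pure, distributing the calls over a
# worker pool needs no locks and cannot introduce races.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(collatz_steps, inputs))

# Purity guarantees: parallel result == sequential result.
assert parallel == [collatz_steps(n) for n in inputs]
```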

• TODO ...