Big bang as spontaneous demixing event

From apm
This article is a stub. It needs to be expanded.
This article is speculative. It covers topics that are not straightforwardly derivable from current knowledge. Take it with a grain of salt. See: "exploratory engineering" for what can and cannot be predicted.

All of this might be completely bogus - I strongly advise the reader to treat it merely as food for thought!

The big bang was not a point - clearing up a common misconception

The big bang was not an "explosion" starting at some "point" in a prepared 3D space as often depicted.

It is well known that when we look out into space we see into our past, due to the finite speed of light. The further away the cosmic objects/events we look at are, the longer ago they existed/happened. We can see back only to the time when our universe cooled down enough to become transparent to light. Since the universe became transparent to light pretty much everywhere at the same time, and the speed of light is the same in every direction (isotropic), what we see of the universe is a spherical patch centered on our solar system. The horizon of that patch is known as the "cosmic microwave background". Beyond this horizon we cannot see. What we can see is actually only this "tiny" spherical patch of the universe.

This "tiny" spherical patch of the universe that we can see was indeed once compressed to almost a "point",
but this patch is only a tiny part of "the whole universe", which is an unknown or even unknowable amount bigger.

For the actual "whole universe", including everything outside our horizon (not just the part of the universe that we can see), we do not know of any kind of border. Lack of a border does not mean "infinite" (infinities in physical models usually point to a lack of understanding). The lack of a border more likely means that the concept of space-time loses its meaning. More on that later.

Going backwards in time and compressing the "whole universe" (which may have no "border") to densities where space-time loses its meaning at the ultra-small scale too (Planck-length scale - high-dimensional quantum foam?) does not lead to a point-like big bang in some prepared empty space.

Probably a better picture of the big bang is a full-blown, borderless universe
(not flat - not 4D space-time - more like high-dimensional string theory and beyond)
filled to the brim with "big-bangium". Whatever that is.

Outside the horizon

One could guess that the local vicinity of our "tiny" observable patch most likely looks consistent with what we see inside it. Sharp changes in character just outside our viewing horizon seem very unlikely, but cannot be excluded. E.g. it seems natural to assume continuity of the flat space-time that we observe inside.

But what lies much farther out, beyond our viewing horizon? Can anything still be said about this mystical "place"?

Borderlessness without infinity

Actually, being outside (behind) the backward light cone is pretty much the best isolation from space-time events that one could think of. And quantum mechanics becomes the more active the better one isolates an experiment.

So if one contemplates applying quantum mechanics (in which two contradictory things can be genuinely true at the same time) at universal scales, then one can think of the far outskirts of the universe (far beyond our visibility horizon) as areas where all possible universes "exist", placelessly and timelessly jumbled up into - what shall we call it - the multiverse (?) - the heat death (?) - the (???).

In short, this thought is about something remotely like quantum superposition at the scale of the whole (unobservable) universe, caused by high-quality isolation from the light cone.

Is the universe a closed system?

When one defines the universe as everything one can possibly (theoretically) interact with, then the universe is an isolated system, since impossibility of interaction is equivalent to isolation by definition.

Since the whole universe is expanding faster than the light horizon of the observable universe expands, we are losing stuff to see (and stuff that can be interacted with) all the time. This can hardly be called a closed system. Judging from this, the observable universe is an open system. It is somewhat like "evaporating off" hot stuff, thereby "cooling and crystallizing" the remaining stuff (as with evaporating water, or ions in an optical laser trap). Such a process decreases entropy and would roughly explain why an expanding universe is best for bearing life. (This "evaporating off" is probably a weak analogy though.)
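The "evaporating off" analogy can be caricatured in a few lines of code (a toy sketch, not a cosmological model - the exponential energy distribution and the 10% cut are arbitrary assumptions made only for illustration): repeatedly removing the most energetic particles from a random ensemble lowers the mean energy of what remains, just as evaporation cools a liquid.

```python
# Toy caricature of evaporative cooling (illustrative assumptions only):
# start with random particle energies, repeatedly "evaporate" the hottest
# ~10%, and watch the mean energy of the remainder drop.
import random

random.seed(0)
energies = [random.expovariate(1.0) for _ in range(10_000)]  # arbitrary units

for step in range(3):
    mean = sum(energies) / len(energies)
    print(f"step {step}: {len(energies)} particles, mean energy {mean:.3f}")
    cutoff = sorted(energies)[int(0.9 * len(energies))]  # ~90th percentile
    energies = [e for e in energies if e < cutoff]       # remove hottest ~10%
```

The mean energy falls with every step, which is the (loose) sense in which throwing away the hottest part "cools" what stays behind.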

Gravity and black-hole horizons may play the same "evaporating off" role, just on the other (fragmented) future end of space-time.

When assuming an isolated universe, the second law of thermodynamics implies that, looking back in time, we must see entropy decreasing. This leaves the big bang as the state with the lowest entropy and the highest order. By assuming something "before" the big bang (pulsating universe models) one may just defer the problem of where the low entropy stems from. Instead, let us check whether the big bang could have been a spontaneous demixing event.

(TODO: How does the above go together?)

Demixing by Poincaré recurrence - small and large

Demixing does not contradict the second law of thermodynamics, since that is just a statistical law. Don't worry, no perpetual motion machines upcoming (except maybe the whole multiverse as the only exception). Isolated microscopic systems (e.g. a microscopic gas chamber) that start in a highly ordered state (all gas molecules on one side) will, after a while of mixing (entropy increases to the point where the molecules have no bias towards one side), spontaneously demix (entropy decreases to the point where the molecules are on one side again - arbitrarily close to the initial state). This is the Poincaré recurrence theorem. At the nano-scale this recurrence happens frequently and is e.g. responsible for the thermal jerks that make atoms jump between lattice sites in slow diffusion processes (e.g. in the Earth's mantle movements over geologic times).

In principle there is no size limit to Poincaré demixing recurrence, but with every molecule (or whatever other degree of freedom) added, the recurrence time roughly doubles. At human scales, recurrence times are already many, many orders of magnitude beyond the age of the universe. It is rather rare that e.g. the air molecules around you "decide" to knock you over. But in principle they can. At universal scales (spontaneous generation of the low-entropy state at the big bang) one would expect recurrence times that are greater still.
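The doubling claim can be made concrete with a toy estimate (assuming, purely for illustration, a base timescale of one picosecond and an exact doubling per degree of freedom - both numbers are hypothetical):

```python
# Toy illustration: if each additional degree of freedom roughly doubles
# the Poincaré recurrence time, the waiting time grows as
#   T ≈ tau * 2**N   for N degrees of freedom.
# The base timescale tau = 1 ps is an arbitrary assumption.

def recurrence_time_years(n_dof, tau_seconds=1e-12):
    """Crude estimate: base timescale tau, doubling per degree of freedom."""
    seconds_per_year = 3.15e7
    return tau_seconds * 2.0**n_dof / seconds_per_year

AGE_OF_UNIVERSE_YEARS = 1.4e10

for n in (10, 100, 300):
    t = recurrence_time_years(n)
    print(f"N = {n:4d}: ~{t:.2e} years "
          f"({'<' if t < AGE_OF_UNIVERSE_YEARS else '>'} age of universe)")
```

Already around a hundred degrees of freedom the estimate overshoots the age of the universe, which is why macroscopic demixing is never observed in practice.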

Unless the simulation hypothesis (maybe in unconventional form) applies and all of the universe is the consequence of one (or many extensionally equivalent) very compact programs that require only very shallow dips in entropy.

Lack of information-processing, time-perceiving observers makes time "nonexistent"

So why seriously consider a spontaneous demixing event as the explanation for the big bang when recurrence times are mind-bogglingly über-astronomical?

Let's assume that time that is not experienced by information-processing conscious observers does not "exist". It is not sampled by any information-processing frame-rate.

In any universe nearing complete heat death (maximal entropy), the "arrow of time" is lost. A sample of perfect white noise is indistinguishable from its reverse. In fact, all different possible universes become indistinguishable when nearing heat death.

Without a flow of time and without available thermodynamic free energy there is no support for information-processing entities, and thus there cannot be any time-perceiving consciousness.

So whatever ginormous (directionless) pseudo-timescales there are "before" a big bang - they are shrunk down to zero. This is closely related to the anthropic principle: we can only observe those parts of the universal "evolution" where the laws allow information-processing entities like us to exist. This is also related to what happens before the birth and after the death of information-processing conscious entities like me and you. And why we stay ourselves when we wake up the next day, or after a not too severe concussion. Or do we?


Demixing is extremely costly (every additional demixed degree of freedom halves the "probability of existence"), so it seems more than a bit likely that most life-bearing universes (including ours) spend just the bare minimum amount of demixing such that information-processing conscious entities become possible. Simple universes are strongly preferred. (Related: fine tuning, Occam's razor)

One may imagine plotting a graph with "universe demixing complexity" on the x-axis and "consciously perceived time in these universes" on the y-axis. Integrating/summing over that infinite distribution gives a finite total probability of 1 (normalized - an infinite but convergent sum). (wiki-TODO: maybe add some fantasy graphs)
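Such a normalized convergent sum can be sketched with a toy distribution. The weight p(N) = 2^-N is a hypothetical choice, motivated only by the "halving per demixed degree of freedom" idea above, not something derived in this article:

```python
# Toy distribution (hypothetical, for illustration only): if each extra
# demixed degree of freedom halves the "probability of existence",
# p(N) = 2**(-N), then the infinite sum over all complexities N converges:
#   sum_{N=1..inf} 2**(-N) = 1   (a normalized probability distribution)

def p(n):
    """Probability weight of a universe needing n demixed degrees of freedom."""
    return 2.0**(-n)

partial = sum(p(n) for n in range(1, 60))
print(f"partial sum up to N=59: {partial}")  # approaches 1
```

The point of the sketch is only that an infinite family of ever-more-complex universes can still carry a finite total probability, as the convergent-sum bullet below also demands.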

Our universe loves minimalism. It is low-dimensional in space (3D) => two-eyed life forms. It has a minimal number of stable elements (~100), which is not more than necessary. Judging from that, one might suspect that this distribution has a sharp peak at the bare minimum complexity that is just sufficient for the emergence of "life".

Evade the unavoidable?

  • Is fighting heat death theoretically possible?
    It is known that the order in which quantum measurements are taken can influence the outcome. Is the amount of available demixing of the universe maybe still largely undefined (superposed)? (Probably not - more superposition means more entropy. There should be a clear association of entropy to a quantum system's state - to check.) If so, can quantum measurements fixate a not-yet-defined amount of demixing? If so, can one maximize entropy reduction by choosing an ideal measurement sequence?


  • While there are many more universes with deeper entropy dips (more bits, more combinations), they occur much more rarely. There needs to be some convergent infinite sum.
  • Our universe seems to have a rather minimal entropy dip (3 spatial dimensions; ~100 chemical elements).
  • Maybe "The Unreasonable Effectiveness of Mathematics in the Natural Sciences" is limited to only those "areas" of the multiverse with sufficient dips in entropy? Do the areas where the entropy dips are too small then allow Gödel's incompleteness theorem from math to find a correspondence in physics? Such that (not yet known) physics can be just as fundamental as math? But the farther one goes towards the (from our perspective) three borders of the universe in space-time (large-scale past, large-scale future, small-scale timelessness), the more general the models become. How would this go together?


  • quantum computing
