Finite histories. Let |x| denote the number of symbols in string x. Let the partial history S^k_{i,j} denote the substring between the i-th and the j-th symbol of E_k, j > i. S^k_{i,j} is regular (or compressible, or non-random) if the shortest program that computes S^k_{i,j} (and nothing else) and halts consists of fewer than |S^k_{i,j}| symbols. Otherwise S^k_{i,j} is irregular (incompressible, random).
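The shortest-program length (Kolmogorov complexity) is uncomputable, but an off-the-shelf compressor gives a crude upper-bound proxy for it. The following sketch (string lengths and the choice of zlib are illustrative, not from the text) contrasts a regular string, generated by a tiny program, with a uniformly random one:

```python
import random
import zlib

random.seed(0)
n = 10000

# Regular history: a two-symbol pattern repeated; a tiny program generates it.
regular = b"ab" * (n // 2)

# Irregular history: uniform random bytes; almost certainly incompressible.
irregular = random.randbytes(n)

# zlib's output size only upper-bounds the shortest description,
# but the contrast is already dramatic.
print(len(zlib.compress(regular)))    # far below n
print(len(zlib.compress(irregular)))  # about n: no shorter description found
```

The compressor finds a description far shorter than |x| for the regular string, and essentially none for the random one.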
Infinite histories. Similarly, if some universe's evolution is infinite, then it is compressible if it can be computed by a finite algorithm.
Most universes are irregular. The evolutions of almost all universes are incompressible. There are 3^n strings of size n, but fewer than (1/3)^c * 3^n << 3^n algorithms consisting of fewer than n-c symbols (c is a positive integer). And for the infinite case, we observe: the number of infinite symbol strings is uncountable. Only a negligible fraction (namely, countably many of them) can be computed by finite programs.
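The counting argument can be checked directly: over a 3-symbol alphabet, summing all candidate programs shorter than n-c symbols shows they cover at most a (1/3)^c fraction of the length-n strings. A small sketch (the values of n and c are arbitrary choices for illustration):

```python
# Counting argument over a 3-symbol alphabet.
n, c = 20, 5

strings = 3 ** n  # all strings of length n

# All strings of length 0 .. n-c-1, i.e. every candidate program
# consisting of fewer than n-c symbols. Each computes at most one string.
short_programs = sum(3 ** k for k in range(n - c))

fraction = short_programs / strings
print(fraction)       # the compressible fraction
print((1 / 3) ** c)   # the bound from the text
```

Since each halting program computes at most one string, at most this vanishing fraction of length-n strings can be compressed by c or more symbols.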
The few regular universes. There are a few compressible universes which can be computed by very short algorithms, though. For instance, suppose that some U_k evolves according to physical laws that tell us how to compute next states from previous states. All we need to compute U_k's evolution is U_k^1 and the algorithm that computes U_k^{i+1} from U_k^i (i = 1, 2, ...).
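As a toy illustration (the rule, width, and step count are arbitrary choices, not from the text), a one-dimensional cellular automaton plays the role of U_k: its entire history follows from a short initial state U_k^1 plus a fixed transition rule, so the history is compressible no matter how long it grows.

```python
# A toy "universe": elementary cellular automaton rule 110 on a ring.
RULE = 110
WIDTH = 32

def step(state):
    """Compute U^{i+1} from U^i: each cell reads its 3-cell neighborhood."""
    return tuple(
        (RULE >> (state[(i - 1) % WIDTH] * 4
                  + state[i] * 2
                  + state[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    )

# U^1: a single live cell -- a very short description.
state = tuple(1 if i == WIDTH // 2 else 0 for i in range(WIDTH))

history = [state]
for _ in range(10):
    state = step(state)
    history.append(state)

# The whole history is determined by (RULE, U^1): compressible.
print(len(history), "states, each of width", WIDTH)
```

However long we run it, the pair (rule, initial state) remains a constant-size description of the entire evolution.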
Noise? Apparently, we live in one of the few highly regular universes. Each electron appears to behave the same way. Dropped slices of buttered bread regularly hit the floor, not the ceiling. There appear to be deviations from regularity, however, embodied by what we call noise. Although certain macroscopic properties (such as pressure in a gas container) are predictable by physicists, microscopic properties (such as precise particle positions) seem subject to noisy fluctuations. Noise represents additional information absent in the original physical laws. Uniform noise is incompressible -- there is no short algorithm that computes it and nothing else.
Noise does not necessarily prevent compressibility. The laws physicists currently use to model our own universe do model noise. Based on Schrödinger's equation, they are only conditional probability distributions on possible next states, given previous states. The evolution of Schrödinger's wave function (WF) itself can be computed by a very compact algorithm (given the quantizability assumptions in the first paragraph of this paper) -- WF is just a short formula. Whenever WF collapses in a particular way, however, the resulting actual state represents additional information (noise) not conveyed by the algorithm describing the initial state (big bang) and WF. Still, since the noise obviously is non-uniform (due to the nature of the physical laws and WF), our universe's evolution so far is greatly compressible. How? There is a comparatively short algorithm that simply codes probable next states with few bits and unlikely next states with many bits, as suggested by standard information theory [#!Shannon:48!#].
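The coding idea above can be made concrete with Shannon's formula: under an ideal code, a next state of probability p costs -log2(p) bits, so a strongly non-uniform distribution over collapse outcomes costs far fewer bits on average than uniform coding. The distribution below is entirely hypothetical, chosen only to show the effect:

```python
import math

# Hypothetical non-uniform distribution over four possible next states
# (the probabilities are made up for illustration).
probs = {"s1": 0.90, "s2": 0.06, "s3": 0.03, "s4": 0.01}

# Ideal code length for a state of probability p is -log2(p) bits:
# probable states get few bits, unlikely states get many.
expected_bits = sum(p * -math.log2(p) for p in probs.values())

# A uniform code over 4 states would always spend 2 bits per state.
print(round(expected_bits, 3), "bits per state, versus 2.0 uniform")
```

The expected code length (the entropy) stays well below 2 bits per state, which is exactly why non-uniform noise leaves the history greatly compressible.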
More regularity than we think? The longer the shortest program computing a given universe, the more random it is. To certain observers, certain universes appear partly random although they aren't. There may be at least two reasons for this:
1. Shortest algorithm cannot be found. It can be shown that there is no algorithm that can generate the shortest program for computing arbitrary given data on a given computer [#!Kolmogorov:65!#,#!Solomonoff:64!#,#!Chaitin:87!#]. In particular, our physicists cannot expect to find the most compact description of our universe.
2. Additional problems of the Heisenberg type. Heisenberg tells us that we cannot even observe the precise, current state of a single electron, let alone our universe. In our particular universe, our actions seem to influence our measurements in a fundamentally unpredictable way. This does not mean that there is no predictable underlying computational process (whose precise results we cannot access). In fact, rules that hold for observers who are part of a given universe and evolved according to its laws need not hold outside of it. There is no reason to believe that the Great Programmer cannot dump a universe and examine its precise state at any given time, just because the creatures that evolved in it cannot because their measurements modify their world.
How much true randomness? Is there ``true'' randomness in our universe, in addition to the simple physical laws? True randomness essentially means that there is no short algorithm computing ``the precise collapse of the wave function'' and whatever else is perceived as noise by today's physicists. In fact, if our universe were infinite and contained true randomness, then it could not be computed by a finite algorithm that computes nothing else. Our fundamental inability to perceive our universe's state does not imply its true randomness, though. For instance, there may be a very short algorithm computing the positions of electrons lightyears apart in a way that seems like noise to us but actually is highly regular.