There is an entire spectrum of ways of ordering the describable things, spanned by two extreme ways of doing it. Sections 2-5 analyzed one of the extremes, based on minimal constructive description size on generalized Turing Machines more expressive than those considered in previous work on Kolmogorov complexity and algorithmic probability and inductive inference. Section 6 discussed the other extreme based on the fastest way of computing all computable things.
Between the two extremes we find methods for ordering describable things by (a) their minimal nonhalting enumerable descriptions (also discussed in Sections 2-5), (b) their minimal halting or monotonic descriptions (this is the traditional theory of Kolmogorov complexity or algorithmic information), and (c) the polynomial-time complexity-oriented criteria that are the subject of most work in theoretical computer science. Theorems in Sections 2-6 reveal some of the structure of the computable and enumerable and constructively describable things.
Both extremes of the spectrum as well as some of the intermediate points yield natural prior distributions on describable objects. The approximable and cumulatively enumerable description size-based priors (Sections 4-5) suggest algorithmic theories of everything (TOEs) partially justifying Occam's razor in a way more general than previous approaches: given several explanations of your universe, those requiring few bits of information are much more probable than those requiring many bits (Section 7). However, there may not be an effective procedure for discovering a compact and complete explanation even if there is one.
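The Occam-style weighting in this paragraph can be made concrete in standard algorithmic-probability notation. The following is a sketch using the conventional description-length form, not the paper's generalized enumerable or cumulatively enumerable variants: writing $K(x)$ for the minimal description size of an explanation $x$,

\[
P(x) \;\approx\; 2^{-K(x)}, \qquad \frac{P(x_1)}{P(x_2)} \;\approx\; 2^{\,K(x_2)-K(x_1)},
\]

so an explanation requiring $k$ fewer bits than a rival is roughly $2^k$ times more probable under such a prior.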
The resource-optimal, less dominant, yet arguably more plausible extreme (Section 6) leads to an algorithmic TOE without excessive temporal complexity: no calculation of any universe computable in countable time needs to suffer from an essential slow-down due to simultaneous computation of all the others. Based on the rather weak assumption that the world's creator is constrained by certain limits of computability, and considering that all of us may be just accidental by-products of His optimally efficient search for a solution to some computational problem, the resulting ``speed prior'' predicts that a fast and short algorithm is responsible not only for the apparently simple laws of physics but even for what most physicists currently classify as noise or randomness (Section 7). It may not be all that hard to find; we should search for it.
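The "no essential slow-down" claim rests on Section 6's phase-based scheme for computing everything at once: in phase $i$, each program $p$ of length $l(p) \leq i$ receives $2^{i-l(p)}$ steps. The toy Python dovetailer below is an illustrative sketch under simplifying assumptions (programs are abstracted as symbol iterators; the names `fast`, `progs` are hypothetical), not the paper's exact construction:

```python
import itertools

def fast(programs, phases):
    """FAST-style dovetailing over a set of toy programs.

    programs: dict name -> (length_in_bits, iterator yielding output symbols).
    In phase i (i = 1..phases), every program p with l(p) <= i is granted
    2**(i - l(p)) steps; each step emits one output symbol.  Within one phase
    the total work is at most 2**i * sum_p 2**(-l(p)) <= 2**i steps (Kraft's
    inequality), so each program retains a constant fraction 2**(-l(p)) of
    the overall compute: no essential slow-down for any single program.
    """
    out = {name: [] for name in programs}
    iters = {name: it for name, (_, it) in programs.items()}
    for i in range(1, phases + 1):
        for name, (length, _) in programs.items():
            if length > i:
                continue  # programs longer than i bits wait for later phases
            for _ in range(2 ** (i - length)):
                try:
                    out[name].append(next(iters[name]))
                except StopIteration:
                    break  # this toy program has halted
    return out

# Two toy "programs": an infinite stream of 1s (1 bit) and of 0s (3 bits).
progs = {"short": (1, itertools.repeat("1")),
         "long":  (3, itertools.repeat("0"))}
result = fast(progs, phases=4)
# "short" ran for 1 + 2 + 4 + 8 = 15 steps; "long" for 1 + 2 = 3 steps.
```

Note how the shorter program always receives a fixed, larger share of each phase's budget, which is the mechanism behind the speed prior's bias toward fast and short algorithms.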
Much of this paper highlights differences between countable and uncountable sets. It is argued (Sections 6, 7) that things such as uncountable time and space and incomputable probabilities actually should not play a role in explaining the world, for lack of evidence that they are really necessary. Some may feel tempted to counter this line of reasoning by pointing out that for centuries physicists have calculated with continua of real numbers, most of them incomputable. Even quantum physicists who are ready to give up the assumption of a continuous universe usually do take for granted the existence of continuous probability distributions on their discrete universes, and Stephen Hawking explicitly said: ``Although there have been suggestions that space-time may have a discrete structure I see no reason to abandon the continuum theories that have been so successful.'' Note, however, that all physicists in fact have only manipulated discrete symbols, thus generating finite, describable proofs of their results derived from enumerable axioms. That real numbers really exist in a way transcending the finite symbol strings used by everybody may be a figment of imagination -- compare Brouwer's constructive mathematics [#!Brouwer:07!#,#!Beeson:85!#] and the Löwenheim-Skolem Theorem [#!Loewenheim:15!#,#!Skolem:19!#], which implies that any first-order theory with an uncountable model such as the real numbers also has a countable model. As Kronecker put it: ``Die ganze Zahl schuf der liebe Gott, alles Übrige ist Menschenwerk'' (``God created the integers, all else is the work of man'' [#!Cajori:19!#]). Kronecker greeted Cantor's celebrated insight [#!Cantor:1874!#] about real numbers with scepticism, for he believed such mathematical objects did not even exist.
A good reason to study algorithmic, noncontinuous, discrete TOEs is that they are the simplest ones compatible with everything we know, in the sense that universes that cannot even be described formally are obviously less simple than others. In particular, the speed prior-based algorithmic TOE (Sections 6, 7) neither requires an uncountable ensemble of universes (not even describable in the sense of Def. 6.1), nor infinitely many bits to specify nondescribable real-valued probabilities or nondescribable infinite random sequences. One may believe in the validity of algorithmic TOEs until (a) there is evidence against them, e.g., someone shows that our own universe is not formally describable and would not be possible without, say, existence of incomputable numbers, or (b) someone comes up with an even simpler explanation of everything. But what could that possibly be?
Philosophers tend to create theories inspired by recent scientific developments. For instance, Heisenberg's uncertainty principle and Gödel's incompleteness theorem greatly influenced modern philosophy. Are algorithmic TOEs and the ``Great Programmer Religion'' [#!Schmidhuber:97brauer!#] just another reaction to recent developments, in hindsight an obvious by-product of the advent of good virtual reality? Will they soon become obsolete, as so many previous philosophies have? We find it hard to imagine so, even without the boost to be expected for algorithmic TOEs in case someone should indeed discover a simple subroutine responsible for certain physical events hitherto believed to be irregular. After all, algorithmic theories of the describable do encompass everything we will ever be able to talk and write about. Other things are simply beyond description.