Our recent research generalized Solomonoff's approach to the case of less restrictive nonenumerable universal priors that are still computable in the limit [50,52].
An object $X$ is formally describable if a finite amount of information
completely describes $X$ and only $X$. More to the point, $X$ should be
representable by a possibly infinite bitstring $x$ such that there is a
finite, possibly never halting program $p$ that computes $x$ and nothing
but $x$ in a way that modifies each output bit at most finitely many
times; that is, each finite beginning of $x$ eventually converges
and ceases to change. This constructive notion of formal describability
is less restrictive than the traditional notion of computability
[67], mainly because we do not insist on the existence
of a halting program that computes an upper bound of the convergence
time of $p$'s $n$-th output bit. Formal describability thus pushes
constructivism [5,1] to the extreme, barely
avoiding the nonconstructivism embodied by even less restrictive
concepts of describability (compare computability in the limit
[17,40,14] and $\Delta^0_n$-describability [42][31, p. 46-47]).
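The convergence requirement admits a compact restatement (the shorthand $x_n(t)$, for the value of the $n$-th output bit of $p$ after $t$ steps of computation, is introduced here for this exposition, not taken from [50,52]):
\[
\forall n \;\; \exists t_n \;\; \forall t \ge t_n : \quad x_n(t) = x_n ,
\]
so every bit settles after finitely many revisions, while no halting program is required to compute a bound on $t_n$ as a function of $n$.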
The traditional theory of inductive inference focuses on Turing machines
with one-way write-only output tape. This leads to the universal enumerable
Solomonoff-Levin (semi) measure. We introduced more general, nonenumerable,
but still limit-computable measures and
a natural hierarchy of generalizations of algorithmic
probability and Kolmogorov complexity [50,52],
suggesting that the ``true''
information content of some (possibly infinite) bitstring $x$ actually
is the size of the shortest nonhalting program that converges to $x$
and nothing but $x$ on a Turing machine that can edit its previous outputs. In fact,
this ``true'' content is often smaller than the traditional Kolmogorov complexity.
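A hedged sketch of the quantity meant here, in notation chosen for this exposition rather than taken verbatim from [50,52]: writing $\ell(p)$ for the length of program $p$, the generalized complexity of $x$ is
\[
K^G(x) \;=\; \min\bigl\{\,\ell(p) : p \mbox{ converges to } x \mbox{ and nothing but } x \mbox{ on such an output-editing machine}\,\bigr\}.
\]
Since any program that computes $x$ in the traditional sense in particular converges to $x$ here, $K^G(x)$ cannot exceed the traditional complexity by more than an additive constant; the point above is that it is often strictly smaller.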
We showed that there are Super Omegas computable in the limit yet more
random than Chaitin's ``number of wisdom'' Omega [9] (which
is maximally random in a weaker traditional sense),
and that any approximable measure of $x$ is small for any $x$ lacking a short description.
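For orientation, recall the classical object being generalized: for a universal prefix machine $U$, Chaitin's halting probability
\[
\Omega \;=\; \sum_{p:\, U(p)\ \mathrm{halts}} 2^{-\ell(p)}
\]
is enumerable, that is, approximable from below by dovetailing over all programs, with approximations that only increase. The Super Omegas of [50,52] are, roughly speaking, only computable in the limit: their approximations may be revised downward as well as upward before converging, and this weaker kind of describability is what leaves room for them to be even more random than $\Omega$.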
We also showed that there is a
universal cumulatively enumerable measure of $x$ based on the measure of
all enumerable $y$ lexicographically greater than $x$.
It is more dominant yet
just as limit-computable as Solomonoff's [52].
That is, if we are interested in limit-computable universal measures,
we should prefer the novel universal cumulatively enumerable measure
over the traditional enumerable one.
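A hedged sketch of the construction referred to above (the precise definitions are those of [52], not the ones improvised here): call a measure $\mu$ cumulatively enumerable if the cumulative quantity
\[
C\mu(x) \;=\; \sum_{y \succ_{\mathrm{lex}} x} \mu(y),
\]
the total measure of the enumerable $y$ lexicographically greater than $x$, is itself enumerable. Roughly, the universal measure of this class dominates every cumulatively enumerable measure, a class that includes the traditional enumerable ones, while remaining computable in the limit; this is the sense of ``more dominant yet just as limit-computable'' above.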
If we include such limit-computable distributions in our Bayesmix,
we again obtain sharp loss bounds for prediction based on
the mix [50,52].
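To make the mixture mechanics concrete in a setting where everything is trivially computable, here is a minimal sketch of Bayes-mixture prediction, writing the mix in the usual way as $\xi(x) = \sum_i w_i \mu_i(x)$ with prior weights $w_i$. The finite Bernoulli model class, the function name mix_predict, and all parameter names are illustrative choices for this sketch, not objects from [50,52]; the mixtures discussed above range over (limit-)computable distributions instead.

\begin{verbatim}
# Toy sketch: Bayes-mixture prediction over a small, fully computable
# model class of Bernoulli distributions.  The names and the model class
# are illustrative only; the cited results concern mixtures of
# (limit-)computable distributions.

def mix_predict(bits, thetas, weights):
    """Posterior-weighted probability that the next bit is 1.

    bits    -- observed binary prefix, e.g. [1, 1, 0, 1]
    thetas  -- candidate Bernoulli parameters, e.g. [0.1, 0.5, 0.9]
    weights -- prior weights w_i > 0 (summing to at most 1)
    """
    # Likelihood mu_i(bits) of the observed prefix under each candidate.
    likelihoods = []
    for theta in thetas:
        l = 1.0
        for b in bits:
            l *= theta if b == 1 else 1.0 - theta
        likelihoods.append(l)
    # Unnormalized posterior weights w_i * mu_i(bits).
    posterior = [w * l for w, l in zip(weights, likelihoods)]
    total = sum(posterior)
    # Mixture prediction xi(1 | bits) = sum_i w_i mu_i(bits,1) / xi(bits).
    return sum(p * theta for p, theta in zip(posterior, thetas)) / total

# After a prefix dominated by 1s, the mixture leans toward theta = 0.9:
print(mix_predict([1, 1, 1, 0, 1, 1], [0.1, 0.5, 0.9], [1/3, 1/3, 1/3]))
\end{verbatim}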
Our approach highlights differences between countable and uncountable sets. What are the potential consequences for physics? We argue that things such as uncountable time and space and incomputable probabilities actually should not play a role in explaining the world, for lack of evidence that they are really necessary [50]. Some may feel tempted to counter this line of reasoning by pointing out that for centuries physicists have calculated with continua of real numbers, most of them incomputable. Even quantum physicists who are ready to give up the assumption of a continuous universe usually do take for granted the existence of continuous probability distributions on their discrete universes, and Stephen Hawking explicitly said: ``Although there have been suggestions that space-time may have a discrete structure I see no reason to abandon the continuum theories that have been so successful.''

Note, however, that all physicists in fact have only manipulated discrete symbols, thus generating finite, describable proofs of their results derived from enumerable axioms. That real numbers really exist in a way transcending the finite symbol strings used by everybody may be a figment of imagination -- compare Brouwer's constructive mathematics [5,1] and the Löwenheim-Skolem Theorem [32,61], which implies that any first order theory with an uncountable model such as the real numbers also has a countable model. As Kronecker put it: ``Die ganze Zahl schuf der liebe Gott, alles Übrige ist Menschenwerk'' (``God created the integers, all else is the work of man'' [6]). Kronecker greeted with scepticism Cantor's celebrated insight [7] about real numbers, mathematical objects he believed did not even exist.
Assuming our future lies among the few (countably many) describable futures, we can ignore uncountably many nondescribable ones, in particular, the random ones. Adding the relatively mild assumption that the probability distribution from which our universe is drawn is cumulatively enumerable provides a theoretical justification of the prediction that the most likely continuations of our universe are computable through short enumeration procedures. In this sense Occam's razor is just a natural by-product of a computability assumption! But what about falsifiability? The pseudorandomness of our universe might be effectively undetectable in principle, because some approximable and enumerable patterns cannot be proven to be nonrandom in recursively bounded time.
The next sections, however, will introduce additional plausible assumptions that do lead to computable optimal prediction procedures.