
GTMs dominate EOMs -- novel objects less regular than $\Omega$

We will now show that there are describable strings that are constructively computable in the limit and have a short GTM description, yet are ``even more random'' than Chaitin's Omegas, in the sense that even on EOMs they do not have any compact descriptions -- contrast this with Kurtz's and Kucera's nonconstructive randomness in the arithmetic hierarchy [27,26].

Theorem 3.3   For all $n$ there are $z \in B^*$ with

\begin{displaymath}
K^E(z) > n - O(1), \quad \mbox{yet} \quad K^G(z) \leq O(\log n).
\end{displaymath}

That is, $K^E(z) - K^G(z)$ is unbounded.

Proof. For $x \in B^* \backslash \{ \lambda \}$ and universal EOM $T$ define

\begin{displaymath}
\Xi(x) =
\sum_{y \in B^{\sharp}: 0.y > 0.x}   
\sum_{p: T(p) \leadsto y} 2^{-l(p)}.
\end{displaymath} (11)

The reader should not worry about the sum over uncountably many objects, because the dyadic expansion of $\Xi(x)$ is EOM-computable or c.e. The algorithm works as follows:
Algorithm A: Initialize the real-valued variable $V$ by 0; run all possible programs of EOM $T$ dovetail style such that the $n$-th step of the $i$-th program is executed in the $(n+i)$-th phase; whenever the output of a program prefix $q$ starts with some $y$ satisfying $0.y > 0.x$ for the first time, set $V := V + 2^{-l(q)}$; henceforth ignore continuations of $q$.
$V$ approximates $\Xi(x)$ from below in c.e. fashion -- infinite programs $p$ are not worrisome, as $T$ needs to read only a finite prefix of $p$ to observe $0.y > 0.x$ if the latter indeed holds. We will now show that knowledge of $\Xi(x)_n$, the first $n$ bits of $\Xi(x)$, allows for constructing a bitstring $z$ with $K^E(z) \geq n - O(1)$ when $x$ has low complexity.
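To make the dovetailing concrete, here is a minimal Python sketch of algorithm A. It is only an illustration under explicit assumptions: eom_output(q, steps) stands for a hypothetical simulator of the EOM $T$ that returns the output written by program prefix $q$ after the given number of steps, each enumerated bit string is treated as a separate program prefix, and only finitely many prefixes and phases are examined, so the result is merely a finite-resource lower bound on $\Xi(x)$.

\begin{verbatim}
from fractions import Fraction
from itertools import product

def binary_value(bits):
    # Interpret a bit string y as the exact dyadic rational 0.y (empty string -> 0).
    return Fraction(int(bits, 2), 2 ** len(bits)) if bits else Fraction(0)

def approximate_xi(x, eom_output, max_len, phases):
    # Finite-resource lower bound on Xi(x) in the spirit of algorithm A.
    # eom_output(q, steps) is a HYPOTHETICAL simulator of the EOM T: it is
    # assumed to return the output string written by program prefix q after
    # the given number of steps.  Only prefixes up to length max_len and the
    # first `phases` dovetailing phases are examined.
    target = binary_value(x)
    programs = [''.join(bits) for n in range(1, max_len + 1)
                for bits in product('01', repeat=n)]
    credited = set()     # prefixes q already credited with weight 2^{-l(q)}
    V = Fraction(0)      # current lower bound on Xi(x)
    for phase in range(2, phases + 1):
        # The n-th step of the i-th program is executed in the (n+i)-th phase.
        for i, q in enumerate(programs, start=1):
            steps = phase - i
            if steps < 1 or q in credited:
                continue
            if any(q.startswith(r) for r in credited):
                continue                      # ignore continuations of credited prefixes
            y = eom_output(q, steps)
            if binary_value(y) > target:      # output y satisfies 0.y > 0.x
                V += Fraction(1, 2 ** len(q)) # add the weight 2^{-l(q)}
                credited.add(q)
    return V
\end{verbatim}

No such exhaustive simulator exists for a universal EOM, of course; the sketch only illustrates the dovetailing schedule and the bookkeeping of the weights $2^{-l(q)}$.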

Suppose we know $\Xi(x)_n$. Once algorithm A above yields $V > \Xi(x)_n$ we know that no programs $p$ with $l(p) < n$ will contribute any more to $V$ (the remaining gap $\Xi(x) - V$ is below $2^{-n}$, while each such program would contribute at least $2^{-n+1}$). Choose the shortest $z$ satisfying $0.z = (0.y_{min} - 0.x)/2$, where $y_{min}$ is the lexicographically smallest $y$ previously computed by algorithm A such that $0.y > 0.x$. Then $z$ cannot be among the strings $T$-describable with fewer than $n$ bits. Using the Invariance Theorem (compare Proposition 3.1) we obtain $K^E(z) \geq n - O(1)$.
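The step from $\Xi(x)_n$ to $z$ can be sketched in the same hedged spirit. The generator algorithm_a_events(x) below is a hypothetical streaming variant of the sketch above, assumed to yield pairs $(V, y)$ in the order in which algorithm A credits them; the rest follows the construction literally.

\begin{verbatim}
from fractions import Fraction

def binary_value(bits):
    # As in the previous sketch: the exact dyadic rational 0.bits.
    return Fraction(int(bits, 2), 2 ** len(bits)) if bits else Fraction(0)

def shortest_binary_expansion(v):
    # Shortest bit string z with 0.z == v, for a dyadic rational 0 < v < 1.
    bits = []
    while v:
        v *= 2
        if v >= 1:
            bits.append('1')
            v -= 1
        else:
            bits.append('0')
    return ''.join(bits)

def construct_z(x, xi_prefix, algorithm_a_events):
    # Given the first n bits of Xi(x) (xi_prefix), build z as in the proof.
    # algorithm_a_events(x) is a HYPOTHETICAL generator yielding pairs (V, y)
    # in the order in which algorithm A credits them: V is the updated lower
    # bound on Xi(x), y the output string that triggered it (0.y > 0.x).
    threshold = binary_value(xi_prefix)              # 0.(Xi(x)_n)
    y_min = None
    for V, y in algorithm_a_events(x):
        if y_min is None or y < y_min:               # lexicographically smallest y so far
            y_min = y
        if V > threshold:                            # programs shorter than n bits exhausted
            break
    z_value = (binary_value(y_min) - binary_value(x)) / 2   # 0.z = (0.y_min - 0.x)/2
    return shortest_binary_expansion(z_value)
\end{verbatim}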

While prefixes of $\Omega$ are greatly compressible on EOMs, $z$ is not. On the other hand, $z$ is compactly G-describable: a GTM given $x$ and $n$ can run algorithm A and revise its guess for $z$ whenever its approximation of $\Xi(x)_n$ changes, converging to $z$ in the limit; hence $K^G(z) \leq K(x) + K(n) + O(1)$. For instance, choosing a low-complexity $x$, we have $K^G(z) \leq O(K(n)) \leq O(\log n)$.



The above construction shows that the novel objects are ``just as computable in the limit'' as $\Omega$, despite being more random. An interesting topic of future research is whether there exists a most random describable object; we doubt it.

The discussion above also reveals a natural complexity hierarchy. Ignoring additive constants, we have

\begin{displaymath}
K^G(x) \leq K^E(x) \leq K^M(x),
\end{displaymath} (12)

where for each ``$\leq$'' relation above there are $x$ for which ``$\leq$'' can be replaced by ``$<$''.

