
## Novel Theorems for EOMs and GTMs

Theorem 5.3   For $x \in B^{\sharp}$ with $P^E(x) > 0$,

$-\lg P^E(x) \leq K^E(x) \leq -\lg P^E(x) + K^E(\lceil -\lg P^E(x) \rceil) + O(1)$.  (36)

Using $K^E(y)$ for $y \in R$ interpreted as an integer -- compare Def. 2.8 -- this yields

$K^E(x) \leq -\lg P^E(x) + K^E(-\lg P^E(x)) + O(1)$.  (37)

That is, objects that are hard to describe (in the sense that they have only long enumerating descriptions) have low probability.

Proof. The left-hand inequality follows by definition. To show the right-hand side, one can build an EOM $T$ that computes $x$ using no more than $-\lg P^E(x) + K^E(\lceil -\lg P^E(x) \rceil) + O(1)$ input bits in a way inspired by Huffman coding [22]. The claim then follows from the Invariance Theorem. The trick is to arrange $T$'s computation such that $T$'s output converges yet never needs to decrease lexicographically. $T$ works as follows:

(A) Emulate $U$ to construct a real c.e. number $0.s$ encoded as a self-delimiting input program $r$; simultaneously run all (possibly forever running) programs on $U$ dovetail style; whenever the output of a prefix $q$ of any running program starts with some $x$ for the first time, set the variable $V_x := V_x + 2^{-l(q)}$ (if no program has ever created output starting with $x$ then first create $V_x$ initialized by 0); whenever the output of some extension $q'$ of $q$ (obtained by possibly reading additional input bits: $q' = q$ if none are read) lexicographically increases such that it does not equal $x$ any more, set $V_x := V_x - 2^{-l(q)}$.
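The dovetailing schedule of step (A) can be sketched in Python. This is only an illustrative toy model, not the paper's construction: programs are modeled as generators, and `dovetail` is a hypothetical helper showing how every program receives unboundedly many steps even though some never halt.

```python
def dovetail(gens, stages):
    """Interleave all generators dovetail style: at stage t, advance
    generators 0..t by one step each, so in the limit every generator
    gets infinitely many steps even if some of them run forever."""
    latest = {}                      # index -> most recently produced value
    for t in range(stages):
        for i, g in enumerate(gens[:t + 1]):
            try:
                latest[i] = next(g)  # one more computation step
            except StopIteration:
                pass                 # this program has halted
    return latest

def counter():                       # a "possibly forever running" program
    n = 0
    while True:
        yield n
        n += 1

# After 5 stages the non-halting counter has taken several steps, and the
# halting program has produced its single output 'a':
print(dovetail([counter(), iter(['a'])], 5))   # → {0: 4, 1: 'a'}
```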

(B) Simultaneously, starting at the right end of the unit interval $[0,1)$, as the $V_x$ are being updated, keep updating a chain of disjoint, consecutive, adjacent, half-open (at the right end) intervals $IV_x$ of size $V_x$ in alphabetic order on $x$, such that the right end of the $IV_x$ of the largest $x$ coincides with the right end of $[0,1)$, and $IV_y$ is to the right of $IV_x$ if $y \succ x$. After every variable update and each change of $s$, replace the output of $T$ by the $x$ of the $IV_x$ with $0.s \in IV_x$.

This will never violate the EOM constraints: the c.e. number $0.s$ cannot shrink, and since EOM outputs cannot decrease lexicographically, the interval boundaries of the $IV_x$ cannot grow (their negations are c.e., compare Lemma 4.1), hence $T$'s output cannot decrease.
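A snapshot of the interval chain in step (B) can be sketched as follows. This is a hypothetical toy model in Python (the function name `decode_chain` and the example weights are illustrative): intervals of the current sizes $V_x$ are chained leftward from the right end of $[0,1)$ in decreasing alphabetic order, and the output is the $x$ whose interval contains the input real $0.s$.

```python
from fractions import Fraction

def decode_chain(s_bits, V):
    """One snapshot of step (B): half-open intervals IV_x of size V[x],
    laid out right-to-left from the right end of [0,1) so that the
    largest x owns the rightmost interval.  Returns the x whose
    interval contains the input real 0.s, or None if 0.s is not yet
    covered by the chain."""
    point = Fraction(int(s_bits, 2), 2 ** len(s_bits))  # 0.s as a rational
    right = Fraction(1)
    for x in sorted(V, reverse=True):      # largest x first, at the right end
        left = right - V[x]
        if left <= point < right:          # half-open at the right end
            return x
        right = left
    return None

# Toy weights: IV_'1' = [1/2, 1), IV_'0' = [1/4, 1/2)
V = {'0': Fraction(1, 4), '1': Fraction(1, 2)}
print(decode_chain('11', V))   # 0.11 = 3/4 lies in IV_'1'
print(decode_chain('01', V))   # 0.01 = 1/4 lies in IV_'0'
```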

For $x \in B^*$ the $IV_x$ converge towards an interval $I_x$ of size $P^E(x)$. For $x \in B^{\infty}$ with $P^E(x) > 0$, we have: for any $\varepsilon > 0$ there is a time $t_0$ such that for all time steps $t > t_0$ in $T$'s computation, an interval $I^{\varepsilon}_x$ of size $P^E(x) - \varepsilon$ will be completely covered by certain $IV_y$ with $y \sqsubset x$. So for $x \in B^{\infty}$ the $IV_y$ also converge towards an interval $I_x$ of size $P^E(x)$. Hence $T$ will output larger and larger $y$ approximating $x$ from below, provided $0.s \in I_x$.

Since any interval of size $c$ within $[0,1)$ contains a number $0.z$ with $l(z) = \lceil -\lg c \rceil$, in both cases there is a number $0.s$ (encodable by some $r$ satisfying $l(r) \leq -\lg P^E(x) + K^E(\lceil -\lg P^E(x) \rceil) + O(1)$) with $0.s \in I_x$, such that $T(r)$ converges towards $x$, and therefore $K^E(x) \leq -\lg P^E(x) + K^E(\lceil -\lg P^E(x) \rceil) + O(1)$.
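The dyadic fact used above can be checked concretely: once the grid spacing $2^{-k}$ is at most $c$, some grid point must fall inside any half-open interval of length $c$. A quick Python sketch (the helper `dyadic_in` and its rounding convention are illustrative, not the paper's):

```python
import math

def dyadic_in(a, c):
    """Return a bit string z with 0.z in [a, a+c) and l(z) = ceil(-lg c):
    choose k with 2^-k <= c; then the smallest multiple of 2^-k that is
    >= a already lies inside the half-open interval [a, a+c)."""
    k = math.ceil(-math.log2(c))          # grid spacing 2^-k <= c
    m = math.ceil(a * 2 ** k)             # smallest multiple of 2^-k >= a
    assert m / 2 ** k < a + c             # it lies inside [a, a+c)
    return format(m, f'0{k}b')            # z: k bits, 0.z = m / 2^k

z = dyadic_in(0.61, 0.07)                 # interval [0.61, 0.68)
print(z, int(z, 2) / 2 ** len(z))         # → 1010 0.625
```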

Less symmetric statements can also be derived in very similar fashion:

Theorem 5.4   Let TM $T$ induce an approximable $P_T(x)$ for all $x$ (compare Defs. 4.10 and 4.12; an EOM would be a special case). Then for $x \in B^{\sharp}$, $P_T(x) > 0$:

$K^G(x) \leq -\lg P_T(x) + K^G(\lceil -\lg P_T(x) \rceil) + O(1)$.  (38)

Proof. Modify the proof of Theorem 5.3 for approximable, as opposed to c.e., interval boundaries and an approximable $0.s$.

A similar proof, but without the complication for the case $x \in B^{\infty}$, yields:

Theorem 5.5   Let $\mu$ denote an approximable semimeasure on $B^*$; that is, $\mu(x)$ is describable. Then for $\mu(x) > 0$:

$K^G(x) \leq -\lg \mu(x) + K^G(\lceil -\lg \mu(x) \rceil) + O(1)$  (39)

$\leq -\lg \mu(x) + 2 \lg (-\lg \mu(x)) + O(1)$.  (40)

As a consequence,

$\mu(x) \leq 2^{-K^G(x) + 2 \lg K^G(x) + O(1)}$.  (41)

Proof. Dovetailing over all $x \in B^*$, approximate the GTM-computable $\mu(x)$ in variables $V_x$ initialized by zero, and create a chain of adjacent intervals $IV_x$ analogously to the proof of Theorem 5.3.

The $IV_x$ converge against intervals $I_x$ of size $\mu(x)$. Hence $x$ is GTM-encodable by any program $r$ producing an output $0.s$ with $0.s \in I_x$: after every update, replace the GTM's output by the $x$ of the $IV_x$ with $0.s \in IV_x$. Similarly, if $0.s$ is in the union of adjacent intervals $IV_y$ of strings $y$ starting with $x$, then the GTM's output will converge towards some string starting with $x$. The rest follows in a way similar to the one described in the final paragraph of the proof of Theorem 5.3.
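The update-and-replace rule can be illustrated with a toy run in Python. The weights below are made-up successive approximations of some semimeasure (not taken from the paper), and `current_output` is a hypothetical helper: the decoded output for a fixed input real $0.s$ is recomputed after each update and eventually stabilizes.

```python
from fractions import Fraction

def current_output(point, V):
    """Right-anchored chain of half-open intervals of sizes V[x], in
    decreasing alphabetic order; return the x whose interval holds
    `point`, or None if `point` is not yet covered."""
    right = Fraction(1)
    for x in sorted(V, reverse=True):
        left = right - V[x]
        if left <= point < right:
            return x
        right = left
    return None

point = Fraction(5, 8)                    # the input real 0.s = 0.101
stages = [                                # improving approximations V_x
    {'11': Fraction(1, 8), '10': Fraction(1, 8), '01': Fraction(1, 16)},
    {'11': Fraction(1, 4), '10': Fraction(1, 8), '01': Fraction(1, 8)},
    {'11': Fraction(1, 4), '10': Fraction(1, 4), '01': Fraction(1, 8)},
]
print([current_output(point, V) for V in stages])   # → [None, '10', '10']
```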

Using the basic ideas in the proofs of Theorems 5.3 and 5.5 in conjunction with Corollary 4.3 and Lemma 4.2, one can also obtain statements such as:

Theorem 5.6   Let $\mu^E$ denote the universal CEM from Theorem 4.1. For $x \in B^*$,

$K^E(x) - K^E(\lceil -\lg \mu^E(x) \rceil) - O(1) \leq -\lg \mu^E(x) \leq K^E(x) + O(1)$.  (42)

While $P^E$ dominates $P^M$ and $P^G$ dominates $P^E$, the reverse statements are not true. In fact, given the results from Sections 3.2 and 5, one can now make claims such as the following ones:

Corollary 5.1   The following functions are unbounded:

$\frac{P^E(x)}{P^M(x)}; \quad K^M(x) - K^E(x); \quad K^E(x) - K^G(x)$.
Proof. For the cases $K^M(x) - K^E(x)$ and $\frac{P^E(x)}{P^M(x)}$, apply Theorems 5.2, 5.6 and the unboundedness of (10). For the case $K^E(x) - K^G(x)$, apply Theorems 3.3 and 5.3.

Juergen Schmidhuber 2003-02-13