
PREVIOUS ALGORITHMS FOR MAKING NETS ``SIMPLE''

There are also numerous heuristic constructive methods, which grow the network whenever it underfits the training data. MDL approaches in other areas of machine learning include [Quinlan and Rivest, 1989; Gao and Li, 1989; Milosavljevic and Jurka, 1993; Pednault, 1989]. Among the implemented methods, neither the neural net approaches nor the others are general in the sense of Solomonoff, Kolmogorov, and Levin. All previous implementations use measures of ``simplicity'' that lack the universality and elegance of those based on Kolmogorov complexity and algorithmic information theory, and many rest on ad hoc (usually Gaussian) priors.

The remainder of this paper is mostly devoted to simulations of the more general method based on the universal prior, self-sizing programs, and the probabilistic search algorithm favoring candidates with low Levin complexity. On toy problems that appear trivial but are not, it will be demonstrated that the approach can achieve generalization results unmatched by more traditional neural net algorithms. It should be noted, however, that this says little about the method's applicability to real-world tasks.
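
To make the search bias concrete, here is a minimal Python sketch of a Levin-style search schedule; it is an illustration under stated assumptions, not the implementation used in the paper. Levin complexity can be written Kt(p) = |p| + log t(p), i.e. program length in bits plus the logarithm of the runtime needed to produce the output. The names run_program, is_solution, and max_phase are hypothetical: in phase i, every bitstring program of length l <= i is run for 2^(i-l) steps, so a program's share of the total runtime shrinks exponentially with its length and low-Kt candidates are examined first.

    import itertools
    import math

    def levin_complexity(length, runtime):
        # Kt = program length in bits + log2(runtime); lower is "simpler".
        return length + math.log2(max(runtime, 1))

    def levin_search(run_program, is_solution, max_phase=20):
        # Phase i: run every bitstring program p with len(p) <= i for
        # 2**(i - len(p)) steps, so each length class gets roughly equal
        # total time and short, fast programs are tried first.
        # run_program(bits, steps) -> output or None (hypothetical interpreter);
        # is_solution(output) -> bool.
        for phase in range(1, max_phase + 1):
            for length in range(1, phase + 1):
                budget = 2 ** (phase - length)  # step budget in this phase
                for bits in itertools.product('01', repeat=length):
                    program = ''.join(bits)
                    output = run_program(program, budget)
                    if output is not None and is_solution(output):
                        return program, levin_complexity(length, budget)
        return None  # no solution found within max_phase

For example, with the toy interpreter run_program = lambda p, steps: p and is_solution = lambda out: out == '101', the search probes all shorter bitstrings before returning '101' together with its Kt value.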



