More work on coevolving recurrent neurons:
F. Gomez and J. Schmidhuber.
Coevolving recurrent neurons learn deep memory POMDPs.
In Proc. of the 2005 conference on genetic and
evolutionary computation (GECCO), Washington, D. C.,
pp. 1795-1802, ACM Press, New York, NY, USA, 2005.
Nominated for Best Paper in Coevolution.
PDF.
Simultaneously evolves networks at two levels of granularity:
full networks and neurons. Applied to
POMDP learning tasks that require creating
short-term memories spanning up to thousands of time steps, the method is
faster and simpler than the previous best conventional
reinforcement learning systems.
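The two-level idea can be illustrated with a toy cooperative coevolution loop (a hypothetical sketch, not the authors' code): neurons are evolved in separate subpopulations, complete networks are assembled from one neuron per subpopulation, and a network's fitness is credited back to the neurons it contains. All names, the toy task, and the parameters below are illustrative assumptions.

```python
import random

random.seed(0)

N_SUBPOPS, POP_SIZE, N_WEIGHTS = 2, 20, 3  # toy sizes, chosen for illustration
INPUT = [1.0, -1.0]
TARGET = 0.25  # toy task: network output on INPUT should approach this value

def neuron_out(w, x):
    # linear neuron with bias (illustrative; real tasks use recurrent neurons)
    return w[0] * x[0] + w[1] * x[1] + w[2]

def network_out(neurons, x):
    # network output = mean of its neurons' activations
    return sum(neuron_out(w, x) for w in neurons) / len(neurons)

def fitness(neurons):
    return -(network_out(neurons, INPUT) - TARGET) ** 2

# one subpopulation of candidate neurons per network position
subpops = [[[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
            for _ in range(POP_SIZE)] for _ in range(N_SUBPOPS)]

for gen in range(200):
    # evaluate: each neuron's score is averaged over networks it participates in
    scores = [[0.0] * POP_SIZE for _ in range(N_SUBPOPS)]
    trials = [[0] * POP_SIZE for _ in range(N_SUBPOPS)]
    for _ in range(50):
        idx = [random.randrange(POP_SIZE) for _ in range(N_SUBPOPS)]
        f = fitness([subpops[s][idx[s]] for s in range(N_SUBPOPS)])
        for s in range(N_SUBPOPS):
            scores[s][idx[s]] += f
            trials[s][idx[s]] += 1
    # selection and mutation happen independently within each subpopulation
    for s in range(N_SUBPOPS):
        avg = [scores[s][i] / trials[s][i] if trials[s][i] else -1e9
               for i in range(POP_SIZE)]
        order = sorted(range(POP_SIZE), key=lambda i: avg[i], reverse=True)
        elite = [subpops[s][i] for i in order[:POP_SIZE // 2]]
        subpops[s] = elite + [[w + random.gauss(0, 0.1)
                               for w in random.choice(elite)]
                              for _ in range(POP_SIZE - len(elite))]

best = [subpops[s][0] for s in range(N_SUBPOPS)]
print(network_out(best, INPUT))
```

The key design point is credit assignment: a neuron is never scored alone, only through the networks it helps form, which is what makes the search cooperative.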


Related work on fast weights:
J. Schmidhuber. Learning to
control fast-weight memories: An alternative to recurrent nets.
Neural Computation, 4(1):131-139, 1992.
PDF.
HTML.
Compare pictures (German).
A slowly changing, gradient-based feedforward neural net learns to quickly
manipulate short-term memory in fast synapses of another net.
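A minimal toy sketch of the idea (assumed shapes and update rule for illustration only; not the 1992 architecture): a "slow" network maps the current input to additive updates of the "fast" weights of a second network, which then computes the output. In the paper the slow net is trained by gradient descent; here its weights are simply random.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 3, 2
W_slow = rng.normal(0, 0.1, size=(n_in * n_out, n_in))  # slow weights (trained in the paper)
W_fast = np.zeros((n_out, n_in))                         # fast short-term memory

def step(x):
    global W_fast
    # slow net emits a weight-change matrix as a function of the input
    dW = (W_slow @ x).reshape(n_out, n_in)
    W_fast = 0.9 * W_fast + dW      # decaying fast weights store recent context
    return np.tanh(W_fast @ x)      # fast net computes the output

for t in range(5):
    y = step(rng.normal(size=n_in))
print(y.shape)  # (2,)
```

The short-term memory lives entirely in `W_fast`: past inputs change how the fast net responds to future inputs, without any recurrent connections.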
More fast weights:
J. Schmidhuber.
Reducing the ratio between learning complexity and number of
time-varying variables in fully recurrent nets.
In Proc. ICANN'93, Amsterdam, pages 460-463. Springer, 1993.
PDF.
HTML.
In a certain sense, short-term memory in fast synapses can be more
efficient than short-term memory in recurrent connections.
A related coevolution method called COSYNE:
F. Gomez, J. Schmidhuber, R. Miikkulainen.
Accelerated Neural Evolution through
Cooperatively Coevolved Synapses.
Journal of Machine Learning Research (JMLR),
9:937-965, 2008.
PDF.
F. Gomez, J. Schmidhuber, and R. Miikkulainen (2006).
Efficient Non-Linear Control through Neuroevolution.
Proceedings of the European Conference
on Machine Learning (ECML-06, Berlin).
PDF.
A new, general method that outperforms many previous approaches
on difficult control tasks.
More recent work (2013):
Compressed Network Search Finds Complex Neural Controllers with a Million Weights,
which learns to drive without a teacher from raw high-dimensional video input.
 
Related work on evolution for supervised sequence learning:
a new class of learning algorithms for
supervised recurrent neural networks (RNNs) that outperforms
previous methods:
Evolino (2005).
Related work on Compressed Network Evolution (1995):
Many practical algorithms can evolve hundreds of
adaptive parameters, but not millions. Ours can, by evolving compact, compressed descriptions (programs) of huge networks.
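The compression idea can be sketched as follows (a hypothetical illustration, not the published encoding): evolve only a small block of low-frequency cosine coefficients and decode it into a much larger weight matrix, so the evolutionary search space stays tiny while the network does not. The `decode` helper and all sizes below are assumptions for the example.

```python
import numpy as np

def decode(coeffs, rows, cols):
    """Decode a small coefficient block, zero-padded to the full
    weight-matrix size, via a separable cosine-basis transform."""
    k = coeffs.shape[0]
    full = np.zeros((rows, cols))
    full[:k, :k] = coeffs  # only low frequencies are evolved; the rest are zero
    def basis(n):
        i = np.arange(n)
        # rows index positions, columns index frequencies
        return np.cos(np.pi * np.outer(i + 0.5, i) / n)
    return basis(rows) @ full @ basis(cols).T

genome = np.random.default_rng(0).normal(size=(4, 4))  # 16 evolved parameters
W = decode(genome, 100, 50)                            # decoded into 5000 weights
print(W.shape)  # (100, 50)
```

Because neighboring weights share low-frequency structure, a 16-parameter genome can still express smooth, large-scale weight patterns across all 5000 connections.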
J. Koutnik, G. Cuccu, J. Schmidhuber, F. Gomez.
Evolving Large-Scale Neural Networks for Vision-Based Reinforcement Learning.
In Proceedings of the Genetic and Evolutionary
Computation Conference (GECCO), Amsterdam, 2013.
PDF.
J. Koutnik, F. Gomez, J. Schmidhuber.
Searching for Minimal Neural Networks in Fourier Space.
The 3rd Conference on Artificial General Intelligence (AGI-10), 2010.
PDF.
J. Schmidhuber.
Discovering solutions with low Kolmogorov complexity
and high generalization capability.
In A. Prieditis and S. Russell, editors, Machine Learning:
Proceedings of the Twelfth International Conference (ICML 1995),
pages 488-496. Morgan
Kaufmann Publishers, San Francisco, CA, 1995.
PDF.
HTML.
Fibonacci web design
by J. Schmidhuber
 
