More work on coevolving recurrent neurons:
F. Gomez and J. Schmidhuber.
Co-evolving recurrent neurons learn deep memory POMDPs.
In Proc. of the 2005 conference on genetic and
evolutionary computation (GECCO), Washington, D. C.,
pp. 1795-1802, ACM Press, New York, NY, USA, 2005.
Nominated for Best Paper in Coevolution.
Simultaneously evolves networks at two levels of granularity:
full networks and neurons. Applied to
POMDP learning tasks that require creating
short-term memories spanning up to thousands of time steps, the method is
faster and simpler than the previous best conventional
reinforcement learning systems.
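To make the neuron-level granularity concrete, here is a minimal sketch of coevolving neuron subpopulations, where complete networks are assembled by drawing one candidate neuron per position and fitness is credited back to the neurons used. The toy genome, the proxy fitness function, and all constants are illustrative assumptions, not the authors' actual system.

```python
# Hedged sketch: neuron-level cooperative coevolution (toy version).
# TARGET and neuron_fitness_proxy are illustrative stand-ins for a real
# POMDP evaluation; they are assumptions, not from the paper.
import random

random.seed(0)

N_NEURONS = 4        # neuron positions per network
SUBPOP_SIZE = 10     # candidate neurons per subpopulation
N_WEIGHTS = 3        # weights per neuron (toy genome)
TARGET = [0.5, -0.2, 0.1]  # toy optimum for the proxy fitness

def neuron_fitness_proxy(neuron):
    # Toy fitness: negative squared distance to a target vector.
    return -sum((w - t) ** 2 for w, t in zip(neuron, TARGET))

def network_fitness(network):
    # A network's fitness is the summed proxy fitness of its neurons.
    return sum(neuron_fitness_proxy(n) for n in network)

# One subpopulation of candidate neuron genomes per network position.
subpops = [[[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
            for _ in range(SUBPOP_SIZE)]
           for _ in range(N_NEURONS)]

for generation in range(30):
    # Assemble networks by drawing one neuron per subpopulation,
    # then credit each drawn neuron with the network's fitness.
    credit = [[0.0] * SUBPOP_SIZE for _ in range(N_NEURONS)]
    counts = [[0] * SUBPOP_SIZE for _ in range(N_NEURONS)]
    for _ in range(20):
        idxs = [random.randrange(SUBPOP_SIZE) for _ in range(N_NEURONS)]
        net = [subpops[p][i] for p, i in enumerate(idxs)]
        f = network_fitness(net)
        for p, i in enumerate(idxs):
            credit[p][i] += f
            counts[p][i] += 1
    # Selection and mutation inside each subpopulation.
    for p in range(N_NEURONS):
        ranked = sorted(range(SUBPOP_SIZE),
                        key=lambda i: credit[p][i] / max(counts[p][i], 1),
                        reverse=True)
        elite = [subpops[p][i] for i in ranked[:SUBPOP_SIZE // 2]]
        children = [[w + random.gauss(0, 0.1) for w in random.choice(elite)]
                    for _ in range(SUBPOP_SIZE - len(elite))]
        subpops[p] = elite + children

best_net = [max(sp, key=neuron_fitness_proxy) for sp in subpops]
print(round(network_fitness(best_net), 3))
```

This shows only the neuron-level half of the scheme; the published method additionally evolves full networks alongside the neuron subpopulations.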
Related work on fast weights:
J. Schmidhuber. Learning to
control fast-weight memories: An alternative to recurrent nets.
Neural Computation, 4(1):131-139, 1992.
A slowly changing, gradient-based feedforward neural net learns to quickly
manipulate short-term memory in fast synapses of another net.
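The mechanism can be sketched as follows: a slow net emits updates to the weights of a fast net, and those accumulating fast weights carry the short-term memory across time steps. The two-unit sizes, the decay constant, and the hand-set slow-net parameters are illustrative assumptions, not the paper's exact architecture.

```python
# Hedged sketch of the fast-weight idea: a "slow" net (here with fixed,
# hand-set parameters for brevity; in the paper they are learned by
# gradient descent) emits updates to the weights of a "fast" net.
import math

def slow_net(x, slow_params):
    # Slow net: linear layer + tanh, emitting a 2x2 fast-weight update.
    out = [math.tanh(sum(w * xi for w, xi in zip(row, x)))
           for row in slow_params]
    return [out[0:2], out[2:4]]

def fast_net(x, fast_weights):
    # Fast net: a plain linear layer using the accumulated fast weights.
    return [sum(w * xi for w, xi in zip(row, x)) for row in fast_weights]

slow_params = [[0.5, -0.3], [0.1, 0.8], [-0.7, 0.2], [0.4, 0.4]]

fast_w = [[0.0, 0.0], [0.0, 0.0]]  # short-term memory lives here
DECAY = 0.9
sequence = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
for x in sequence:
    delta = slow_net(x, slow_params)
    for i in range(2):
        for j in range(2):
            # Fast synapses decay and are overwritten by the slow net,
            # so they retain a trace of recent inputs.
            fast_w[i][j] = DECAY * fast_w[i][j] + delta[i][j]
    y = fast_net(x, fast_w)
    print([round(v, 3) for v in y])
```

Because `fast_w` persists and decays across steps, the fast net's response to an input depends on what the slow net wrote into the synapses earlier in the sequence.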
More fast weights:
J. Schmidhuber.
Reducing the ratio between learning complexity and number of
time-varying variables in fully recurrent nets.
In Proc. ICANN'93, Amsterdam, pages 460-463. Springer, 1993.
In a certain sense, short-term memory in fast synapses can be more
efficient than short-term memory in recurrent connections.
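One illustrative way to read the efficiency remark is a simple count of time-varying storage variables; this back-of-the-envelope calculation is an assumption about the intuition, not taken from the paper.

```python
# A net with n units can hold short-term memory in its n unit
# activations, while its fast synapses offer on the order of n*n
# time-varying storage variables (illustrative count only).
n = 100
activation_memory_vars = n        # time-varying activations
fast_synapse_memory_vars = n * n  # time-varying fast weights
print(activation_memory_vars, fast_synapse_memory_vars)
```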
A related co-evolution method called COSYNE:
F. Gomez, J. Schmidhuber, R. Miikkulainen.
Accelerated Neural Evolution through
Cooperatively Coevolved Synapses.
Journal of Machine Learning Research (JMLR).
See also:
F. Gomez, J. Schmidhuber, and R. Miikkulainen (2006).
Efficient Non-Linear Control through Neuroevolution.
Proceedings of the European Conference
on Machine Learning (ECML-06, Berlin).
A new, general method that outperforms many others
on difficult control tasks.
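The cooperatively coevolved synapses idea can be sketched as follows: one subpopulation per network weight, complete networks formed by taking co-indexed values from each subpopulation, and subpopulations permuted after each generation so synapses recombine across networks. The toy fitness function and all constants are illustrative assumptions, not the published benchmark setup.

```python
# Hedged sketch of synapse-level cooperative coevolution in the spirit
# of COSYNE. The toy fitness is a stand-in for a real control task.
import random

random.seed(1)

N_WEIGHTS = 5       # weights in the (toy) network genome
POP = 12            # members per weight subpopulation
TARGET = [0.3, -0.5, 0.8, 0.0, -0.1]   # toy optimum

def fitness(weights):
    # Toy control stand-in: closer to TARGET is better.
    return -sum((w - t) ** 2 for w, t in zip(weights, TARGET))

# subpop[j][i] = candidate value for weight j in network i.
subpop = [[random.uniform(-1, 1) for _ in range(POP)]
          for _ in range(N_WEIGHTS)]

for gen in range(60):
    # Network i is column i across all weight subpopulations.
    nets = [[subpop[j][i] for j in range(N_WEIGHTS)] for i in range(POP)]
    scores = [fitness(net) for net in nets]
    order = sorted(range(POP), key=lambda i: scores[i], reverse=True)
    elite = order[:POP // 3]
    # Replace the worst networks with mutated copies of elite ones.
    for i in order[POP // 3:]:
        parent = random.choice(elite)
        for j in range(N_WEIGHTS):
            subpop[j][i] = subpop[j][parent] + random.gauss(0, 0.1)
    # Permute each subpopulation so synapses mix across networks.
    for j in range(N_WEIGHTS):
        random.shuffle(subpop[j])

best = max(([subpop[j][i] for j in range(N_WEIGHTS)] for i in range(POP)),
           key=fitness)
print(round(fitness(best), 4))
```

The per-generation shuffle is the distinctive step: it decouples a good synapse value from the particular network it first appeared in, letting it be tested in many weight combinations.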
Related work on evolution for supervised sequence learning:
a new class of learning algorithms for
supervised RNNs,
by J. Schmidhuber.