K. Möller and S. Thrun.
Task modularization by network modulation.
In J. Rault, editor, Proceedings of Neuro-Nimes '90, pages
419-432, November 1990.
A. J. Robinson and F. Fallside.
The utility driven dynamic error propagation network.
Technical Report CUED/F-INFENG/TR.1, Cambridge University Engineering Department, 1987.
J. Schmidhuber.
A fixed size storage O(n³) time complexity learning algorithm for fully recurrent continually running networks.
Neural Computation, 4(2):243-248, 1992.
J. Schmidhuber.
Learning to control fast-weight memories: An alternative to dynamic recurrent networks.
Neural Computation, 4(1):131-139, 1992.
J. Schmidhuber.
An introspective network that can learn to run its own weight change algorithm.
In Proc. of the Intl. Conf. on Artificial Neural Networks, Brighton, pages 191-195. IEE, 1993.
J. Schmidhuber.
A neural network that embeds its own meta-levels.
In Proc. of the International Conference on Neural Networks '93, San Francisco. IEEE, 1993.
R. J. Williams.
Complexity of exact gradient computation algorithms for recurrent neural networks.
Technical Report NU-CCS-89-27, Boston: Northeastern University, College of Computer Science, 1989.
R. J. Williams and D. Zipser.
A learning algorithm for continually running fully recurrent neural networks.
Neural Computation, 1(2):270-280, 1989.