J. Schmidhuber talk at Zürich Machine Learning and Data Science Meetup
15 April 2014
J. Schmidhuber's talk on Deep Learning at ETHZ has been published.
J. Schmidhuber during the talk

The talk mentions:

  • the history of backpropagation 1960-1981 and beyond
  • the fundamental Deep Learning (DL) problem of gradient-based neural networks (NNs) (1991)
  • a deep unsupervised stack of recurrent NNs to overcome the DL problem (the History Compressor, 1991)
  • purely supervised deep Long Short-Term Memory RNNs (LSTM) since 1995
  • how LSTM began setting standards in speech & connected handwriting recognition in the new millennium
  • how (in 2010) deep GPU-based backprop (3-5 decades old) + training pattern deformations (2 decades old) broke the MNIST handwriting benchmark record
  • the history of feedforward max-pooling convolutional nets (MPCNNs, 1979, 1989, 1999, 2007, 2011, …)
  • how GPU-based MPCNNs (since 2011) have won many contests where feedforward NNs are applicable: image recognition and segmentation, object detection …
  • how NN-based planning robots won the RoboCup in the fast league (2004)
  • Deep Reinforcement Learning through Compressed NN Search applied to RNN controllers that learn to process raw video input (2013)
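Since LSTM recurs throughout the talk, here is a minimal NumPy sketch of a single LSTM cell forward step, using the standard gate equations (input, forget, cell, output). All names, shapes, and initializations are illustrative assumptions, not code from the talk.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a basic LSTM cell (illustrative sketch).

    W: (4*H, D+H) stacked weights for the input, forget, cell, and
       output gates; b: (4*H,) stacked biases.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    g = np.tanh(z[2*H:3*H])    # candidate cell update
    o = sigmoid(z[3*H:4*H])    # output gate
    c = f * c_prev + i * g     # cell state carries long-range memory
    h = o * np.tanh(c)         # hidden state / output
    return h, c

# Toy usage: run a length-5 random sequence through the cell.
rng = np.random.default_rng(0)
D, H = 3, 4                                  # hypothetical sizes
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((5, D)):
    h, c = lstm_step(x, h, c, W, b)
```

The multiplicative forget gate on the cell state is what lets gradients flow over long time lags, addressing the fundamental DL problem mentioned above.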