Duration
1997-
Funding  
SNF

 
Long Short-Term Memory (LSTM)
Most work in machine learning focuses on machines with reactive behavior. Recurrent neural networks (RNNs), however, are more general sequence processors inspired by human brains. They have adaptive feedback connections and are in principle as powerful as any computer. Until recently, RNNs could not learn to look far back into the past, because error signals propagated back through time tend to vanish. Our novel RNN architecture, "Long Short-Term Memory" (LSTM), overcomes this fundamental problem of traditional RNNs and efficiently learns to solve many previously unlearnable tasks, including recognition of certain context-sensitive languages, reinforcement learning in partially observable environments, metalearning of fast online learning algorithms, and music composition.
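
For readers unfamiliar with the architecture, below is a minimal sketch in Python/NumPy of one forward step of a standard LSTM cell with input, forget, and output gates. The function name, weight layout, and shapes are illustrative assumptions for this sketch, not code from this project.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a standard LSTM cell with a forget gate (illustrative sketch).

    x      : input vector, shape (D,)
    h_prev : previous hidden state, shape (H,)
    c_prev : previous cell state, shape (H,)
    W      : stacked gate weights, shape (4*H, D + H)
    b      : stacked gate biases, shape (4*H,)
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b   # pre-activations for all four gates
    i = sigmoid(z[0:H])          # input gate: how much new content enters the cell
    f = sigmoid(z[H:2*H])        # forget gate: how much old cell content is kept
    o = sigmoid(z[2*H:3*H])      # output gate: how much of the cell is exposed
    g = np.tanh(z[3*H:4*H])      # candidate cell update
    c = f * c_prev + i * g       # additive cell update
    h = o * np.tanh(c)           # hidden state passed to the next time step
    return h, c

The additive cell-state update (c = f * c_prev + i * g) is the key design choice: it allows error signals to flow back over long time lags without vanishing, which is what lets LSTM learn dependencies far back in the past.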
People
Douglas Eck (Fred Cummins)
Alex Graves (Felix Gers)
Coordinator
Juergen Schmidhuber

 

 

 