PHD STUDENT FELLOWSHIP
We just hired Alex Graves (Cambridge).
Our budget is limited, so we sadly were not able to consider
several great candidates with impressive CVs. If you were among them,
I hope you are not too disappointed; I would like to thank you once
more for your efforts, and I wish you all the best for your future career!
Juergen Schmidhuber
Note: we expect to have a very similar new job opening in the
near future, with a focus on recurrent nets for robotics.
If you applied for the position above, we will keep your files.
Otherwise please send a new application like the one
requested in the old announcement below:
We are offering a fellowship for an outstanding PhD student
interested
in state-of-the-art artificial recurrent neural networks (RNNs).
Possible
backgrounds are computer science, physics, mathematics, etc.
The initial appointment would be for 2 years,
starting in 2001 or 2002, with the possibility of extension.
IDSIA is generally methodical and thorough in its professional searches,
and may take several years to fill a position in a targeted field,
so failure to make an appointment in any given year should not be
misinterpreted as a loss of interest in that field.
The new student will interact with Juergen Schmidhuber,
Doug Eck, and other people at IDSIA.
(Recently Felix Gers
finished his PhD thesis on RNNs.)
Why RNNs?
They can implement almost arbitrary sequential
behavior. They are biologically more plausible and computationally
more powerful than other adaptive models such as feedforward
networks, Hidden Markov Models, Support Vector Machines, etc.
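For concreteness, the defining feature of a fully recurrent net is a
hidden state that feeds back into itself at every time step. Below is
a minimal NumPy sketch of one such step; the weight names and sizes
are illustrative assumptions, not IDSIA code:

    # Minimal sketch of one step of a fully recurrent ("vanilla") RNN.
    # Weight names and sizes are illustrative assumptions.
    import numpy as np

    def rnn_step(x, h_prev, W_xh, W_hh, b):
        # The previous hidden state h_prev feeds back into the update;
        # this feedback is what lets the net realize sequential behavior.
        return np.tanh(W_xh @ x + W_hh @ h_prev + b)

    rng = np.random.default_rng(0)
    n_in, n_hid = 3, 5
    W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))
    W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))
    b = np.zeros(n_hid)
    h = np.zeros(n_hid)
    for t in range(10):            # unroll over a short random sequence
        h = rnn_step(rng.normal(size=n_in), h, W_xh, W_hh, b)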
Making RNNs learn from examples used to be difficult, though. A
recently developed RNN called "Long Short-Term Memory" (LSTM)
overcomes the problems of traditional RNNs and efficiently learns
previously unlearnable solutions to numerous tasks, using no more
than O(1) computations per weight and time step (a minimal sketch
follows the list below): (1) Recognition of temporally extended,
noisy patterns; (2) Recognition of regular and simple context-free
and context-sensitive languages; (3) Recognition of the temporal
order of widely separated events; (4) Extraction of information
conveyed by the temporal distance between events; (5) Generation of
precisely timed rhythms; (6) Stable generation of smooth periodic
trajectories; (7) Robust storage of high-precision real numbers
across extended time intervals; (8) Prediction of chaotic and other
time series; (9) Reinforcement learning in partially observable
environments; (10) Metalearning of fast online learning algorithms.
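To make the mechanism concrete, here is a minimal NumPy sketch of one
LSTM step with input, forget, and output gates, in the spirit of
Hochreiter & Schmidhuber (1997) and the forget gates of Gers et al.
(2000). This is an illustrative sketch under assumed names and sizes,
not the reference code linked from Felix's page:

    # Minimal sketch of one LSTM step (input, forget, output gates).
    # All names and sizes are illustrative assumptions, not IDSIA code.
    import numpy as np

    def sigmoid(u):
        return 1.0 / (1.0 + np.exp(-u))

    def lstm_step(x, h_prev, c_prev, W, b):
        # W maps the concatenated [x, h_prev] to four pre-activations.
        # Each weight is touched exactly once per step, hence the O(1)
        # computations per weight and time step mentioned above.
        z = W @ np.concatenate([x, h_prev]) + b
        n = h_prev.size
        i = sigmoid(z[:n])        # input gate: how much new input to write
        f = sigmoid(z[n:2*n])     # forget gate: how much old state to keep
        o = sigmoid(z[2*n:3*n])   # output gate: how much state to expose
        g = np.tanh(z[3*n:])      # candidate cell input
        c = f * c_prev + i * g    # gated cell ("constant error carousel")
        h = o * np.tanh(c)        # new hidden state
        return h, c

    rng = np.random.default_rng(0)
    n_in, n_hid = 4, 8
    W = rng.normal(scale=0.1, size=(4 * n_hid, n_in + n_hid))
    b = np.zeros(4 * n_hid)
    h, c = np.zeros(n_hid), np.zeros(n_hid)
    for t in range(20):           # run over a short random input sequence
        h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)

The additively updated cell state c is what lets error signals flow
back across long time lags, in contrast to the vanilla step sketched
earlier, whose repeated squashing makes long-lag credit assignment
hard.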
The goal of the project is to further improve the state of the art in
RNN research and to apply RNNs to interesting tasks, including
music composition and music interpretation.
You might want to check out the following recent publications on RNNs:
S. Hochreiter and J. Schmidhuber.
Long Short-Term Memory.
Neural Computation, 9(8):1735-1780, 1997.
F. A. Gers and J. Schmidhuber and F. Cummins.
Learning to Forget: Continual Prediction with LSTM.
Neural Computation, 12(10):2451-2471, 2000.
F. A. Gers and J. Schmidhuber.
LSTM recurrent networks learn simple context free and context sensitive languages.
IEEE Transactions on Neural Networks, 2001, in press.
Please also find numerous additional publications on LSTM on the
home pages of Juergen Schmidhuber, Doug Eck, and Felix Gers.
Felix's home page also has pointers to LSTM source code.
SALARY: roughly SFR 35,000 per year. Low taxes.
No teaching duties - just research towards the PhD degree.
Travel funding is available for papers accepted at
important conferences.
Applicants should submit: (i) a detailed curriculum vitae, (ii) a list
of three references and their email addresses, and (iii) a concise
statement of their research interests (two pages max).
Please send all documents to:
Juergen Schmidhuber,
IDSIA, Galleria 2, 6928 Manno (Lugano), Switzerland.
Applications in plain ASCII format can also be submitted by email
(only small files, please) to juergen@idsia.ch.
Do NOT send DOC, PDF, or large PostScript files.
Instead, send WWW pointers to PostScript files.
Please connect your first and last name by a dot "." in the subject
header, and add a meaningful extension. For instance, if
your name is John Smith, then your messages could have headers
such as:
subject: John.Smith.txt,
subject: John.Smith.cv.txt,
subject: John.Smith.statement.txt,
subject: John.Smith.correspondence.txt....
This will facilitate appropriate filing of your application materials.
Thanks a lot!
ABOUT IDSIA.
Our research focuses on artificial neural nets, reinforcement
learning, complexity and generalization issues, unsupervised learning
and information theory, forecasting, artificial ants, combinatorial
optimization, and evolutionary computation.
IDSIA is small but visible, competitive, and influential. IDSIA's algorithms hold the world records for
several important operations research benchmarks (see Nature 406(6791):39-42 for
an overview of artificial ant algorithms developed at IDSIA). In the
"X-Lab Survey" by Business Week magazine,
IDSIA was ranked fourth in the category "COMPUTER SCIENCE -
BIOLOGICALLY INSPIRED" - after the Santa Fe Institute,
Stanford University, and EPFL (also in Switzerland).
Its comparatively tiny size notwithstanding, IDSIA also ranked among
the top ten labs
worldwide in the broader category "ARTIFICIAL INTELLIGENCE".
IDSIA is located near the beautiful city of Lugano in Ticino,
the scenic southernmost province of Switzerland - the country where
special relativity and the WWW originated.
Milano,
Italy's center of fashion and finance, is 1 hour away; Venice, 3 hours.
Our collaborators at
CSCS (the Swiss supercomputing center) are right beneath us;
we are also affiliated with the University of Lugano and SUPSI.
Switzerland boasts the highest citation impact factor, the highest
supercomputing capacity per capita, the most Nobel prizes per capita
(450% of the US value), the highest per-capita income, and perhaps the
best chocolate.
Juergen Schmidhuber, director, IDSIA, 2001
juergen@idsia.ch
http://www.idsia.ch/~juergen