At a given time, the "next" symbol emitted by a very simple stochastic "language generator" was not precisely predictable from the "previous" symbol, but it belonged to a certain class defined by the previous symbol.
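The kind of generator described above can be sketched as follows. This is a hypothetical illustration, not the paper's actual grammar: the symbol names and the class table are invented; the only property taken from the text is that each symbol determines a class from which the next symbol is drawn at random.

```python
import random

# Invented class table for illustration: after each symbol, the next
# symbol is drawn uniformly from the class that symbol defines, so the
# class (but not the exact symbol) is predictable from the previous one.
CLASSES = {
    "a": ["x", "y"],           # after "a": one of {"x", "y"}
    "x": ["b"], "y": ["b"],    # after "x" or "y": always "b"
    "b": ["p", "q", "r"],      # after "b": one of {"p", "q", "r"}
    "p": ["a"], "q": ["a"], "r": ["a"],
}

def generate(length, start="a", rng=random):
    """Emit a symbol sequence from the stochastic generator."""
    symbols = [start]
    for _ in range(length - 1):
        symbols.append(rng.choice(CLASSES[symbols[-1]]))
    return symbols
```

A training pair at time t would then consist of the "previous" symbol `symbols[t-1]` and the "next" symbol `symbols[t]`.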
During training, at a given time one network saw the previous symbol while the other saw the next symbol. The former minimized (3); the latter minimized (2), with targets defined according to equations (5) and (6).
Ten test runs, each with 15,000 training iterations, were conducted.
The system always learned to emit distinct localized representations in response to members of the predictable classes, while superfluous output units remained switched off.