Perceptron-like learning in time-summating neural networks

P C Bressloff and J G Taylor 1992 J. Phys. A: Math. Gen. 25 4373

DOI 10.1088/0305-4470/25/16/014

Published under licence by IOP Publishing Ltd


Abstract

This paper investigates the ability of a single-layer, time-summating neural network to associate and store temporal sequences. In particular, the associative learning of temporal sequences is reformulated as an equivalent classification task involving static patterns. This leads to a generalization of the perceptron learning rule and convergence theorem to the case of temporal sequences. Using geometrical arguments based on linear separability it is shown how a time-summating network can handle temporal features such as ordering and coarticulation effects. Such an ability is a consequence of the fact that the time-summating network develops an activity trace consisting of a decaying sum of all previous inputs to the network. On the other hand, such an activity trace may also lead to an accumulation of errors in the presence of noisy inputs. This motivates a modification of the perceptron learning rule involving the introduction of a stability parameter that guarantees a certain level of robustness to noise. The performance of the network in the presence of random input sequences is then analysed using statistical-mechanical techniques. Finally, it is shown how, with small modifications, the time-summating network can be trained to store and recall complex sequences.
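To make the mechanism described above concrete, the following is a minimal Python sketch of a single time-summating unit trained with a margin-based perceptron rule. All names and parameter values (the trace decay rate lam, the stability margin kappa, the learning rate eta) are illustrative assumptions for exposition, not quantities taken from the paper.

import numpy as np

def train_time_summating_perceptron(sequences, targets,
                                    lam=0.5, kappa=1.0, eta=0.1,
                                    epochs=100, seed=0):
    # sequences: list of arrays of shape (T, n), one input vector per time step
    # targets:   list of arrays of shape (T,) with desired outputs in {-1, +1}
    # lam, kappa, eta: assumed decay rate, stability margin, learning rate
    rng = np.random.default_rng(seed)
    n = sequences[0].shape[1]
    w = rng.normal(scale=0.1, size=n)
    for _ in range(epochs):
        converged = True
        for xs, ys in zip(sequences, targets):
            a = np.zeros(n)              # activity trace of the unit
            for x, y in zip(xs, ys):
                a = lam * a + x          # decaying sum of all previous inputs
                # Update only when the target output is not achieved with
                # margin kappa; kappa > 0 buys robustness to input noise.
                if y * (w @ a) <= kappa:
                    w += eta * y * a
                    converged = False
        if converged:                    # all traces classified with margin
            break
    return w

In this sketch each activity trace a plays the role of a static pattern, so the temporal association task reduces to an ordinary perceptron classification of the traces, which is the reformulation the abstract describes.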
