Abstract
We analyze a discrete-time neural network with continuous state variables updated in parallel. We show that for symmetric connections, the only attractors are fixed points and period-two limit cycles. We also present a global stability criterion which guarantees only fixed-point attractors by placing limits on the gain (maximum slope) of the sigmoid nonlinearity. The iterated-map network has the same fixed points as a continuous-time analog electronic neural network and converges to an attractor after a small number of iterations of the map.
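The dynamics described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's own code: it iterates the parallel update x(t+1) = tanh(T x(t)) for a two-neuron network with a symmetric connection matrix T (the specific matrices, initial condition, and the reading of the gain criterion as "gain times the most negative eigenvalue of T below one" are assumptions for illustration). With strong coupling the trajectory settles onto a period-two limit cycle; with the coupling scaled down, only a fixed point remains.

```python
import numpy as np

def iterate(T, x0, steps=60):
    """Synchronous (parallel) update x(t+1) = tanh(T x(t))."""
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        xs.append(np.tanh(T @ xs[-1]))
    return xs

x0 = [0.5, 0.3]  # arbitrary initial state

# Symmetric antiferromagnetic coupling; eigenvalues are +/-2,
# so gain * |most negative eigenvalue| = 2 > 1.
T_strong = np.array([[0.0, -2.0], [-2.0, 0.0]])
xs = iterate(T_strong, x0)
is_fixed = np.allclose(xs[-1], xs[-2], atol=1e-10)
is_period_two = np.allclose(xs[-1], xs[-3], atol=1e-10)
print(is_fixed, is_period_two)  # -> False True (period-two limit cycle)

# Scaling T down so eigenvalues are +/-0.4 puts the gain below the
# (assumed) stability bound; the map contracts to a fixed point.
T_weak = 0.2 * T_strong
ys = iterate(T_weak, x0)
print(np.allclose(ys[-1], ys[-2], atol=1e-10))  # -> True (fixed point)
```

Note that both runs land on an attractor within a few dozen synchronous updates, consistent with the abstract's remark that the map converges after a small number of iterations.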
Received 27 December 1988
DOI: https://doi.org/10.1103/PhysRevA.40.501
©1989 American Physical Society