Abstract
We report numerical studies of the "memory-loss" phase transition in Hopfield-like symmetric neural networks in which each neuron is connected to all other neurons within a local neighborhood (dense, short-range connectivity). The number of connections per neuron K scales as the number of neurons N raised to a power less than 1 (i.e., K ∼ N^η, η < 1). We use the recently developed Lee-Kosterlitz finite-size scaling technique to determine the critical value of η below which the first-order phase transition disappears.
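As an illustration of the model class described above, the following is a minimal sketch (not the paper's actual simulation code; all function names and parameter values are illustrative assumptions) of a Hopfield network on a one-dimensional ring in which each neuron couples, via symmetric Hebbian weights, only to neighbors within a local window of size K ∼ N^η:

```python
import numpy as np

def hebbian_local_weights(patterns, K):
    """Hebbian couplings J_ij, zeroed outside the local neighborhood.

    Hypothetical sketch: each neuron on a ring of N sites couples only
    to neurons within ring distance K/2, giving ~K connections each.
    """
    P, N = patterns.shape
    J = patterns.T @ patterns / N          # standard Hebb rule; J is symmetric
    np.fill_diagonal(J, 0.0)               # no self-coupling
    i = np.arange(N)
    dist = np.abs(i[:, None] - i[None, :])
    dist = np.minimum(dist, N - dist)      # periodic (ring) distance
    J[dist > K // 2] = 0.0                 # keep only short-range couplings
    return J

def recall(J, s, sweeps=10):
    """Asynchronous zero-temperature dynamics: align each spin with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for idx in np.random.permutation(len(s)):
            h = J[idx] @ s                  # local field on neuron idx
            s[idx] = 1 if h >= 0 else -1
    return s

rng = np.random.default_rng(0)
N, eta = 200, 0.8                           # illustrative sizes, not from the paper
K = int(N ** eta)                           # K ~ N^eta connections per neuron
patterns = rng.choice([-1, 1], size=(3, N))
J = hebbian_local_weights(patterns, K)
recovered = recall(J, patterns[0])
print("overlap with stored pattern:", np.mean(recovered == patterns[0]))
```

With only a few stored patterns and K well above the loading limit, a stored pattern should remain (close to) a fixed point of the dynamics; the memory-loss transition studied in the paper concerns how this retrieval behavior degrades as η is lowered.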
Received 25 October 1991
DOI: https://doi.org/10.1103/PhysRevA.45.6135
©1992 American Physical Society