CONTRIBUTED ARTICLE

Informational Capacity and Recall Quality in Sparsely Encoded Hopfield-like Neural Network: Analytical Approaches and Computer Simulation
Section snippets
INTRODUCTION
Several estimates of the informational capacity and recall quality of sparsely encoded Hopfield-like associative memory have been made by Amari (1989), Buhmann et al. (1989), Horner (1989), Perez-Vicente and Amit (1989), Frolov et al. (1991), and others. Encoding is called sparse if the number of active neurons n in the stored patterns is small compared with the total number of neurons N in the network. It was shown that sparseness results in an increase in the informational capacity of the network not only …
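Sparse encoding in this sense can be illustrated with a short sketch; the parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000  # total number of neurons in the network
n = 50    # active neurons per stored pattern (sparse: n << N)
M = 20    # number of stored prototypes

# Each prototype is a binary vector with exactly n ones out of N.
patterns = np.zeros((M, N), dtype=int)
for mu in range(M):
    patterns[mu, rng.choice(N, size=n, replace=False)] = 1

sparseness = n / N  # activity level p = n/N, here 0.05
```

Here every stored pattern activates only 5% of the neurons, the regime in which the capacity results discussed above apply.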
MODEL DESCRIPTION
There are two main phases of neural network operation: learning (encoding) and recall (decoding).
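A minimal sketch of the two phases for a sparsely encoded binary network follows. The Hebbian covariance learning rule and the n-winners-take-all recall dynamics are common choices for sparse associative memories, assumed here for illustration; the paper's exact rules may differ:

```python
import numpy as np

rng = np.random.default_rng(1)
N, n, M = 400, 20, 10   # illustrative sizes, not from the paper
p = n / N               # sparseness (activity level)

# Prototypes: M sparse binary patterns with n active neurons each.
X = np.zeros((M, N))
for mu in range(M):
    X[mu, rng.choice(N, size=n, replace=False)] = 1

# Learning (encoding) phase: Hebbian covariance rule (an assumed,
# commonly used rule for sparse coding), no self-connections.
J = (X - p).T @ (X - p) / N
np.fill_diagonal(J, 0.0)

def recall(x, steps=20):
    """Recall (decoding) phase: synchronous dynamics in which the
    n neurons with the largest postsynaptic potential fire."""
    for _ in range(steps):
        h = J @ x
        x_new = np.zeros(N)
        x_new[np.argsort(h)[-n:]] = 1.0
        if np.array_equal(x_new, x):  # fixed point reached
            return x_new
        x = x_new
    return x

# Recall prototype 0 from a corrupted cue with 4 active bits erased.
x0 = X[0].copy()
x0[rng.choice(np.flatnonzero(x0 == 1), size=4, replace=False)] = 0
restored = recall(x0)
```

At these sizes the corrupted cue falls back onto the stored prototype in a step or two, which is the behaviour whose limits (capacity and recall quality) the paper quantifies.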
Single-step (SS) approximation
This approximation is used to analyse the neural network dynamics in the synchronous mode. It was proposed by Kinzel (1985) for the case p = 0.5 and for bipolar neuron activities (Xi ∈ {−1, 1}). It has been shown by other analytical techniques (Amit et al., 1987; Amari and Maginu, 1988) and by computer simulation (Hopfield, 1982; Amit et al., 1987; Kohring, 1990) that the single-step approximation is very inaccurate for this case. That is why it has not been applied as a technique …
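For the bipolar p = 0.5 case treated by Kinzel (1985), the single-step approximation reduces to the standard one-step signal-to-noise estimate of the overlap m_t between the network state and the recalled prototype, reproduced here as a sketch:

```latex
% Single-step recursion for the overlap m_t at loading rate \alpha = M/N:
% the crosstalk noise is treated as Gaussian and uncorrelated in time.
m_{t+1} = \operatorname{erf}\!\left(\frac{m_t}{\sqrt{2\alpha}}\right)
```

Its inaccuracy beyond the first update step stems from neglecting the temporal correlations of the crosstalk noise, which is what the more refined treatments cited above take into account.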
Informational capacity
To calculate αcr by computer simulation, we evaluated the probability P that a given prototype has a stable state in its vicinity. To do this, following Amit et al. (1987), Kohring (1990) and Husek and Frolov (1994), we used the prototypes themselves as initial network states. Only synchronous dynamics were analysed in the computer simulation. The recall process stops when the activity reaches a fixed point or a cycle of length 2 to 10, or when the number of time steps exceeds 100. …
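The stopping rule described above (fixed point, short cycle, or step limit) can be sketched as a generic driver for any deterministic synchronous update function; the function and parameter names are ours, not the paper's:

```python
import numpy as np

def run_until_stop(step, x0, max_steps=100, max_cycle=10):
    """Iterate synchronous dynamics from x0 until a fixed point,
    a cycle of length 2..max_cycle, or max_steps time steps.
    `step` maps a network state to the next state."""
    history = [x0]
    for t in range(max_steps):
        x = step(history[-1])
        # Fixed point: state unchanged by one update.
        if np.array_equal(x, history[-1]):
            return x, "fixed point", t + 1
        # Cycle: state seen L steps ago (deterministic dynamics
        # then repeats with period L).
        for L in range(2, max_cycle + 1):
            if len(history) >= L and np.array_equal(x, history[-L]):
                return x, f"cycle of length {L}", t + 1
        history.append(x)
    return history[-1], "step limit", max_steps
```

For example, the toy dynamics `step = lambda x: -x` terminates after two updates with a cycle of length 2, while a monotone map settles into a fixed point.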
CONCLUDING REMARKS
In the above study we investigated, analytically and by computer simulation, the informational capacity αcr and recall quality mf of a sparsely encoded Hopfield-like auto-associative memory. The results of the computer simulation, extrapolated to the limit N → ∞, were in good agreement with those obtained by the replica method (RM) and by the method of statistical neurodynamics (SN). This shows that these techniques are sufficiently accurate and that the extrapolation formulae are applicable to …
Acknowledgements
This work was carried out under grant No. 201/94/0729 from the Grant Agency of the Czech Republic, Prague, and grant No. 94-04-11771 from the Russian Foundation for Fundamental Research.
References (17)

- Characteristics of sparsely encoded associative memory. Neural Networks (1989).
- Statistical neurodynamics of associative memory. Neural Networks (1988).
- Statistical mechanics of neural networks near saturation. Annals of Physics (1987).
- Statistical analysis of the dynamics of a sparse associative memory. Neural Networks (1992).
- Associative memory with high information content. Physical Review A (1989).
- Informational characteristics of neural networks capable of associative learning based on Hebbian plasticity. Network (1993).
- Imitation model of associative memory as a neuron network with low activity level. Biofizika (1991).
- The space of interactions in neural network models. Journal of Physics A (1988).
Cited by (26)

- Design and analysis of a noise-suppression zeroing neural network approach for robust synchronization of chaotic systems. Neurocomputing (2021).
- Building a world model with structure-sensitive sparse binary distributed representations. Biologically Inspired Cognitive Architectures (2013).
- Attractor neural networks with patchy connectivity. Neurocomputing (2006).
- Design and Analysis of a Novel Distributed Gradient Neural Network for Solving Consensus Problems in a Predefined Time. IEEE Transactions on Neural Networks and Learning Systems (2024).
- Analysis of oscillating processes in spiking neural networks. European Physical Journal: Special Topics (2023).