Neural Networks

Volume 10, Issue 5, July 1997, Pages 845-855

CONTRIBUTED ARTICLE
Informational Capacity and Recall Quality in Sparsely Encoded Hopfield-like Neural Network: Analytical Approaches and Computer Simulation

https://doi.org/10.1016/S0893-6080(96)00122-0

Abstract

A sparsely encoded Hopfield-like attractor neural network is investigated analytically and by computer simulation. Informational capacity and recall quality are evaluated. Three analytical approaches are used: the replica method (RM), the method of statistical neurodynamics (SN), and the single-step approximation (SS). Computer simulation confirmed the good accuracy of RM and SN at all levels of network activity; SS is accurate only for large sparseness. It is shown that informational capacity increases monotonically as sparseness increases, while recall quality changes nonmonotonically: it first decreases and then increases. Computer simulation also revealed the main features of network behaviour near saturation that are not predicted by the analytical approaches used. © 1997 Elsevier Science Ltd.

Section snippets

INTRODUCTION

Estimates of the informational capacity and recall quality of sparsely encoded Hopfield-like associative memory have been made by Amari (1989), Buhmann et al. (1989), Horner (1989), Perez-Vicente and Amit (1989), Frolov et al. (1991) and others. Encoding is called sparse if the number of active neurons n in the stored patterns is small compared with the total number of neurons N in the network. It was shown that sparseness results in an increase of the informational capacity of the network not only
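The trade-off behind this result can be illustrated numerically: a sparser pattern carries fewer bits, yet far more patterns can be stored. A minimal sketch (illustrative only, not taken from the paper) computes the Shannon information carried by one random binary pattern with activity level p = n/N:

```python
import numpy as np

def pattern_information_bits(N, p):
    """Shannon information (in bits) carried by one random binary
    pattern of N neurons, each active with probability p: N times
    the binary entropy H(p)."""
    H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return N * H

# As p falls, each pattern carries fewer bits, so any growth of total
# informational capacity with sparseness must come from storing many
# more patterns.
dense = pattern_information_bits(1000, 0.5)    # 1000 bits
sparse = pattern_information_bits(1000, 0.05)  # ~286 bits
```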

MODEL DESCRIPTION

There are two main phases of neural network operation: learning (encoding) and recall (decoding).
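These two phases can be sketched for a sparsely encoded network. The covariance-type learning rule and the zero threshold below are illustrative assumptions for the sketch, not the paper's exact choices:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p, M = 400, 0.1, 5  # neurons, activity level, stored patterns

# Learning (encoding): store M sparse binary prototypes with a
# covariance-type Hebbian rule (an illustrative choice).
xi = (rng.random((M, N)) < p).astype(float)
J = (xi - p).T @ (xi - p) / N
np.fill_diagonal(J, 0.0)  # no self-connections

def recall_step(x, theta=0.0):
    """Recall (decoding): one synchronous update of all neurons,
    thresholding the postsynaptic field J @ x."""
    return (J @ x > theta).astype(float)

# Far below saturation, one update from a stored prototype should
# leave it nearly unchanged.
agreement = np.mean(recall_step(xi[0]) == xi[0])
```

With this load (M/N = 0.0125) the crosstalk noise is small, so the agreement with the stored prototype stays close to 1.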

Single-step (SS) approximation

This approximation is used for the analysis of the neural network dynamics in the synchronous mode. It was proposed by Kinzel (1985) for the case p = 0.5 and for bipolar neuron activities (Xi ∈ {−1, 1}). It has been shown by other analytical techniques (Amit et al., 1987; Amari and Maginu, 1988) and by computer simulation (Hopfield, 1982; Amit et al., 1987; Kohring, 1990) that the single-step approximation is very inaccurate in this case. That is why it has not been applied as a technique
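For the bipolar p = 0.5 case mentioned here, the single-step idea can be sketched as follows (a textbook-style reconstruction, not the paper's own derivation): treating the crosstalk from the other stored patterns as Gaussian noise of variance α = M/N, the expected overlap m₁ with the recalled prototype after one synchronous step from initial overlap m₀ is

```latex
m_1 = \operatorname{erf}\!\left(\frac{m_0}{\sqrt{2\alpha}}\right),
\qquad \alpha = \frac{M}{N}.
```

Treating every subsequent step as if it were the first ignores the correlations that build up between the noise and the network state, which is the usual explanation for the inaccuracy of the approximation in the dense (p = 0.5) regime.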

Informational capacity

To calculate αcr by computer simulation, we evaluated the probability P that a given prototype has a stable state in its vicinity. To do this, following Amit et al. (1987), Kohring (1990) and Husek and Frolov (1994), we used the prototypes themselves as initial network states. Only synchronous dynamics were analysed in the computer simulation. The recall process stops when the activity reaches a fixed point or a cycle of length 2 to 10, or when the number of time steps exceeds 100. A
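The stopping rule described here (fixed point, cycle of length 2 to 10, or 100 time steps) can be sketched directly; `step` stands for any synchronous update function:

```python
import numpy as np

def run_recall(step, x0, max_steps=100, max_cycle=10):
    """Iterate synchronous dynamics from x0, stopping at a fixed point,
    at a cycle of length 2..max_cycle, or after max_steps updates."""
    history = [np.asarray(x0)]
    for _ in range(max_steps):
        x = step(history[-1])
        # Fixed point: the new state repeats the previous one.
        if np.array_equal(x, history[-1]):
            return x, "fixed point"
        # A cycle of length k means the new state matches the state
        # seen k steps earlier.
        for k in range(2, min(max_cycle, len(history)) + 1):
            if np.array_equal(x, history[-k]):
                return x, f"cycle of length {k}"
        history.append(x)
    return history[-1], "step limit exceeded"
```

For instance, sign-flip dynamics on a bipolar state terminates as a cycle of length 2, while the identity map is detected as a fixed point.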

CONCLUDING REMARKS

In the above study we investigated, analytically and by computer simulation, the informational capacity αcr and recall quality mf of a sparsely encoded Hopfield-like auto-associative memory. The results of the computer simulation, extrapolated to the case N → ∞, were in good agreement with those obtained by the replica method (RM) and by the method of statistical neurodynamics (SN). This shows that these techniques are sufficiently accurate and that the extrapolation formulae apply to

Acknowledgements

This work was carried out under grant No. 201/94/0729 from the Grant Agency of the Czech Republic, Prague, and grant No. 94-04-11771 from the Russian Foundation for Fundamental Research.
