Abstract
An overview of basic results in the complexity theory of discrete neural computation is presented. In particular, the computational power and efficiency of single neurons, neural circuits, symmetric neural networks (the Hopfield model), and Boltzmann machines are investigated and characterized. Corresponding intractability results are mentioned as well. Evidence is presented for why discrete neural networks (including Boltzmann machines) should not be expected to solve intractable problems more efficiently than conventional models of computation.
This work was finished while the author was visiting the Department of Computer Science, University of Saarland, West Germany (Spring 1990). During this stay the research was partially supported by the ESPRIT II Basic Research Action Program of the EC under contract No. 3075 (Project Alcom).
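The symmetric networks mentioned in the abstract can be illustrated with a minimal sketch (an illustration of the general model, not code from the paper): a Hopfield-style network of discrete threshold units with a symmetric weight matrix, where each asynchronous update never increases the energy E(s) = -½ sᵀWs + θᵀs, so the dynamics must converge to a stable state. The weights and thresholds below are arbitrary example values.

```python
import numpy as np

def energy(W, theta, s):
    """Hopfield energy E(s) = -1/2 s^T W s + theta^T s."""
    return -0.5 * s @ W @ s + theta @ s

def run_to_stable(W, theta, s):
    """Asynchronously apply threshold updates until no unit changes.

    Each unit i recomputes its state as sign(W[i] @ s - theta[i]);
    with symmetric W and zero diagonal, every flip strictly lowers
    the energy, which guarantees termination.
    """
    s = s.copy()
    changed = True
    while changed:
        changed = False
        for i in range(len(s)):
            new = 1 if W[i] @ s - theta[i] >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
    return s

# Tiny example: two mutually excitatory units settle into agreement.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # symmetric, zero diagonal
theta = np.zeros(2)
start = np.array([1, -1])
stable = run_to_stable(W, theta, start)
```

Here the initial state `[1, -1]` is unstable, and the network relaxes to a state where both units agree, with energy no higher than at the start.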
Copyright information
© 1990 Springer-Verlag Berlin Heidelberg
Cite this paper
Wiedermann, J. (1990). Complexity issues in discrete neurocomputing. In: Dassow, J., Kelemen, J. (eds) Aspects and Prospects of Theoretical Computer Science. IMYCS 1990. Lecture Notes in Computer Science, vol 464. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-53414-8_32
Print ISBN: 978-3-540-53414-3
Online ISBN: 978-3-540-46869-1