
Evolution, learning, and cognition

This review volume represents the first attempt to provide a comprehensive overview of this exciting and rapidly evolving development. The book comprises specially commissioned articles by leading researchers in the areas of neural networks and connectionist systems, classifier systems, adaptive net...


Other Authors: Lee, Y. C.
Format: Electronic
Language: English
Published: Singapore ; Teaneck, N.J., USA : World Scientific, ©1988.
Subjects:
Online Access: http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=575367
Similar Items: Print version: Evolution, learning, and cognition.
Table of Contents:
  • PREFACE; CONTENTS; Part One MATHEMATICAL THEORY; Connectionist Learning Through Gradient Following; INTRODUCTION; CONNECTIONIST SYSTEMS; LEARNING; Supervised Learning vs. Associative Reinforcement Learning; FORMAL ASSUMPTIONS AND NOTATION; BACK-PROPAGATION ALGORITHM FOR SUPERVISED LEARNING; Extended Back-Propagation; REINFORCE ALGORITHMS FOR ASSOCIATIVE REINFORCEMENT LEARNING; Extended REINFORCE Algorithms; DISCUSSION; SUMMARY; REFERENCES; Efficient Stochastic Gradient Learning Algorithm for Neural Network; 1 Introduction; 2 Learning as Stochastic Gradient Descents.
  • 3 Convergence Theorems for First Order Schemes; 4 Convergence of the Second Order Schemes; 5 Discussion; References; INFORMATION STORAGE IN FULLY CONNECTED NETWORKS; 1 INTRODUCTION; 1.1 Neural Networks; 1.2 Organisation; 1.3 Notation; 2 THE MODEL OF McCULLOCH-PITTS; 2.1 State-Theoretic Description; 2.2 Associative Memory; 3 THE OUTER-PRODUCT ALGORITHM; 3.1 The Model; 3.2 Storage Capacity; 4 SPECTRAL ALGORITHMS; 4.1 Outer-Products Revisited; 4.2 Constructive Spectral Approaches; 4.3 Basins of Attraction; 4.4 Choice of Eigenvalues; 5 COMPUTER SIMULATIONS; 6 DISCUSSION; A PROPOSITIONS.
  • B OUTER-PRODUCT THEOREMS; C PROOFS OF SPECTRAL THEOREMS; References; NEURONIC EQUATIONS AND THEIR SOLUTIONS; 1. Introduction; 1.1. Reminiscing; 1.2. The 1961 Model; 1.3. Notation; 2. Linear Separable NE; 2.1. Neuronic Equations; 2.2. Polygonal Inequalities; 2.3. Computation of the n-expansion of arbitrary l.s. functions; 2.4. Continuous versus discontinuous behaviour: transitions; 3. General Boolean NE; 3.1. Linearization in tensor space; 3.2. Next-state matrix; 3.3. Normal modes, attractors; 3.4. Synthesis of nets: the inverse problem; 3.5. Separable versus Boolean nets.
  • Connections with spin formalism; References; The Dynamics of Searches Directed by Genetic Algorithms; The Hyperplane Transformation; The Genetic Algorithm as a Hyperplane-Directed Search Procedure; (1) Description of the genetic algorithm; (2) Effects of the S's on the search generated by a genetic algorithm; (3) An Example; References; PROBABILISTIC NEURAL NETWORKS; 1. INTRODUCTION; 2. MODELING THE NOISY NEURON; 2.1. Empirical Properties of Neuron and Synapse; 2.2. Model of Shaw and Vasudevan; 2.3. Model of Little; 2.4. Model of Taylor.
  • 3. NONEQUILIBRIUM STATISTICAL MECHANICS OF LINEAR MODELS; 3.1. Statistical Law of Motion: Markov Chain and Master Equation; 3.2. Entropy Production in the Neural Network; 3.3. Macroscopic Forces and Fluxes; 3.4. Conditions for Thermodynamic Equilibrium; 3.5. Implications for Memory Storage: How Dire?; 4. DYNAMICAL PROPERTIES OF NONLINEAR MODELS; 4.1. Views of Statistical Dynamics; 4.2. Multineuron Interactions, Revisited; 4.3. Cognitive Aspects of the Taylor Model; 4.4. Noisy RAMs and Noisy Nets; 5. THE END OF THE BEGINNING; ACKNOWLEDGMENTS; APPENDIX. TRANSITION PROBABILITIES IN 2-NEURON NETWORKS.
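
Note on the contents: the outer-product storage rule named in the chapter "INFORMATION STORAGE IN FULLY CONNECTED NETWORKS" can be summarized in a few lines. The sketch below is a minimal illustration only, not taken from the book: it assumes a Hopfield-style network of ±1 McCulloch-Pitts units, and the names store, recall, and probe are ours.

    # Minimal sketch of outer-product (Hebbian) storage and
    # McCulloch-Pitts recall dynamics; illustrative, not from the book.
    import numpy as np

    def store(patterns):
        # Weights are the normalized sum of outer products of the
        # +/-1 patterns, with self-connections zeroed out.
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, steps=20):
        # Synchronous threshold updates s <- sign(W s); stop at a
        # fixed point, which acts as a stored-memory attractor.
        s = probe.copy()
        for _ in range(steps):
            s_next = np.where(W @ s >= 0, 1, -1)
            if np.array_equal(s_next, s):
                break
            s = s_next
        return s

    # Usage: store two random 64-unit patterns, corrupt one, recall it.
    rng = np.random.default_rng(0)
    pats = rng.choice([-1, 1], size=(2, 64))
    W = store(pats)
    noisy = pats[0].copy()
    noisy[:8] *= -1  # flip 8 of the 64 units
    print(np.array_equal(recall(W, noisy), pats[0]))

With only two patterns stored in 64 units, the probe sits well inside the basin of attraction of the stored pattern, which is the regime the chapter's storage-capacity and basins-of-attraction sections analyze.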