
Neural networks in pattern recognition and their applications

The revitalization of neural network research in the past few years has already had a great impact on research and development in pattern recognition and artificial intelligence. Although neural network functions are not limited to pattern recognition, there is no doubt that renewed progress in pa...


Other Authors: Chen, C. H., 1937-
Format: Electronic
Language: English
Published: Singapore ; River Edge, N.J. : World Scientific, ©1991.
Subjects:
Online Access: http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=532654
Similar Items: Print version: Neural networks in pattern recognition and their applications.
Contents:
  • Introduction; Contents.
  • Combined Neural-Net/Knowledge-Based Adaptive Systems for Large Scale Dynamic Control: 1. Introduction; 2. Complex Systems; 3. The Dynamics of Supervised Learning; 4. Analysis of Effect of Hint Functions; 5. Automatic Generation of Constraints (Hints); 6. General Pattern Classifier; 7. Network Formation during Training; 8. Example NPSN Design Cycle; References.
  • A Connectionist Incremental Expert System Combining Production Systems and Associative Memory: 1. Introduction; 2. Memory Representation of the System; 3. Analysis and Generalization of the Perceptron Algorithm; 3.1. Learning Algorithm for Uncertain Samples; 3.2. A Learning Algorithm with Dynamic Network Structure; 4. Control and Development of the System; 5. Experiments with an Animal Identification System; 6. Conclusions; Acknowledgement; References.
  • Optimal Hidden Units for Two-Layer Nonlinear Feedforward Neural Networks: 1. Introduction; 2. Theory; 3. Training; 4. Examples; 5. Continuous Case; 6. Discussion; Acknowledgements; References.
  • An Incremental Fine Adjustment Algorithm for the Design of Optimal Interpolating Neural Networks: 1. Introduction; 2. Stage 1: Incremental Acquisition of Optimal Network Configuration; 3. Stage 2: Iterative Fine Adjustment of Weight Matrices; 4. Experimental Results; 4.1. Part I: Results on Synthetic Data; 4.2. Part II: Results on Real Data; 5. Conclusion; References; Appendix: A Summary of the Incremental Network Configuration Acquisition Algorithm.
  • On the Asymptotic Properties of Recurrent Neural Networks for Optimization: 1. Introduction; 2. Problem Formulation; 3. Asymptotic Analysis; 3.1. Basic Components of Recurrent Neural Networks; 3.2. Asymptotic Stability of Activation States; 3.3. Feasibility of Generated Solutions; 3.4. Optimality of Generated Solutions; 4. Network Configuration; 4.1. Activation Function of Recurrent Neural Networks; 4.2. Aggregation Function of Recurrent Neural Networks; 4.3. Penalization Function of Recurrent Neural Networks; 5. Illustrative Examples; 6. Conclusion; References.
  • A Real-Time Image Segmentation System Using a Connectionist Classifier Architecture: 1. Introduction; 2. Segmentation Engine Architecture; 2.1. Feature Extraction; 2.2. Connectionist Classifier; 3. Parameter Adaptation; 3.1. Feature Selection; 3.2. Network Adaptation; 4. Experimental Results; 4.1. Combustion Chamber Images; 4.2. Printed Circuit Board Images; 4.3. Texture Images; 5. Conclusions; Acknowledgement; References.
  • Segmentation of Ultrasonic Images with Neural Networks: 1. Introduction; 2. Methods; 3. Results; 4. Conclusion; References.
  • Connectionist Model Binarization: 1. Introduction; 2. Image Binarization and Its Problem; 3. Connectionist Model Binarization; 3.1. Outline; 3.2. Network; 3.3. Learning Stage; 3.4. Translating Stage; 4. Experimental Results and Discussion; 4.1. Learning Properties; 4.2. Binarization Performance; 4.3. Network Structure.