Nonlinear Dynamical Systems: Feedforward Neural Network Perspectives
Considered among the most important structures in the study of neural networks and neural-like networks, feedforward networks incorporating dynamical elements have important properties and are of use in many applications. Specializing in experiential knowledge, a neural network stores and expands its knowledge base via strikingly human routes: through a learning process, and through information storage in interconnection strengths known as synaptic weights.
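The idea that a network's knowledge lives in its synaptic weights can be made concrete with a minimal sketch (an illustration, not an example from the book): a two-layer feedforward network whose behavior is determined entirely by its weight matrices and bias vectors.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Two-layer feedforward pass: all 'knowledge' is in W1, b1, W2, b2."""
    h = np.tanh(W1 @ x + b1)   # hidden layer with nonlinear activation
    return W2 @ h + b2         # linear output layer

# Arbitrary sizes chosen for illustration: 2 inputs, 4 hidden units, 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

y = forward(np.array([0.5, -0.3]), W1, b1, W2, b2)
print(y.shape)
```

Changing the weights changes the map the network computes; learning is precisely the process of adjusting them from data.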
In Nonlinear Dynamical Systems: Feedforward Neural Network Perspectives, six leading authorities describe recent contributions to the development of an analytical basis for the understanding and use of nonlinear dynamical systems of the feedforward type, especially in the areas of control, signal processing, and time series analysis. Opening with an introductory discussion of the different aspects of feedforward neural networks, the book then addresses:
* Classification problems and the related problem of approximating dynamic nonlinear input-output maps
* The development of robust controllers and filters
* The capability of neural networks to approximate functions and dynamic systems with respect to risk-sensitive error
* Segmenting a time series
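The approximation theme in the list above can be illustrated with a hedged sketch (sizes, learning rate, and the target function are arbitrary choices, not taken from the book): a one-hidden-layer network fitted by plain gradient descent to a nonlinear input-output map.

```python
import numpy as np

# Target: a nonlinear map (sin) sampled on a grid.
rng = np.random.default_rng(1)
X = np.linspace(-np.pi, np.pi, 200)[:, None]
Y = np.sin(X)

# One hidden layer of 16 tanh units; small random initial weights.
W1, b1 = rng.normal(scale=0.5, size=(1, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.5, size=(16, 1)), np.zeros(1)

lr = 0.2
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                   # network prediction
    E = P - Y                         # error signal
    # Backpropagate the mean-squared-error gradient through both layers.
    gW2, gb2 = H.T @ E / len(X), E.mean(0)
    dH = (E @ W2.T) * (1 - H**2)      # tanh derivative is 1 - tanh^2
    gW1, gb1 = X.T @ dH / len(X), dH.mean(0)
    for p, g in ((W2, gW2), (b2, gb2), (W1, gW1), (b1, gb1)):
        p -= lr * g

# Recompute the fit error with the final weights.
P = np.tanh(X @ W1 + b1) @ W2 + b2
mse = float(np.mean((P - Y) ** 2))
print(f"final MSE: {mse:.4f}")
```

The fitted network approximates the target map increasingly well as training proceeds, which is the elementary form of the function- and system-approximation results the book develops rigorously.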
It then sheds light on the application of feedforward neural networks to speech processing, summarizing speech-related techniques and reviewing feedforward neural networks from the viewpoint of fundamental design issues. An up-to-date and authoritative look at the ever-widening technical boundaries and influence of neural networks in dynamical systems, this volume is an indispensable resource for researchers in neural networks and a reference staple for libraries.
1 Feedforward Neural Networks: An Introduction (Simon Haykin).
1.1 Supervised Learning.
1.2 Unsupervised Learning.
1.3 Temporal Processing Using Feedforward Networks.
1.4 Concluding Remarks.
2 Uniform Approximation and Nonlinear Network Structures (Irwin W. Sandberg).
2.2 General Structures for Classification.
2.3 Myopic Maps, Neural Network Approximations, and Volterra Series.
2.4 Separation Conditions and Approximation of Discrete-Time and Discrete-Space Systems.
2.5 Concluding Comments.
3 Robust Neural Networks (James T. Lo).
3.3 General Risk-Sensitive Functionals.
3.4 Approximation of Functions by MLPs.
3.5 Approximation of Functions by RBFs.
3.6 Formulation of Risk-Sensitive Identification of Systems.
3.7 Series-Parallel Identification by Artificial Neural Networks (ANNs).
3.8 Parallel Identification of ANNs.
4 Modeling, Segmentation, and Classification of Nonlinear Nonstationary Time Series (Craig L. Fancourt and Jose C. Principe).
4.2 Supervised Sequential Change Detection.
4.3 Unsupervised Sequential Segmentation.
4.4 Memoryless Mixture Models.
4.5 Mixture Models for Processes with Memory.
4.6 Gated Competitive Experts.
4.7 Competitive Temporal Principal Component Analysis.
4.8 Output-Based Gating Algorithms.
4.9 Other Approaches.
5 Application of Feedforward Networks to Speech (Shigeru Katagiri).
5.2 Fundamentals of Speech Signals and Processing Technologies.
5.3 Fundamental Issues of ANN Design.
5.4 Speech Recognition.
5.5 Applications to Other Types of Speech Processing.
5.6 Concluding Remarks.
JAMES T. LO teaches in the Department of Mathematics and Statistics, University of Maryland.
CRAIG L. FANCOURT is a member of the Adaptive Image and Signal Processing Group at the Sarnoff Corp. in Princeton, New Jersey.
JOSE C. PRINCIPE is BellSouth Professor in the Electrical and Computer Engineering Department at the University of Florida, Gainesville.
SHIGERU KATAGIRI leads research on speech and hearing at NTT Communication Science Laboratories, Kyoto, Japan.
SIMON HAYKIN teaches at McMaster University in Hamilton, Ontario, Canada. He has authored or coauthored over a dozen Wiley titles.