Interactive Neural Network Book
The interactive book "Neural and Adaptive Systems: Fundamentals Through Simulations" (ISBN: 0471351679) by Principe, Euliano, and Lefebvre has been published by John Wiley and Sons and is available for purchase directly through Amazon.com. Our enthusiasm for this book is best expressed by the response of our readers.
This paperback book combines the hypertext and searching capabilities of the Windows help system with the highly graphical simulation environment of NeuroSolutions to produce a revolutionary teaching tool. The book contains over 200 interactive experiments in NeuroSolutions to elucidate the fundamentals of neural networks and adaptive systems.
The inclusion of interactive experiments allows the book to present key concepts without the use of complex equations. The lead author, Dr. Jose Principe, is director of the Computational NeuroEngineering Laboratory at the University of Florida. He is currently using this book as the text for an undergraduate course in neural networks. Previously this course could not be taught at the undergraduate level because of the complex mathematical background that was required.
Included in the full evaluation version of NeuroSolutions is the entire first chapter of the book with examples. For a preview of the book without the experiments, the first chapter is also available in HTML. Remember, though, that the experiments are fundamental to the presentation and understanding of the topics; downloading the full evaluation version of NeuroSolutions provides a much better preview of the book.
Neural and Adaptive Systems: Fundamentals Through Simulations
Jose C. Principe
Neil R. Euliano
W. Curt Lefebvre
Copyright 2000, John Wiley and Sons
Table of Contents
Chapter I  Data Fitting with Linear Models
1. Introduction
2. Linear Models
3. Least Squares
4. Adaptive Linear System
5. Estimation of the Gradient: The LMS Algorithm
6. A Methodology for Stable Adaptation
7. Regression for Multiple Variables
8. Newton's Method
9. Analytic versus Iterative Solutions
10. The Linear Regression Model
11. Conclusions
Chapter II  Pattern Recognition
1. The Pattern Recognition Problem
2. Parametric Classifiers
3. Linear and Nonlinear Classifier Machines
4. Methods of Training Parametric Classifiers
5. Conclusions
Chapter III  Multilayer Perceptrons
1. Artificial Neural Networks
2. Pattern Recognition Ability of the McCulloch-Pitts PE
3. The Perceptron
4. One Hidden Layer Multilayer Perceptrons
5. MLPs with Two Hidden Layers
6. Training Static Networks with the Backpropagation Procedure
7. Training Embedded Adaptive Systems
8. MLPs as Optimal Classifiers
9. Conclusions
Chapter IV  Designing and Training MLPs
1. Introduction
2. Controlling Learning in Practice
3. Other Search Procedures
4. Stop Criteria
5. How Good are MLPs as Learning Machines?
6. Error Criterion
7. Network Size and Generalization
8. Project: Application of the MLP to Real-World Data
9. Conclusion
Chapter V  Function Approximation with MLPs and Radial Basis Functions
1. Introduction
2. Function Approximation
3. Choices for the Elementary Functions
4. Probabilistic Interpretation of the Mappings: Nonlinear Regression
5. Training Neural Networks for Function Approximation
6. How to Select the Number of Bases
7. Applications of Radial Basis Functions
8. Support Vector Machines
9. Project: Applications of Neural Networks as Function Approximators
10. Conclusion
Chapter VI  Hebbian Learning and Principal Component Analysis
1. Introduction
2. Effect of the Hebbian Update
3. Oja's Rule
4. Principal Component Analysis
5. Anti-Hebbian Learning
6. Estimating Cross-correlation with Hebbian Networks
7. Novelty Filters and Lateral Inhibition
8. Linear Associative Memories (LAMs)
9. LMS Learning as a Combination of Hebbian Rules
10. Autoassociation
11. Nonlinear Associative Memories
12. Project: Use of Hebbian Networks for Data Compression and Associative Memories
13. Conclusions
Chapter VII  Competitive and Kohonen Networks
1. Introduction
2. Competition and Winner-Take-All Networks
3. Competitive Learning
4. Clustering
5. Improving Competitive Learning
6. Soft Competition
7. Kohonen Self-Organizing Map
8. Creating Classifiers from Competitive Networks
9. Adaptive Resonance Theory (ART)
10. Modular Networks
11. Conclusions
Chapter VIII  Principles of Digital Signal Processing
1. Time Series and Computers
2. Vectors and Discrete Signals
3. The Concept of Filtering
4. Time Domain Analysis of Linear Systems
5. Recurrent Systems and Stability
6. Frequency Domain Analysis
7. The Z-Transform and the System Transfer Function
8. The Frequency Response
9. Frequency Response and Poles and Zeros
10. Types of Linear Filters
11. Project: Design of Digital Filters
12. Conclusions
Chapter IX  Adaptive Filters
1. Introduction
2. The Adaptive Linear Combiner and Linear Regression
3. Optimum Filter Weights
4. Properties of the Adaptive Solution
5. Hebbian Networks for Time Processing
6. Applications of the Adaptive Linear Combiner
7. Applications of Temporal PCA Networks
8. Conclusions
Chapter X  Temporal Processing with Neural Networks
1. Static versus Dynamic Systems
2. Extracting Information in Time
3. The Focused Time Delay Neural Network (TDNN)
4. The Memory PE
5. The Memory Filter
6. Design of the Memory Space
7. The Gamma Memory PE
8. Time Lagged Feedforward Networks
9. Focused TLFNs Built from RBFs
10. Project: Iterative Prediction of Chaotic Time Series
11. Conclusions
Chapter XI  Training and Using Recurrent Networks
1. Introduction
2. Simple Recurrent Topologies
3. Adapting the Feedback Parameter
4. Unfolding Recurrent Networks in Time
5. The Distributed TLFN Topology
6. Dynamic Systems
7. Recurrent Neural Networks
8. Learning Paradigms for Recurrent Systems
9. Applications of Dynamic Networks to System Identification and Control
10. Hopfield Networks
11. Grossberg's Additive Model
12. Beyond First-Order Dynamics: Freeman's Model
13. Conclusions
Appendix A  Elements of Linear Algebra and Pattern Recognition
1. Introduction
2. Vectors: Concepts and Definitions
3. Matrices: Concepts and Definitions
4. Random Vectors
5. Conclusions
Appendix B  NeuroSolutions Tutorial
1. Introduction to NeuroSolutions
2. Introduction to Interactive Examples
3. The Fundamentals of NeuroSolutions
4. Using Probes in NeuroSolutions
5. Providing Input to Your Networks
6. Training a Network
7. Summary
