C++ Neural Networks and Fuzzy Logic by Valluru B. Rao, M&T Books, IDG Books Worldwide, Inc. ISBN: 1558515526, Pub Date: 06/01/95

- Preface
- Dedication

**Chapter 1—Introduction to Neural Networks**

- Neural Processing
- Neural Network
- Output of a Neuron
- Cash Register Game
- Weights
- Training
- Feedback
- Supervised or Unsupervised Learning
- Noise
- Memory
- Capsule of History
- Neural Network Construction
- Sample Applications
- Qualifying for a Mortgage

- Cooperation and Competition
- Example—A Feed-Forward Network
- Example—A Hopfield Network
- Hamming Distance
- Asynchronous Update
- Binary and Bipolar Inputs
- Bias

- Another Example for the Hopfield Network
- Summary

**Chapter 2—C++ and Object Orientation**

- Introduction to C++
- Encapsulation
- Data Hiding
- Constructors and Destructors as Special Functions of C++
- Dynamic Memory Allocation
- Overloading
- Polymorphism and Polymorphic Functions
- Overloading Operators
- Inheritance
- Derived Classes

- Reuse of Code
- C++ Compilers
- Writing C++ Programs
- Summary

**Chapter 3—A Look at Fuzzy Logic**

- Crisp or Fuzzy Logic?
- Fuzzy Sets
- Fuzzy Set Operations
- Union of Fuzzy Sets
- Intersection and Complement of Two Fuzzy Sets

- Applications of Fuzzy Logic
- Examples of Fuzzy Logic
- Commercial Applications

- Fuzziness in Neural Networks
- Code for the Fuzzifier
- Fuzzy Control Systems
- Neural-Trained Fuzzy Systems
- Summary

**Chapter 4—Constructing a Neural Network**

- First Example for C++ Implementation
- Classes in C++ Implementation

- C++ Program for a Hopfield Network
- Header File for C++ Program for Hopfield Network
- Notes on the Header File Hop.h
- Source Code for the Hopfield Network
- Comments on the C++ Program for Hopfield Network
- Output from the C++ Program for Hopfield Network
- Further Comments on the Program and Its Output

- A New Weight Matrix to Recall More Patterns
- Weight Determination
- Binary to Bipolar Mapping
- Pattern’s Contribution to Weight

- Autoassociative Network
- Orthogonal Bit Patterns
- Network Nodes and Input Patterns
- Second Example for C++ Implementation
- C++ Implementation of Perceptron Network
- Header File
- Implementation of Functions
- Source Code for Perceptron Network
- Comments on Your C++ Program
- Input/Output for percept.cpp

- Network Modeling
- Tic-Tac-Toe Anyone?
- Stability and Plasticity
- Stability for a Neural Network
- Plasticity for a Neural Network

- Short-Term Memory and Long-Term Memory
- Summary

**Chapter 5—A Survey of Neural Network Models**

- Neural Network Models
- Layers in a Neural Network
- Single-Layer Network

- XOR Function and the Perceptron
- Linear Separability
- A Second Look at the XOR Function: Multilayer Perceptron
- Example of the Cube Revisited
- Strategy
- Details
- Performance of the Perceptron

- Other Two-layer Networks
- Many Layer Networks
- Connections Between Layers
- Instar and Outstar
- Weights on Connections
- Initialization of Weights
- A Small Example
- Initializing Weights for Autoassociative Networks
- Weight Initialization for Heteroassociative Networks

- On Center, Off Surround
- Inputs
- Outputs
- The Threshold Function
- The Sigmoid Function
- The Step Function
- The Ramp Function
- Linear Function

- Applications
- Some Neural Network Models
- Adaline and Madaline
- Backpropagation
- Figure for Backpropagation Network
- Bidirectional Associative Memory
- Temporal Associative Memory
- Brain-State-in-a-Box
- Counterpropagation
- Neocognitron
- Adaptive Resonance Theory

- Summary

**Chapter 6—Learning and Training**

- Objective of Learning
- Learning and Training
- Hebb’s Rule
- Delta Rule

- Supervised Learning
- Generalized Delta Rule
- Statistical Training and Simulated Annealing
- Radial Basis-Function Networks

- Unsupervised Networks
- Self-Organization

- Learning Vector Quantizer
- Associative Memory Models and One-Shot Learning
- Learning and Resonance
- Learning and Stability
- Training and Convergence
- Lyapunov Function
- Other Training Issues
- Adaptation
- Generalization Ability

- Summary

**Chapter 7—Backpropagation**

- Feedforward Backpropagation Network
- Mapping
- Layout
- Training

- Illustration: Adjustment of Weights of Connections from a Neuron in the Hidden Layer
- Illustration: Adjustment of Weights of Connections from a Neuron in the Input Layer
- Adjustments to Threshold Values or Biases
- Another Example of Backpropagation Calculations
- Notation and Equations
- Notation
- Equations

- C++ Implementation of a Backpropagation Simulator
- A Brief Tour of How to Use the Simulator
- C++ Classes and Class Hierarchy

- Summary

**Chapter 8—BAM: Bidirectional Associative Memory**

- Introduction
- Inputs and Outputs
- Weights and Training
- Example

- Recall of Vectors
- Continuation of Example
- Special Case—Complements

- C++ Implementation
- Program Details and Flow

- Program Example for BAM
- Header File
- Source File
- Program Output

- Additional Issues
- Unipolar Binary Bidirectional Associative Memory
- Summary

**Chapter 9—FAM: Fuzzy Associative Memory**

- Introduction
- Association
- FAM Neural Network
- Encoding
- Example of Encoding
- Recall

- C++ Implementation
- Program Details
- Header File
- Source File
- Output

- Summary

**Chapter 10—Adaptive Resonance Theory (ART)**

- Introduction
- The Network for ART1
- A Simplified Diagram of Network Layout
- Processing in ART1
- Special Features of the ART1 Model
- Notation for ART1 Calculations
- Algorithm for ART1 Calculations
- Initialization of Parameters
- Equations for ART1 Computations

- Other Models
- C++ Implementation
- A Header File for the C++ Program for the ART1 Model Network
- A Source File for C++ Program for an ART1 Model Network

- Program Output
- Summary

**Chapter 11—The Kohonen Self-Organizing Map**

- Introduction
- Competitive Learning
- Normalization of a Vector

- Lateral Inhibition
- The Mexican Hat Function

- Training Law for the Kohonen Map
- Significance of the Training Law
- The Neighborhood Size and Alpha

- C++ Code for Implementing a Kohonen Map
- The Kohonen Network
- Modeling Lateral Inhibition and Excitation

- Classes to be Used
- Revisiting the Layer Class
- A New Layer Class for a Kohonen Layer

- Implementation of the Kohonen Layer and Kohonen Network
- Flow of the Program and the main() Function
- Flow of the Program

- Results from Running the Kohonen Program
- A Simple First Example
- Orthogonal Input Vectors Example

- Variations and Applications of Kohonen Networks
- Using a Conscience
- LVQ: Learning Vector Quantizer
- Counterpropagation Network
- Application to Speech Recognition

- Summary

**Chapter 12—Application to Pattern Recognition**

- Using the Kohonen Feature Map
- An Example Problem: Character Recognition

- C++ Code Development
- Changes to the Kohonen Program
- Testing the Program
- Generalization versus Memorization
- Adding Characters
- Other Experiments to Try

- Summary

**Chapter 13—Backpropagation II**

- Enhancing the Simulator
- Another Example of Using Backpropagation

- Adding the Momentum Term
- Code Changes

- Adding Noise During Training
- One Other Change—Starting Training from a Saved Weight File
- Trying the Noise and Momentum Features
- Variations of the Backpropagation Algorithm
- Applications
- Summary

**Chapter 14—Application to Financial Forecasting**

- Introduction
- Who Trades with Neural Networks?
- Developing a Forecasting Model
- The Target and the Timeframe
- Domain Expertise
- Gather the Data
- Preprocessing the Data for the Network
- Reduce Dimensionality
- Eliminate Correlated Inputs Where Possible
- Design a Network Architecture
- The Train/Test/Redesign Loop

- Forecasting the S&P 500
- Choosing the Right Outputs and Objective
- Choosing the Right Inputs
- Choosing a Network Architecture

- Preprocessing Data
- A View of the Raw Data
- Highlight Features in the Data
- Normalizing the Range
- The Target
- Storing Data in Different Files

- Training and Testing
- Using the Simulator to Calculate Error
- Only the Beginning
- What’s Next?

- Technical Analysis and Neural Network Preprocessing
- Moving Averages
- Momentum and Rate of Change
- Relative Strength Index
- Percentage R
- Herrick Payoff Index
- MACD
- “Stochastics”
- On-Balance Volume
- Accumulation-Distribution

- What Others Have Reported
- Can a Three-Year-Old Trade Commodities?
- Forecasting Treasury Bill and Treasury Note Yields
- Neural Nets versus Box-Jenkins Time-Series Forecasting
- Neural Nets versus Regression Analysis
- Hierarchical Neural Network
- The Walk-Forward Methodology of Market Prediction
- Dual Confirmation Trading System
- A Turning Point Predictor
- The S&P 500 and Sunspot Predictions
- A Critique of Neural Network Time-Series Forecasting for Trading

- Resource Guide for Neural Networks and Fuzzy Logic in Finance
- Magazines
- Books
- Book Vendors
- Consultants
- Historical Financial Data Vendors
- Preprocessing Tools for Neural Network Development
- Genetic Algorithms Tool Vendors
- Fuzzy Logic Tool Vendors
- Neural Network Development Tool Vendors

- Summary

**Chapter 15—Application to Nonlinear Optimization**

- Introduction
- Neural Networks for Optimization Problems
- Traveling Salesperson Problem
- The TSP in a Nutshell
- Solution via Neural Network
- Example of a Traveling Salesperson Problem for Hand Calculation

- Neural Network for Traveling Salesperson Problem
- Network Choice and Layout
- Inputs
- Activations, Outputs, and Their Updating
- Performance of the Hopfield Network
- C++ Implementation of the Hopfield Network for the Traveling Salesperson Problem
- Source File for Hopfield Network for Traveling Salesperson Problem
- Output from Your C++ Program for the Traveling Salesperson Problem
- Other Approaches to Solve the Traveling Salesperson Problem

- Optimizing a Stock Portfolio
- Tabu Neural Network
- Summary

**Chapter 16—Applications of Fuzzy Logic**

- Introduction
- A Fuzzy Universe of Applications

- Section I: A Look at Fuzzy Databases and Quantification
- Databases and Queries
- Relations in Databases
- Fuzzy Scenarios
- Fuzzy Sets Revisited
- Fuzzy Relations
- Matrix Representation of a Fuzzy Relation
- Properties of Fuzzy Relations
- Similarity Relations
- Resemblance Relations
- Fuzzy Partial Order
- Fuzzy Queries
- Extending Database Models
- Example
- Possibility Distributions
- Example
- Queries

- Fuzzy Events, Means and Variances
- Example: XYZ Company Takeover Price
- Probability of a Fuzzy Event
- Fuzzy Mean and Fuzzy Variance
- Conditional Probability of a Fuzzy Event
- Conditional Fuzzy Mean and Fuzzy Variance

- Linear Regression a la Possibilities
- Fuzzy Numbers
- Triangular Fuzzy Number
- Linear Possibility Regression Model

- Section II: Fuzzy Control
- Designing a Fuzzy Logic Controller
- Step One: Defining Inputs and Outputs for the FLC
- Step Two: Fuzzify the Inputs
- Step Three: Set Up Fuzzy Membership Functions for the Output(s)
- Step Four: Create a Fuzzy Rule Base
- Step Five: Defuzzify the Outputs

- Advantages and Disadvantages of Fuzzy Logic Controllers
- Summary

**Chapter 17—Further Applications**

- Introduction
- Computer Virus Detector
- Mobile Robot Navigation
- A Classifier
- A Two-Stage Network for Radar Pattern Classification
- Crisp and Fuzzy Neural Networks for Handwritten Character Recognition
- Noise Removal with a Discrete Hopfield Network
- Object Identification by Shape
- Detecting Skin Cancer
- EEG Diagnosis
- Time Series Prediction with Recurrent and Nonrecurrent Networks
- Security Alarms
- Circuit Board Faults
- Warranty Claims
- Writing Style Recognition
- Commercial Optical Character Recognition
- ART-EMAP and Object Recognition
- Summary

Copyright © IDG Books Worldwide, Inc.

Authors: Valluru B. Rao, Hayagriva Rao

Year: 1995

Pages: 139