
C++ Neural Networks and Fuzzy Logic
by Valluru B. Rao
M&T Books, IDG Books Worldwide, Inc.
ISBN: 1558515526   Pub Date: 06/01/95
  



Chapter 17
Further Applications

Introduction

In this chapter, we present the outlines of some applications of neural networks and fuzzy logic. Most of these applications fall into a few main categories according to the paradigms they are based on. We offer a sampling of research topics found in the current literature, but there are literally thousands of applications of neural networks and fuzzy logic in science, technology, and business, with more being added as time goes on.

Some applications of neural networks are for adaptive control. Many of these also benefit from the addition of fuzziness; steering a car or backing up a truck with a fuzzy controller is an example. A large number of applications are based on the backpropagation training model. Another category of applications deals with classification. Some applications based on expert systems are augmented with a neural network approach; decision support systems are sometimes designed this way. Yet another category is made up of optimizers, whose purpose is to find the maximum or the minimum of a function.


NOTE:  You will find other neural network applications related to finance presented toward the end of Chapter 14.

Computer Virus Detector

IBM Corporation has applied neural networks to the problem of detecting and eradicating computer viruses. IBM’s AntiVirus program detects and eradicates new viruses automatically. It works on boot-sector viruses and keys off the stereotypical behaviors that viruses usually exhibit. The feedforward backpropagation neural network was used in this application. Newly discovered viruses are added to the training set for later versions of the program to make them “smarter.” The system was modeled after knowledge about the human immune system: IBM uses a decoy program to “attract” a potential virus, rather than let the virus attack the user’s files. These decoy programs are then immediately tested for infection. If the decoy program’s behavior suggests that it has been infected, the virus is detected in that program and removed wherever it is found.
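The book does not give IBM’s actual network or feature set, but the idea of scoring a decoy program’s monitored behavior with a trained feedforward network can be sketched as follows. This is a minimal illustration: the feature names, layer sizes, and function names are assumptions, and the weights would come from backpropagation training done elsewhere.

#include <vector>
#include <cmath>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One fully connected layer: out[j] = sigmoid(sum_i w[j][i] * in[i] + b[j])
std::vector<double> layer(const std::vector<double>& in,
                          const std::vector<std::vector<double>>& w,
                          const std::vector<double>& b) {
    std::vector<double> out(b.size());
    for (size_t j = 0; j < b.size(); ++j) {
        double sum = b[j];
        for (size_t i = 0; i < in.size(); ++i) sum += w[j][i] * in[i];
        out[j] = sigmoid(sum);
    }
    return out;
}

// Hypothetical features reported by the decoy monitor, e.g. change in decoy
// file size, writes near the boot sector, suspicious byte patterns.  A score
// near 1.0 flags the decoy as infected.
double infectionScore(const std::vector<double>& features,
                      const std::vector<std::vector<double>>& w1,
                      const std::vector<double>& b1,
                      const std::vector<std::vector<double>>& w2,
                      const std::vector<double>& b2) {
    std::vector<double> hidden = layer(features, w1, b1);  // hidden layer
    return layer(hidden, w2, b2)[0];                       // single output unit
}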

Mobile Robot Navigation

C. Lin and C. Lee apply a multivalued Boltzmann machine, which they model using an artificial magnetic field approach. They define attractive and repulsive magnetic fields, corresponding to the goal position and to obstacles, respectively. The weights on the connections in the Boltzmann machine are none other than the magnetic fields.

They divide a two-dimensional traverse map into small grid cells. Given the goal cell and the obstacle cells, the problem is to navigate the mobile robot across this map from an unobstructed cell to the goal quickly, without colliding with any obstacle. An attractive artificial magnetic field is built for the goal location. They also build a repulsive artificial magnetic field around the boundary of each obstacle. Each neuron, a grid cell, points to one of its eight neighbors, showing the direction for the movement of the robot. In other words, the Boltzmann machine is adapted to become a compass for the mobile robot.
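To make the grid idea concrete, here is a minimal sketch of an attractive/repulsive field on grid cells, with each free cell pointing to the best of its eight neighbors. The field shapes and constants are assumptions chosen for illustration; Lin and Lee’s multivalued Boltzmann machine arrives at such directions through its own stochastic update, not by the direct scan shown here.

#include <cmath>
#include <vector>
#include <utility>

struct Cell { int row, col; };

// Attractive potential toward the goal plus a repulsive term near obstacles;
// lower values are better.
double potential(const Cell& c, const Cell& goal,
                 const std::vector<Cell>& obstacles) {
    double p = std::hypot(double(c.row - goal.row), double(c.col - goal.col));
    for (const Cell& o : obstacles) {
        double d = std::hypot(double(c.row - o.row), double(c.col - o.col));
        p += 5.0 / (d + 0.1);   // strong repulsion close to an obstacle
    }
    return p;
}

// Returns the (row, column) step toward the neighbor with the lowest
// potential -- the "compass" direction stored at this grid cell.
std::pair<int, int> bestStep(const Cell& c, const Cell& goal,
                             const std::vector<Cell>& obstacles) {
    std::pair<int, int> best{0, 0};
    double bestP = potential(c, goal, obstacles);
    for (int dr = -1; dr <= 1; ++dr)
        for (int dc = -1; dc <= 1; ++dc) {
            if (dr == 0 && dc == 0) continue;
            Cell n{c.row + dr, c.col + dc};
            double p = potential(n, goal, obstacles);
            if (p < bestP) { bestP = p; best = {dr, dc}; }
        }
    return best;
}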

A Classifier

James Ressler and Marijke Augusteijn study the application of neural networks to the problem of weapon-to-target assignment. The neural network is used as a filter to remove infeasible assignments, where feasibility is determined by the weapon’s ability to hit a given target if fired at a specific instant. The large number of weapons and threats, together with the limited time available, makes it important to reduce the number of assignments that must be considered.

The network’s role here is that of a classifier, as it needs to separate the infeasible assignments from the feasible ones. Learning has to be quick, so Ressler and Augusteijn prefer an architecture called cascade-correlation over backpropagation learning. Their network is dynamic in that the number of hidden-layer neurons is determined during the training phase; it belongs to a class of algorithms that change the architecture of the network during training.
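The filtering role itself is easy to picture. The sketch below scores every weapon/target pair with a trained classifier and keeps only the pairs judged feasible; the feature encoding, the Network stand-in, and its crude hand-written rule are assumptions made so the example is self-contained, and they do not reflect Ressler and Augusteijn’s cascade-correlation network, which would be trained separately and plugged in here.

#include <vector>
#include <utility>

struct Weapon { double range, timeToReady; };
struct Target { double distance, speed, timeToImpact; };

// Stand-in for a trained classifier.  A real filter would load learned
// weights; this placeholder rule just keeps the sketch compilable.
struct Network {
    double score(const std::vector<double>& f) const {
        // "feasible" if the target is within range and the weapon is ready
        // before impact: f = {range, timeToReady, distance, speed, timeToImpact}
        return (f[2] <= f[0] && f[1] < f[4]) ? 1.0 : 0.0;
    }
};

// Keep only the (weapon, target) index pairs the network scores as feasible,
// so a downstream assignment optimizer has far fewer candidates to examine.
std::vector<std::pair<int, int>>
feasiblePairs(const std::vector<Weapon>& weapons,
              const std::vector<Target>& targets,
              const Network& net, double threshold = 0.5) {
    std::vector<std::pair<int, int>> keep;
    for (size_t w = 0; w < weapons.size(); ++w)
        for (size_t t = 0; t < targets.size(); ++t) {
            std::vector<double> f{weapons[w].range, weapons[w].timeToReady,
                                  targets[t].distance, targets[t].speed,
                                  targets[t].timeToImpact};
            if (net.score(f) >= threshold)
                keep.push_back({int(w), int(t)});
        }
    return keep;
}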

A Two-Stage Network for Radar Pattern Classification

Mohammad Ahmadian and Russell Pimmel find it convenient to use a multistage neural network configuration, a two-stage network in particular, for classifying patterns. The patterns they study are geometrical features of simulated radar targets.

Feature extraction is done in the first stage, while classification is done in the second. Moreover, the first stage is made up of several networks, each extracting a different estimable feature. Backpropagation is used for learning in the first stage. They use a single network in the second stage. The effect of noise is also studied.
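The wiring of such a two-stage configuration can be sketched briefly: several first-stage networks each estimate one feature from the raw radar return, and their outputs are concatenated into the input of a single second-stage classifier. The FeedForwardNet type, its placeholder forward pass, and the function names below are assumptions for illustration only, not Ahmadian and Pimmel’s actual configuration.

#include <vector>
#include <cmath>

// Placeholder feedforward net: one linear layer with a sigmoid, so the
// sketch is self-contained.  Real stage-1 and stage-2 nets would be trained
// with backpropagation as described in earlier chapters.
struct FeedForwardNet {
    std::vector<std::vector<double>> weights;  // [output][input]
    std::vector<double> forward(const std::vector<double>& in) const {
        std::vector<double> out;
        for (const auto& row : weights) {
            double sum = 0.0;
            for (size_t i = 0; i < in.size() && i < row.size(); ++i)
                sum += row[i] * in[i];
            out.push_back(1.0 / (1.0 + std::exp(-sum)));
        }
        return out;
    }
};

// Stage 1: each extractor estimates one feature of the simulated radar
// target.  Stage 2: a single classifier maps the collected feature
// estimates to class scores.
std::vector<double> classifyTarget(const std::vector<double>& radarReturn,
                                   const std::vector<FeedForwardNet>& extractors,
                                   const FeedForwardNet& classifier) {
    std::vector<double> features;
    for (const FeedForwardNet& net : extractors) {
        std::vector<double> out = net.forward(radarReturn);
        features.insert(features.end(), out.begin(), out.end());
    }
    return classifier.forward(features);
}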



Copyright © IDG Books Worldwide, Inc.


