
C++ Neural Networks and Fuzzy Logic
by Valluru B. Rao
M&T Books, IDG Books Worldwide, Inc.
ISBN: 1558515526   Pub Date: 06/01/95
  



Crisp and Fuzzy Neural Networks for Handwritten Character Recognition

Paul Gader, Magdi Mohamed, and Jung-Hsien Chiang combine a fuzzy neural network and a crisp neural network for the recognition of handwritten alphabetic characters. They use backpropagation for the crisp neural network and a clustering algorithm called K-nearest neighbor for the fuzzy network. Their use of a fuzzy network in this study is prompted by their belief that if some ambiguity is possible in deciphering a character, that ambiguity should be accurately represented in the output. For example, a handwritten “u” could look like either a “u” or a “v”; the authors feel that this ambiguity should be carried through to the classifier output.
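The idea of graded class memberships can be illustrated with a fuzzy variant of the K-nearest-neighbor rule. The sketch below is a minimal, hypothetical example (not the authors' implementation): each class receives a membership value from an inverse-distance-weighted vote among the k nearest training samples, so an ambiguous character can score high for two classes at once.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// A training sample: feature vector plus a crisp class label.
struct Sample { std::vector<double> features; int label; };

// Fuzzy K-nearest-neighbor sketch: returns one membership value per
// class, normalized to sum to 1, instead of a single crisp decision.
std::vector<double> fuzzyKnn(const std::vector<Sample>& train,
                             const std::vector<double>& x,
                             int k, int numClasses) {
    // Euclidean distance from x to every training sample.
    std::vector<std::pair<double, int>> dist;  // (distance, label)
    for (const Sample& s : train) {
        double d = 0.0;
        for (std::size_t i = 0; i < x.size(); ++i) {
            double diff = s.features[i] - x[i];
            d += diff * diff;
        }
        dist.push_back({std::sqrt(d), s.label});
    }
    std::sort(dist.begin(), dist.end());

    // Inverse-distance-weighted vote among the k nearest neighbors.
    std::vector<double> membership(numClasses, 0.0);
    double total = 0.0;
    for (int i = 0; i < k && i < (int)dist.size(); ++i) {
        double w = 1.0 / (dist[i].first + 1e-6);  // avoid divide-by-zero
        membership[dist[i].second] += w;
        total += w;
    }
    for (double& m : membership) m /= total;   // normalize to sum to 1
    return membership;
}
```

A query point sitting between two clusters of training samples receives comparable membership in both classes, which is exactly the behavior the authors want for ambiguous characters.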

Feature extraction was accomplished as follows: character images of size 24x16 pixels were used. The first stage of processing extracted eight feature images from the input image, two for each direction (north, northeast, northwest, and east). Each feature image holds an integer at each location representing the length of the longest bar that fits at that point in that direction. These are referred to as bar features. Next, overlapping 8x8 zones are laid over the feature images to derive feature vectors. Each vector element is the sum of the values in a zone divided by the maximum possible value for that zone. Each feature image yields 15 zone values, so the eight feature images together produce a 120-element feature vector.

Data was obtained from the U.S. Postal Service, consisting of 250 characters. Results showed 97.5% and 95.6% classification rates on training and test sets, respectively, for the crisp neural network. The fuzzy network achieved 94.7% and 93.8% classification rates, with the desired output for many characters set to ambiguous.

Noise Removal with a Discrete Hopfield Network

Arun Jagota applies what is called an HcN, a special case of a discrete Hopfield network, to the problem of recognizing a degraded printed word. The HcN processes the output of an optical character recognizer, attempting to remove noise. A dictionary of words is stored in the HcN and searched.
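A plain discrete Hopfield network (the general model of which HcN is a special case) can be sketched briefly. This is an illustrative example, not the HcN variant itself: bipolar patterns are stored with Hebbian learning, and a noisy input is driven toward the nearest stored pattern by repeated threshold updates.

```cpp
#include <vector>

// Discrete Hopfield network over bipolar (+1/-1) states.
class Hopfield {
    std::vector<std::vector<double>> w;  // symmetric weights, zero diagonal
    int n;
public:
    explicit Hopfield(int size)
        : w(size, std::vector<double>(size, 0.0)), n(size) {}

    // Hebbian learning: add the outer product of the pattern with itself.
    void store(const std::vector<int>& p) {
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                if (i != j) w[i][j] += p[i] * p[j];
    }

    // Asynchronous updates: each unit takes the sign of its net input.
    std::vector<int> recall(std::vector<int> s, int sweeps = 10) {
        for (int t = 0; t < sweeps; ++t)
            for (int i = 0; i < n; ++i) {
                double net = 0.0;
                for (int j = 0; j < n; ++j) net += w[i][j] * s[j];
                s[i] = (net >= 0.0) ? 1 : -1;
            }
        return s;
    }
};
```

Storing dictionary words as patterns and presenting the recognizer's noisy output as the initial state is the spirit of the noise-removal step described above.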

Object Identification by Shape

C. Ganesh, D. Morse, E. Wetherell, and J. Steele use a neural network approach for an object identification system based on the shape of an object, independent of its size. A two-dimensional grid of ultrasonic data represents the height profile of an object. The data grid is compressed into a smaller set that retains the essential features. Backpropagation is used. Recognition rates of approximately 70% are achieved.

Detecting Skin Cancer

F. Ercal, A. Chawla, W. Stoecker, and R. Moss study a neural network approach to the diagnosis of malignant melanoma. They strive to discriminate between malignant and benign tumor images; as many as three categories of benign tumors must be distinguished from malignant melanoma. Color images of skin tumors are digitized and classified using backpropagation. Two approaches are taken to reduce training time: the first uses fewer hidden layers, and the second randomizes the order of presentation of the training set.

EEG Diagnosis

Fred Wu, Jeremy Slater, R. Eugene Ramsay, and Lawrence Honig use a feedforward backpropagation neural network as a classifier in EEG diagnosis. They compare its performance to that of a nearest-neighbor classifier. The neural network shows a classification accuracy of 75% for multiple sclerosis patients, versus 65% for the nearest-neighbor algorithm.

Time Series Prediction with Recurrent and Nonrecurrent Networks

Sathyanarayan Rao and Sriram Sethuraman take a recurrent neural network and a feedforward network and train them in parallel. A recurrent neural network has feedback connections from the output neurons back to the input neurons, modeling the storage of temporal information. A modified backpropagation algorithm, called the real-time recurrent learning algorithm, is used to train the recurrent network. The recurrent network stores past information, while the feedforward network learns the nonlinear dependencies on the current samples. This scheme is used because the recurrent network takes more than one time period to evaluate its output, whereas the feedforward network does not. The hybrid thus overcomes the recurrent network's latency problem, providing immediate nonlinear evaluation from input to output.
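The feedback idea can be shown in a minimal sketch. This is a hypothetical single recurrent layer, not the authors' architecture: at each time step the previous output vector is appended to the current input before the weighted sum, so the unit activations carry temporal state forward. The uniform 0.1 weights are placeholders.

```cpp
#include <cmath>
#include <vector>

// One recurrent layer: previous outputs are fed back as extra inputs.
struct RecurrentLayer {
    std::vector<std::vector<double>> w;   // [out][in + out] weights
    std::vector<double> prevOut;          // state from the last time step

    RecurrentLayer(int in, int out)
        : w(out, std::vector<double>(in + out, 0.1)), prevOut(out, 0.0) {}

    std::vector<double> step(const std::vector<double>& x) {
        // Augment the input with the previous output (the feedback path).
        std::vector<double> z = x;
        z.insert(z.end(), prevOut.begin(), prevOut.end());

        std::vector<double> y(w.size());
        for (std::size_t i = 0; i < w.size(); ++i) {
            double net = 0.0;
            for (std::size_t j = 0; j < z.size(); ++j) net += w[i][j] * z[j];
            y[i] = std::tanh(net);        // squashing activation
        }
        prevOut = y;                      // store state for the next step
        return y;
    }
};
```

Presenting the same input twice produces different outputs, because the second step also sees the state left behind by the first; that stored state is exactly what the feedforward half of the hybrid cannot provide on its own.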

Security Alarms

Deborah Frank and J. Bryan Pletta study the application of neural networks to alarm classification, focusing on operation under varying weather conditions. Performance degradation of a security system when the environment changes undermines confidence in the system itself; this problem is more acute with portable security systems.

They investigated the problem using several networks, ranging from backpropagation to learning vector quantization. Data was collected under many scenarios, with and without an intruder present, where the intruder could be a vehicle or a human.

They found a 98% probability of detection and a 9% nuisance alarm rate across all weather conditions.



Copyright © IDG Books Worldwide, Inc.
