
C++ Neural Networks and Fuzzy Logic
by Valluru B. Rao
M&T Books, IDG Books Worldwide, Inc.
ISBN: 1558515526   Pub Date: 06/01/95
  



Algorithm for ART1 Calculations

The ART1 equations are not easy to follow. We follow the description of the algorithm given by James A. Freeman and David M. Skapura. The following equations, taken in the order given, describe the steps of the algorithm. Note that binary input patterns are used in ART1.

Initialization of Parameters


wij should be positive and less than L / ( m - 1 + L)
vji should be greater than ( B - 1 ) / D
ai = -B / ( 1 + C )
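As a rough sketch of these constraints in C++ (the helper names and the safety margins are our own illustrative choices, not from the book):

```cpp
#include <cassert>

// Hypothetical helpers (names and margins are ours): pick initial
// values that satisfy the ART1 initialization constraints above.
double initBottomUp(double L, int m) {
    // wij must be positive and less than L / (m - 1 + L); take half of the bound
    return 0.5 * L / (m - 1 + L);
}

double initTopDown(double B, double D) {
    // vji must be greater than (B - 1) / D; add a small margin
    return (B - 1.0) / D + 0.1;
}

double initActivation(double B, double C) {
    // ai = -B / (1 + C)
    return -B / (1.0 + C);
}
```

The parameter values used to exercise these helpers (L = 2, m = 5, B = 1.5, D = 0.9, C = 0.25) are illustrative only.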

Equations for ART1 Computations

As you read the equations for the ART1 computations below, keep the following considerations in mind. If a subscript i appears on the left-hand side of an equation, there are m such equations, as the subscript i varies from 1 to m. Similarly, if instead a subscript j occurs, there are n such equations as j ranges from 1 to n. The equations, used in the order they are given, provide a step-by-step description of the algorithm. All the variables, you recall, are defined in the earlier section on notation. For example, I is the input vector.

F1 layer calculations:

        ai = Ii / ( 1 + A ( Ii + B ) + C )
        xi = 1 if ai > 0
           = 0 if ai ≤ 0
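This first F1 pass can be sketched as follows; the function names and the parameter values in the comments are our illustrative choices, not the book's:

```cpp
#include <cassert>

// F1 activation before any top-down input:
//     ai = Ii / (1 + A*(Ii + B) + C)
double f1InitialActivation(double Ii, double A, double B, double C) {
    return Ii / (1.0 + A * (Ii + B) + C);
}

// Threshold function: xi = 1 if ai > 0, else 0
int f1Output(double ai) {
    return ai > 0.0 ? 1 : 0;
}
```

With illustrative parameters A = 1, B = 1.5, C = 0.25, an input bit Ii = 1 gives a positive activation (so xi = 1), and Ii = 0 gives activation 0 (so xi = 0).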

F2 layer calculations:

        bj = Σ wij xi , the summation being on i from 1 to m
        yj = 1 if the jth neuron has the largest activation value in the F2 layer
           = 0 if the jth neuron is not the winner in the F2 layer
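A minimal sketch of this winner-take-all step, assuming a bottom-up weight matrix w[i][j] (from F1 neuron i to F2 neuron j) and the binary F1 outputs x; the names are ours, not the book's:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Compute bj = sum_i wij * xi for each F2 neuron j and return the
// index of the winner (the j with the largest bj). Conceptually,
// y[winner] = 1 and all other yj = 0.
int f2Winner(const std::vector<std::vector<double>>& w,
             const std::vector<int>& x) {
    std::size_t n = w[0].size();          // number of F2 neurons
    int winner = 0;
    double best = -1e30;
    for (std::size_t j = 0; j < n; ++j) {
        double bj = 0.0;
        for (std::size_t i = 0; i < x.size(); ++i)
            bj += w[i][j] * x[i];
        if (bj > best) { best = bj; winner = static_cast<int>(j); }
    }
    return winner;
}
```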

Top-down inputs:

        zi = Σ vji yj , the summation being on j from 1 to n
        (You will notice that exactly one term is nonzero.)

F1 layer calculations:

        ai = ( Ii + D zi - B ) / ( 1 + A ( Ii + D zi ) + C )
        xi = 1 if ai > 0
           = 0 if ai ≤ 0
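Since only the winner r is active in the F2 layer, zi reduces to vri, and this second F1 pass can be sketched as below. The function name is ours, and the parameter values in the tests (A = 1, B = 1.5, C = 0.25, D = 0.9) are illustrative:

```cpp
#include <cassert>

// F1 activation with top-down feedback:
//     ai = (Ii + D*zi - B) / (1 + A*(Ii + D*zi) + C)
// With only the winner r active in F2, zi is simply vri.
double f1FeedbackActivation(double Ii, double zi,
                            double A, double B, double C, double D) {
    return (Ii + D * zi - B) / (1.0 + A * (Ii + D * zi) + C);
}
```

Note how the sign of the numerator realizes the 2/3 rule: the activation is positive only when both the input bit and the top-down expectation are on.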

Checking with vigilance parameter:

If ( Sx / SI ) < ρ, set yj = 0 for all j, including the winner r, in the F2 layer, and consider the rth neuron inactive (this step is a reset; skip the remaining steps and return to the F2 layer calculations to find a new winner).

If ( Sx / SI ) ≥ ρ, then continue.
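For binary patterns, Sx and SI are simply the counts of 1s in x and in the input I, so the vigilance check can be sketched as (our helper name, not the book's):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sx = number of 1s in x, SI = number of 1s in I (binary patterns).
// Returns true if the match passes the vigilance test Sx/SI >= rho;
// false means reset (deactivate the current winner and search again).
bool vigilancePassed(const std::vector<int>& x,
                     const std::vector<int>& I, double rho) {
    int Sx = 0, SI = 0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        Sx += x[i];
        SI += I[i];
    }
    return SI > 0 && static_cast<double>(Sx) / SI >= rho;
}
```

A higher vigilance ρ demands a closer match between the stored template and the input before the winning category is accepted.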

Modifying the top-down and bottom-up connection weights for winner r (note that, consistent with the initialization constraints above, it is the bottom-up weights that take the value L / ( Sx + L - 1 ) and the top-down weights that become binary):

        wir  = L / ( Sx + L - 1 ) if xi = 1
             = 0 if xi = 0
        vri  = 1 if xi = 1
             = 0 if xi = 0
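The update for the winner can be sketched as follows. The helper name is ours; consistency with the initialization constraints suggests that the bottom-up weights wir receive the value L / ( Sx + L - 1 ) and the top-down weights vri the binary value xi, and the sketch follows that assignment:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Weight update for winner r. w[i][r] is the bottom-up weight from F1
// neuron i to F2 neuron r; v[r][i] is the top-down weight from r to i.
void updateWinnerWeights(std::vector<std::vector<double>>& w,
                         std::vector<std::vector<double>>& v,
                         const std::vector<int>& x, int r, double L) {
    int Sx = 0;                                   // count of 1s in x
    for (std::size_t i = 0; i < x.size(); ++i) Sx += x[i];
    for (std::size_t i = 0; i < x.size(); ++i) {
        w[i][r] = x[i] ? L / (Sx + L - 1.0) : 0.0; // bottom-up
        v[r][i] = x[i] ? 1.0 : 0.0;                // top-down template
    }
}
```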

Having finished with the current input pattern, we repeat these steps with a new input pattern. We drop the index r assigned to the winning neuron and treat all neurons in the F2 layer with their original indices (subscripts).

We hope the above presentation makes the steps of the algorithm as clear as possible; the process is rather involved. To recapitulate: first an input vector is presented to the F1 layer neurons, their activations are determined, and the threshold function is applied. The outputs of the F1 layer neurons constitute the inputs to the F2 layer neurons, from which a winner is designated on the basis of the largest activation. Only the winner is allowed to be active, meaning that the output is 1 for the winner and 0 for all the rest. The equations implicitly incorporate the 2/3 rule mentioned earlier, and they also incorporate the way the gain control is used. The gain control is designed to have the value 1 during the phase of determining the activations of the neurons in the F2 layer, and 0 if either there is no input vector or the output from the F2 layer is being propagated back to the F1 layer.

Other Models

Extensions of the ART1 model, which is for binary patterns, are ART2 and ART3. Of these, the ART2 model categorizes and stores analog-valued patterns as well as binary patterns, while ART3 addresses computational problems of hierarchies.

C++ Implementation

Again, the algorithm for ART1 processing as given in Freeman and Skapura is followed for our C++ implementation. Our objective in programming ART1 is to provide a feel for the workings of this paradigm with a very simple program implementation. For more details on the inner workings of ART1, you are encouraged to consult Freeman and Skapura, or other references listed at the back of the book.

A Header File for the C++ Program for the ART1 Model Network

The header file for the C++ program for the ART1 model network is art1net.hpp. It contains the declarations for two classes: an artneuron class for neurons in the ART1 model, and a network class, which is declared as a friend class in the artneuron class. Functions declared in the network class include one to run the iterations of the network operation, one to find the winner in a given iteration, and one to inquire whether a reset is needed.

//art1net.h   V. Rao,  H. Rao
//Header file for ART1 model network program

#include <iostream.h>

#define MXSIZ 10

class artneuron
{
protected:
       int nnbr;
       int inn,outn;
       int output;
       double activation;
       double outwt[MXSIZ];
       char *name;
       friend class network;

public:
       artneuron() { };
       void getnrn(int,int,int,char *);
};

class network
{
public:
       int  anmbr,bnmbr,flag,ninpt,sj,so,winr;
       float ai,be,ci,di,el,rho;
       artneuron (anrn)[MXSIZ],(bnrn)[MXSIZ];
       int outs1[MXSIZ],outs2[MXSIZ];
       int lrndptrn[MXSIZ][MXSIZ];
       double acts1[MXSIZ],acts2[MXSIZ];
       double mtrx1[MXSIZ][MXSIZ],mtrx2[MXSIZ][MXSIZ];

       network() { };
       void getnwk(int,int,float,float,float,float,float);
       void prwts1();
       void prwts2();
       int winner(int k,double *v,int);
       void practs1();
       void practs2();
       void prouts1();
       void prouts2();
       void iterate(int *,float,int);
       void asgninpt(int *);
       void comput1(int);
       void comput2(int *);
       void prlrndp();
       void inqreset(int);
       void adjwts1();
       void adjwts2();
};



Copyright © IDG Books Worldwide, Inc.


