C++ Neural Networks and Fuzzy Logic by Valluru B. Rao M&T Books, IDG Books Worldwide, Inc. ISBN: 1558515526 Pub Date: 06/01/95


Note the use of the output stream operator **cout<<** to output text strings or numerical values. C++ provides the **istream** and **ostream** classes, from which the **iostream** class is derived. The standard input and output streams are **cin** and **cout**, used with the operators >> and <<, respectively. Using **cout** for output is much simpler than using the C function **printf**: no format specifiers are needed for ordinary output. C++ does, however, provide a way to format the output while using **cout**.

Also note the way comments are introduced in the program. A comment starting with a double slash // runs to the end of the line; unlike a C comment, it needs no closing delimiter. If a comment extends over several lines, each such line must begin with its own double slash. You can still use the C-style pair, /* at the beginning and */ at the end, as you do in C; for comments spanning many lines, the C facility is handier.

The neurons in the network are members of the network class and are identified by the abbreviation **nrn**. The two patterns, 1010 and 0101, are presented to the network one at a time in the program.

The output from this program is as follows and is self-explanatory. When you run this program, you’re likely to see a lot of output whiz by, so in order to leisurely look at the output, use redirection. Type **Hop > filename**, and your output will be stored in a file, which you can edit with any text editor or list by using the **type filename | more** command.
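The redirection described above can be sketched as follows; **Hop** is the executable name used in the text, and hop.out is a placeholder filename chosen here (a stand-in command is used so the sketch runs without the compiled program):

```shell
# As in the text (DOS-style):  Hop > hop.out   then   type hop.out | more
# POSIX stand-in for the same idea:
printf 'output value is 1\n' > hop.out   # stand-in for: Hop > hop.out
cat hop.out                              # or page through it with: more hop.out
```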

THIS PROGRAM IS FOR A HOPFIELD NETWORK WITH A SINGLE LAYER OF 4 FULLY INTERCONNECTED NEURONS. THE NETWORK SHOULD RECALL THE PATTERNS 1010 AND 0101 CORRECTLY.

nrn[0].weightv[0] is 0
nrn[0].weightv[1] is -3
nrn[0].weightv[2] is 3
nrn[0].weightv[3] is -3
activation is 3
output value is 1
nrn[1].weightv[0] is -3
nrn[1].weightv[1] is 0
nrn[1].weightv[2] is -3
nrn[1].weightv[3] is 3
activation is -6
output value is 0
nrn[2].weightv[0] is 3
nrn[2].weightv[1] is -3
nrn[2].weightv[2] is 0
nrn[2].weightv[3] is -3
activation is 3
output value is 1
nrn[3].weightv[0] is -3
nrn[3].weightv[1] is 3
nrn[3].weightv[2] is -3
nrn[3].weightv[3] is 0
activation is -6
output value is 0
pattern= 1 output = 1 component matches
pattern= 0 output = 0 component matches
pattern= 1 output = 1 component matches
pattern= 0 output = 0 component matches

nrn[0].weightv[0] is 0
nrn[0].weightv[1] is -3
nrn[0].weightv[2] is 3
nrn[0].weightv[3] is -3
activation is -6
output value is 0
nrn[1].weightv[0] is -3
nrn[1].weightv[1] is 0
nrn[1].weightv[2] is -3
nrn[1].weightv[3] is 3
activation is 3
output value is 1
nrn[2].weightv[0] is 3
nrn[2].weightv[1] is -3
nrn[2].weightv[2] is 0
nrn[2].weightv[3] is -3
activation is -6
output value is 0
nrn[3].weightv[0] is -3
nrn[3].weightv[1] is 3
nrn[3].weightv[2] is -3
nrn[3].weightv[3] is 0
activation is 3
output value is 1
pattern= 0 output = 0 component matches
pattern= 1 output = 1 component matches
pattern= 0 output = 0 component matches
pattern= 1 output = 1 component matches

Let us recall our previous discussion of this example in Chapter 1. What does the network give as output if we present a pattern different from both A and B? If C = (0, 1, 0, 0) is the input pattern, the activations (dot products) would be –3, 0, –3, 3, making the outputs (next state) of the neurons 0, 1, 0, 1, so that B would be recalled. This is quite interesting: if we intended to input B but made a slight error and presented C instead, the network would still recall B. You can verify this by changing the input pattern in the program to 0, 1, 0, 0 and compiling again; the B pattern is recalled.

Another point about the example in Chapter 1 is that the weight matrix W is not the only weight matrix that enables the network to recall the patterns A and B correctly. If we replace the 3 and –3 entries in the matrix with 2 and –2, respectively, the resulting matrix gives the same performance. One way to check this is to change wt1, wt2, wt3, wt4 in the program accordingly, and compile and run it again. Both weight matrices work because they are closely related: one is a scalar (constant) multiple of the other. Multiplying each element of the original matrix by the scalar 2/3 yields exactly the matrix with 2 and –2 in place of 3 and –3.


Copyright © IDG Books Worldwide, Inc.

Pages: 139

Authors: Valluru B. Rao, Hayagriva Rao