C++ Neural Networks and Fuzzy Logic by Valluru B. Rao, M&T Books, IDG Books Worldwide, Inc. ISBN: 1558515526, Pub Date: 06/01/95


We have four neurons in the only layer in this network. We need to compute the activation of each neuron as the weighted sum of its inputs. The activation at the first node is the dot product of the input vector and the first column of the weight matrix (0 -3 3 -3). We get the activations at the other nodes similarly. The output of a neuron is then calculated by evaluating the threshold function at the activation of the neuron. So if we present the input vector **A**, the dot product at the first node works out to 3 and f(3) = 1. Similarly, the dot products at the second, third, and fourth nodes are –6, 3, and –6, respectively, so the corresponding outputs are 0, 1, and 0. This means that the output of the network is the vector (1, 0, 1, 0), the same as the input pattern. The network has recalled the pattern as presented, or we can say that pattern **A** is stable, since the output equals the input. When **B** is presented, the dot product obtained at the first node is –6 and the output is 0. The outputs for the rest of the nodes, taken together with the output of the first node, give (0, 1, 0, 1), which means that the network has stable recall for **B** also.

NOTE: In Chapter 4, a method of determining the weight matrix for the Hopfield network given a set of input vectors is presented.

So far we have presented easy cases to the network—vectors that the Hopfield network was specifically designed (through the choice of the weight matrix) to recall. What will the network give as output if we present a pattern different from both **A** and **B**? Let **C** = (0, 1, 0, 0) be presented to the network. The activations would be –3, 0, –3, 3, making the outputs 0, 1, 0, 1, which means that **B** achieves stable recall. This is quite interesting. Suppose we did intend to input **B** but made a slight error and presented **C** instead. The network did what we wanted and recalled **B**. But why not **A**? To answer this, let us ask: is **C** closer to **A** or to **B**? How do we compare? We use the distance formula for two four-dimensional points. If (a, b, c, d) and (e, f, g, h) are two four-dimensional points, the distance between them is:

√[(a – e)² + (b – f)² + (c – g)² + (d – h)²]

The distance between **A** and **C** is √3, whereas the distance between **B** and **C** is just 1. Since **B** is closer in this sense, **B** was recalled rather than **A**. You may verify that if we do the same exercise with **D** = (0, 0, 1, 0), the network recalls **A**, which is closer than **B** to **D**.

When we talk about the closeness of one bit pattern to another, the *Euclidean distance* need not be used. Instead, the *Hamming distance* can be used, which is much easier to determine: it is simply the number of bit positions in which the two patterns differ. Since the patterns are bit strings, the Hamming distance is more appropriate than the Euclidean distance.

NOTE: The weight matrix W we gave in this example is not the only weight matrix that would enable the network to recall the patterns A and B correctly. You can see that if we replace each 3 and –3 in the matrix by, say, 2 and –2, respectively, the resulting matrix would also facilitate the same performance from the network. For more details, consult Chapter 4.

The Hopfield network is a *recurrent* network. This means that outputs from the network are fed back as inputs. This is not apparent from Figure 1.3, but is clearly seen from Figure 1.4.

**Figure 1.4** Feedback in the Hopfield network.

The Hopfield network always stabilizes to a fixed point. There is one important detail of the Hopfield network's operation required to achieve this stability. In the examples thus far, we have had no trouble getting a stable output from the network, so we have not yet presented this detail: the need to update the network *asynchronously*. This means that changes do not occur simultaneously to all outputs that are fed back as inputs, but rather to one vector component at a time. The true operation of the Hopfield network follows the procedure below for input vector **Invec** and output vector **Outvec**:

**1.** Apply an input, **Invec**, to the network, and initialize **Outvec** = **Invec**
**2.** Start with i = 1
**3.** Calculate Value_i = DotProduct(**Invec**, Column i of Weight matrix)
**4.** Calculate **Outvec**_i = **f**(Value_i), where **f** is the threshold function discussed previously
**5.** Update the input to the network with component **Outvec**_i
**6.** Increment i, and repeat steps 3, 4, 5, and 6 until **Invec** = **Outvec** (note that when i reaches its maximum value, it is next reset to 1 for the cycle to continue)

Now let’s see how to apply this procedure. Building on the last example, we now input **E** = (1, 0, 0, 1), which is at an equal distance from **A** and **B**. Without applying the asynchronous procedure above, but instead using the shortcut procedure we’ve been using so far, you would get an output **F** = (0, 1, 1, 0). This vector **F**, as subsequent input, would result in **E** as the output. This is incorrect, since the network oscillates between two states: we have updated the entire input vector synchronously.

Now let’s apply asynchronous update. For input **E** = (1, 0, 0, 1), we arrive at the results detailed for each update step in Table 1.1.

Step | i | Invec | Column i of weight matrix | Value | Outvec | Notes |
---|---|---|---|---|---|---|
0 | – | 1001 | – | – | 1001 | initialization: set Outvec = Invec = input pattern |
1 | 1 | 1001 | 0 -3 3 -3 | -3 | 0001 | component 1 of Outvec changed to 0 |
2 | 2 | 0001 | -3 0 -3 3 | 3 | 0101 | component 2 of Outvec changed to 1 |
3 | 3 | 0101 | 3 -3 0 -3 | -6 | 0101 | component 3 of Outvec stays as 0 |
4 | 4 | 0101 | -3 3 -3 0 | 3 | 0101 | component 4 of Outvec stays as 1 |
5 | 1 | 0101 | 0 -3 3 -3 | -6 | 0101 | component 1 stable as 0 |
6 | 2 | 0101 | -3 0 -3 3 | 3 | 0101 | component 2 stable as 1 |
7 | 3 | 0101 | 3 -3 0 -3 | -6 | 0101 | component 3 stable as 0 |
8 | 4 | 0101 | -3 3 -3 0 | 3 | 0101 | component 4 stable as 1; stable recalled pattern = 0101 |


Copyright © IDG Books Worldwide, Inc.


Pages: 139


Authors: Valluru B. Rao, Hayagriva Rao
