C++ Neural Networks and Fuzzy Logic by Valluru B. Rao M&T Books, IDG Books Worldwide, Inc. ISBN: 1558515526 Pub Date: 06/01/95


If you desire to use vectors with real number components, instead of binary numbers, you can do so. Your model is then called a *Continuous Bidirectional Associative Memory*. A matrix **W** and its transpose are used for the connection weights. However, the matrix **W** is not formulated as we described so far; it is chosen arbitrarily and kept constant. The **thresholding** function is also chosen as a continuous function, unlike the one used before. The changes in the activations of neurons during training follow extensions of the Cohen-Grossberg paradigm.

Michael A. Cohen and Stephen Grossberg developed their model for autoassociation, with a symmetric matrix of weights. Stability is ensured by the Cohen-Grossberg theorem; there is no learning.

If, however, you desire the network to learn new pattern pairs, by modifying the weights so as to find association between new pairs, you are designing what is called an *Adaptive Bidirectional Associative Memory* (*ABAM*).

The law that governs the weight changes determines the type of ABAM you have, namely, the *Competitive* ABAM, the *Differential Hebbian* ABAM, or the *Differential Competitive* ABAM. Unlike the basic ABAM model, which is of the additive type, these other models use products of the outputs from the two layers, or the derivatives of the threshold functions.

Here we present a brief description of a model, which is a variation of the BAM. It is called *UBBAM* (*Unipolar Binary Bidirectional Associative Memory*).

T. C. B. Yu and R. J. Mears describe a design of unipolar binary bidirectional associative memory, and its implementation with a Smart Advanced Spatial Light Modulator (SASLM). The SASLM device is a ferroelectric liquid crystal spatial light modulator driven by a silicon CMOS backplane. We use their paper to present some features of a unipolar binary bidirectional associative memory, and ignore its hardware implementation.

Recall the procedure by which you determine the weight matrix **W** for a BAM network, as described in the previous pages. As a first step, you convert each vector in the exemplar pairs into its bipolar version. If **X** and **Y** are an exemplar pair (in bipolar versions), you take the product **X^{T} Y** and add it to similar products from other exemplar pairs, to get the weight matrix **W**.

Suppose you choose two pairs of vectors as possible exemplars. Let them be,

X_{1}= (1, 0, 0, 1), Y_{1}= (0, 1, 1) and X_{2}= (0, 1, 1, 0), Y_{2}= (1, 0, 1)

These you change into bipolar components and get, respectively, (1, –1, –1, 1), (–1, 1, 1), (–1, 1, 1, –1), and (1, –1, 1). The calculation of *W* for the BAM was done as follows.

         1                 -1
    W = -1 [-1 1 1]   +     1 [1 -1 1]
        -1                  1
         1                 -1

        -1  1  1     -1  1 -1     -2  2  0
      =  1 -1 -1  +   1 -1  1  =   2 -2  0
         1 -1 -1      1 -1  1      2 -2  0
        -1  1  1     -1  1 -1     -2  2  0

and

          -2  2  2 -2
    W^{T} =  2 -2 -2  2
           0  0  0  0

You see some negative numbers in the matrix **W**. If you add the constant m = 2 to every element of **W**, you get the following matrix **W^{~}**, which has no negative elements:

            0  4  2
    W^{~} = 4  0  2
            4  0  2
            0  4  2

You need a modification to the thresholding function as well. Earlier you had the following function.

                 1          if y_{j} > 0                       1          if x_{i} > 0
    b_{j}|_{t+1} = b_{j}|_{t}  if y_{j} = 0    and    a_{i}|_{t+1} = a_{i}|_{t}  if x_{i} = 0
                 0          if y_{j} < 0                       0          if x_{i} < 0

Now you need to change the right-hand sides of the conditions from 0 to the product of **m** and the sum of the inputs of the neurons, that is, to **m** times the sum of the inputs. For brevity, let us use **S_{i}** for the sum of the inputs. This is not a constant; its value depends on what values you get for the neuron inputs in any one direction. Then the thresholding function becomes:

                 1          if y_{j} > m S_{i}                       1          if x_{i} > m S_{i}
    b_{j}|_{t+1} = b_{j}|_{t}  if y_{j} = m S_{i}    and    a_{i}|_{t+1} = a_{i}|_{t}  if x_{i} = m S_{i}
                 0          if y_{j} < m S_{i}                       0          if x_{i} < m S_{i}

For example, the vector **X_{1}** = (1, 0, 0, 1) has the activity vector (0, 8, 4), and the corresponding output vector is (0, 1, 1), which is **Y_{1}**.


Copyright © IDG Books Worldwide, Inc.


Authors: Valluru B. Rao, Hayagriva Rao
