


using Equation (3.16). If it does not match, the corresponding weights are moved away from the input feature using Equation (3.17):

w(t+1) = w(t) + δn[x − w(t)]  (3.16)

w(t+1) = w(t) − δn[x − w(t)]  (3.17)

where w(t) is the weight linking to the winning neurone at iteration t, x is the input feature value, and δn is a gain term, which may decrease with time. The value of δn must be chosen in the range (0, 1) to guarantee that a convergent state is reached.

This weight modification mechanism can be illustrated with a simple example. Suppose that the weight linking to the winning neurone a has the value 30, and the input feature value is 25. According to the steps introduced above, if the winning neurone a matches the desired output for the current input, the weight should be adjusted to bring it closer to the input value. Let the gain term δn be 0.01. From Equation (3.16), the new weight is 30 + 0.01(25 − 30) = 29.95, which is closer to the input value 25; this increases the probability that neurone a again becomes the winning neurone the next time the same input value of 25 is presented. If neurone a is not the desired output then, from Equation (3.17), the new weight becomes 30 − 0.01(25 − 30) = 30.05, which is further from the input value 25, and the probability that the undesired neurone a wins again for the same input value of 25 is reduced.
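The worked example above can be sketched as a short Python function (a minimal illustration under the assumption of scalar weights; the function name is ours, not from the original):

```python
def update_weight(w, x, delta_n, match):
    """Adjust the weight w linking to the winning neurone for input x.

    If the winner matches the desired output, the weight is attracted
    toward the input (Equation 3.16); otherwise it is repelled
    (Equation 3.17). delta_n is the gain term, chosen in (0, 1).
    """
    if match:
        return w + delta_n * (x - w)   # Eq. (3.16): move toward the input
    return w - delta_n * (x - w)       # Eq. (3.17): move away from the input

# Numbers from the text: weight 30, input 25, gain term 0.01.
matched = update_weight(30.0, 25.0, 0.01, True)     # closer to 25 (29.95)
mismatched = update_weight(30.0, 25.0, 0.01, False) # further from 25 (30.05)
```

Repeated matched updates drive the weight toward the input value, which is why the same neurone becomes increasingly likely to win for that input.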

As pointed out above, the purpose of supervised training of the SOM is to label the clusters formed at the unsupervised training stage. Such a procedure should not be confused with the unsupervised classification methods described in Chapter 2. Although traditional unsupervised classification methods also embody a two-stage process comprising clustering and labelling, the labelling stage contributes nothing to the location of the clusters in feature space. It is an identification phase. However, the supervised training algorithm for the SOM updates the cluster location as defined by Equations (3.16) and (3.17).

3.2.2 Examples of self-organisation

Self-organisation is an interesting characteristic of a SOM network. It means that the network can detect relationships between the input patterns, and then arrange and express those relationships through the network weights. The process of self-organisation consists of two steps: detecting relationships (ordering) and spanning. During the early stages of training, the SOM concentrates on finding the ordering of the input patterns. Once the inter-pattern relationships are found, the SOM spans its output neurones so that each neurone responds to approximately the same number of input patterns.
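This two-step behaviour can be demonstrated with a minimal one-dimensional SOM on scalar inputs. The network size, gain schedule, and neighbourhood schedule below are illustrative assumptions, not values from the original:

```python
import random

random.seed(42)

n_neurones = 8
# Random initial weights: no ordering, no relationship to the input distribution.
weights = [random.random() for _ in range(n_neurones)]

n_steps = 5000
for t in range(n_steps):
    x = random.random()  # input pattern drawn uniformly from [0, 1]
    # Winner-take-all: the neurone whose weight is closest to the input.
    win = min(range(n_neurones), key=lambda i: abs(x - weights[i]))
    gain = 0.1 * (1.0 - t / n_steps)                  # gain term decreasing with time
    radius = max(1, round(3 * (1.0 - t / n_steps)))   # shrinking neighbourhood
    # Update the winner and its topological neighbours toward the input.
    for i in range(max(0, win - radius), min(n_neurones, win + radius + 1)):
        weights[i] += gain * (x - weights[i])

# After training the weights typically become ordered along the output axis
# and span the input range, so each neurone wins a comparable share of inputs.
counts = [0] * n_neurones
for _ in range(1000):
    x = random.random()
    counts[min(range(n_neurones), key=lambda i: abs(x - weights[i]))] += 1
print([round(w, 2) for w in weights], counts)
```

The shrinking neighbourhood is what produces the ordering phase: early wide updates drag whole stretches of the weight vector into rough alignment with the input distribution, and the later narrow updates spread the neurones to share the inputs roughly equally.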



Classification Methods for Remotely Sensed Data, Second Edition
ISBN: 1420090720
Year: 2001
Pages: 354