
a_j = 1 / (1 + exp(-net_j / T))    (3.2)

T is a parameter called temperature (normally T = 1; a more extensive discussion of the effect of T is given in Section 6.5) that makes the sigmoidal curve more abrupt (T < 1) or more gradual (T > 1). The corresponding pattern is illustrated in Figure 3.1a. Once the activity of every neurone within a layer has been computed, the same process is carried out on the next layer. The input layer is a special case, because its neurones take the values provided by the training sample: the activity of input-layer neurone j is set directly to the jth component of the input pattern vector.
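The temperature-controlled activation can be sketched in a few lines of NumPy (the function name and vectorised form are illustrative assumptions, not the book's own code):

```python
import numpy as np

def sigmoid(net, T=1.0):
    # Sigmoid activation with temperature T (Equation 3.2):
    # T < 1 sharpens the transition, T > 1 makes it more gradual.
    return 1.0 / (1.0 + np.exp(-net / T))
```

For the same net input, a small T pushes the output towards 0 or 1, while a large T keeps it nearer 0.5, matching the curves in Figure 3.1a.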

Sometimes an extra neurone, called the bias, is added to the neural network. It is connected to every layer except the input layer. The bias unit has a constant activity of 1, but influences each neurone j through a different weight value (e.g. θj in Figure 3.2). If the bias unit is introduced, Equation (3.1) is modified as follows:

net_j = Σ_i w_{ji} a_i + θ_j    (3.3)

where θj is the bias associated with neurone j. The bias term θ shifts the sigmoid mapping function to the left or to the right, as shown in Figure 3.1b, depending on whether θ is negative or positive. Introducing a bias unit is generally held to improve the convergence properties of a multilayer perceptron neural network. An example of forward propagation is shown graphically in Figure 3.2.
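One forward-propagation step, combining the biased net input with the sigmoid activation, might look like the following NumPy sketch (the weight-matrix layout and function name are assumptions made here for illustration):

```python
import numpy as np

def forward_layer(a_prev, W, theta, T=1.0):
    # One forward-propagation step through a layer.
    # W[j, i] is the weight from neurone i of the previous layer to
    # neurone j; theta[j] is the bias weight for neurone j (the bias
    # unit itself has constant activity 1).
    net = W @ a_prev + theta                 # net input with bias
    return 1.0 / (1.0 + np.exp(-net / T))    # sigmoid activity
```

Applying this function layer by layer, starting from the input pattern vector, reproduces the forward pass illustrated in Figure 3.2.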

In backward propagation, the inter-neurone weights are modified starting from the output layer and working back until the input layer is reached; in terms of the layout shown in Figure 2.10, updating proceeds leftwards from the rightmost layer. The aim of weight updating is to reduce the identification error of the network. Generally, the least mean square error criterion is applied, defined as:

E(w) = (1/2) Σ_k Σ_j (a_{j,k} - o_{j,k})²    (3.4)

where w is the set of weights in the network, a_{j,k} is the activity of the jth neurone in the output layer obtained from the kth training sample, and o_{j,k} is the target output at neurone j in the output layer for the kth training sample. A number
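The least mean square error criterion can be computed directly; a minimal sketch, assuming outputs and targets are stacked as (sample, neurone) arrays (an arrangement chosen here, not specified by the book):

```python
import numpy as np

def lms_error(a_out, o_target):
    # Least mean square error criterion:
    # a_out[k, j] is the network output at output-layer neurone j for
    # training sample k; o_target[k, j] is the corresponding target.
    # Sums the squared differences over all samples and output neurones.
    return 0.5 * np.sum((a_out - o_target) ** 2)
```

Backward propagation adjusts the weights so as to drive this quantity towards zero over the training set.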

Classification Methods for Remotely Sensed Data, Second Edition
ISBN: 1420090720
Year: 2001
Pages: 354