
7.3.2 Bayesian multisource classification mechanism

The model described in this section is based on Lee et al. (1987). In the general multisource case, a set of observations from n (n > 1) different sources is used. Let xi, i ∈ [1, n], denote the measurement of a specific pixel from source i. The aim is to derive the probability of the pixel belonging to each of c information classes ωj, j ∈ [1, c]. It is possible to obtain some information about the prior probability of class ωj, denoted by P(ωj), i.e. the probability that an observation will be a member of class ωj. A useful model for the prior probability is context, as described in Chapter 6. The approach that uses contextual assumptions to perform multisource classification is considered later.

According to Bayesian probability theory, the law of conditional probabilities states that:

\[
P(\omega_j \mid x_1, x_2, \ldots, x_n) = \frac{P(x_1, x_2, \ldots, x_n \mid \omega_j)\,P(\omega_j)}{P(x_1, x_2, \ldots, x_n)}
\tag{7.1}
\]

where P(ωj | x1, x2, …, xn) is known as the conditional or posterior probability that ωj is the correct class, given the observed data vector (x1, x2, …, xn). P(x1, x2, …, xn | ωj) is the probability density function (p.d.f.) associated with the measured data (x1, x2, …, xn), given that the pixel is a member of class ωj. P(x1, x2, …, xn) is the p.d.f. of the data (x1, x2, …, xn). Assuming class-conditional independence among the sources, one obtains P(x1, x2, …, xn | ωj) = P(x1 | ωj) · P(x2 | ωj) · … · P(xn | ωj), and Equation (7.1) becomes:

\[
P(\omega_j \mid x_1, x_2, \ldots, x_n) = \frac{P(x_1 \mid \omega_j)\,P(x_2 \mid \omega_j)\cdots P(x_n \mid \omega_j)\,P(\omega_j)}{P(x_1, x_2, \ldots, x_n)}
\tag{7.2}
\]
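To make Equation (7.2) concrete, the following Python sketch computes the multisource posterior from per-source class-conditional densities and a prior under the class-conditional independence assumption. The arrays `likelihoods` and `prior` are hypothetical values chosen for illustration, not data from the text.

```python
import numpy as np

# Hypothetical example with n = 2 sources and c = 3 classes.
# likelihoods[i, j] = P(x_i | w_j): the class-conditional density of the
# observation from source i evaluated for class w_j (made-up numbers).
likelihoods = np.array([
    [0.20, 0.05, 0.10],   # source 1
    [0.30, 0.10, 0.02],   # source 2
])
prior = np.array([0.5, 0.3, 0.2])          # P(w_j)

# Class-conditional independence: P(x_1,...,x_n | w_j) = prod_i P(x_i | w_j)
joint_likelihood = likelihoods.prod(axis=0)

# Numerator of Equation (7.2); the denominator P(x_1,...,x_n) is the sum of
# the numerator over all classes, so dividing by it simply normalises the
# posterior probabilities to sum to one.
unnormalised = joint_likelihood * prior
posterior = unnormalised / unnormalised.sum()

print(posterior)            # P(w_j | x_1,...,x_n), j = 1,...,c
print(posterior.argmax())   # index of the most probable class
```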

Again following the law of conditional probabilities, this time applied to each source xi individually,

\[
P(x_i \mid \omega_j) = \frac{P(\omega_j \mid x_i)\,P(x_i)}{P(\omega_j)}, \qquad i = 1, 2, \ldots, n
\tag{7.3}
\]

If Equation (7.3) is substituted into Equation (7.2), one obtains:

\[
P(\omega_j \mid x_1, x_2, \ldots, x_n) = \frac{P(\omega_j \mid x_1)\,P(\omega_j \mid x_2)\cdots P(\omega_j \mid x_n)\,P(x_1)\,P(x_2)\cdots P(x_n)}{P(\omega_j)^{\,n-1}\,P(x_1, x_2, \ldots, x_n)}
\tag{7.4}
\]
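As a quick numerical check on this substitution, the sketch below (using the same kind of hypothetical numbers as before) evaluates Equation (7.4) from the single-source posteriors of Equation (7.3) and confirms that it reproduces the posterior obtained directly from Equation (7.2).

```python
import numpy as np

# Same hypothetical set-up: n = 2 sources, c = 3 classes.
likelihoods = np.array([[0.20, 0.05, 0.10],
                        [0.30, 0.10, 0.02]])    # P(x_i | w_j)
prior = np.array([0.5, 0.3, 0.2])               # P(w_j)
n = likelihoods.shape[0]

# Equation (7.2): posterior from the joint likelihood and the prior.
joint = likelihoods.prod(axis=0) * prior
posterior_72 = joint / joint.sum()

# Equation (7.3) per source: P(w_j | x_i) = P(x_i | w_j) P(w_j) / P(x_i).
evidence_i = likelihoods @ prior                # P(x_i), one value per source
single_post = likelihoods * prior / evidence_i[:, None]

# Equation (7.4): rebuild the multisource posterior from the single-source
# posteriors, the per-source evidences and the joint evidence.
posterior_74 = (single_post.prod(axis=0) * evidence_i.prod()
                / (prior ** (n - 1) * joint.sum()))

print(np.allclose(posterior_72, posterior_74))  # True: (7.4) agrees with (7.2)
```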

If the intersource independence assumption is also made, such that P(x1) · P(x2) · … · P(xn) = P(x1, x2, …, xn), then Equation (7.4) reduces to:

\[
P(\omega_j \mid x_1, x_2, \ldots, x_n) = \frac{P(\omega_j \mid x_1)\,P(\omega_j \mid x_2)\cdots P(\omega_j \mid x_n)}{P(\omega_j)^{\,n-1}}
\]
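In this final form the multisource posterior can be assembled from the outputs of n single-source classifiers and the class priors alone. The sketch below applies this fusion rule to hypothetical per-source posteriors; the renormalisation over classes is a common practical step when the single-source posteriors are only estimates, and is an assumption here rather than part of the text.

```python
import numpy as np

def fuse_posteriors(single_post, prior):
    """Fuse per-source posteriors P(w_j | x_i) into a multisource posterior
    using prod_i P(w_j | x_i) / P(w_j)**(n - 1), then renormalise over the
    classes. Hypothetical helper written for illustration.
    """
    n = single_post.shape[0]
    fused = single_post.prod(axis=0) / prior ** (n - 1)
    return fused / fused.sum()

# Hypothetical per-source posteriors for 2 sources and 3 classes, e.g. the
# per-class outputs of two independently trained single-source classifiers.
single_post = np.array([[0.74, 0.11, 0.15],
                        [0.82, 0.16, 0.02]])
prior = np.array([0.5, 0.3, 0.2])

print(fuse_posteriors(single_post, prior))
```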
