ENTROPY AND KULLBACK-LEIBLER DISTANCE

The Kullback-Leibler distance measures the difference between two statistical distributions (eg, histograms of the pixel values of an image):

DK( h | h') = - ∑i h(i) log( h'(i) / h(i) )

The relative entropy is defined as the negative of the K-L distance, thus

H( h | h') = H(h) + ∑i h(i) log( h'(i) )
where H is the Boltzmann-Gibbs entropy
H( h ) = - ∑i h(i) log( h(i) )
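
As a concrete illustration, here is a minimal Python/NumPy sketch of these quantities (the function names dk, entropy and rel_entropy are ours; the histograms are assumed normalized to sum to 1 with no zero bins, since the formulas above are undefined otherwise):

    import numpy as np

    def dk(h, hp):
        # K-L distance DK(h|h') = - sum_i h(i) log( h'(i)/h(i) )
        h, hp = np.asarray(h, float), np.asarray(hp, float)
        return -np.sum(h * np.log(hp / h))

    def entropy(h):
        # Boltzmann-Gibbs entropy H(h) = - sum_i h(i) log h(i)
        h = np.asarray(h, float)
        return -np.sum(h * np.log(h))

    def rel_entropy(h, hp):
        # relative entropy H(h|h') = H(h) + sum_i h(i) log h'(i) = -DK(h|h')
        h, hp = np.asarray(h, float), np.asarray(hp, float)
        return entropy(h) + np.sum(h * np.log(hp))

For example, with h = np.array([0.25, 0.25, 0.25, 0.25]) and hp = np.array([0.4, 0.3, 0.2, 0.1]), dk(h, hp) is positive and rel_entropy(h, hp) equals -dk(h, hp).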

The divergence is defined as

Div( h , h' ) = DK( h | h' ) + DK( h' | h )

Clearly the divergence is symmetric, Div(h,h') = Div(h',h), and non-negative, Div(h,h') >= 0. It is also additive: for distributions that factor into products of independent components, the divergence of the products is the sum of the divergences of the components.
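
Continuing the sketch above, the divergence and its three properties can be checked numerically (div is our name; the additivity check builds product distributions of independent components with np.outer):

    def div(h, hp):
        # Div(h,h') = DK(h|h') + DK(h'|h): symmetric and non-negative
        return dk(h, hp) + dk(hp, h)

    h, hp = np.array([0.25, 0.25, 0.25, 0.25]), np.array([0.4, 0.3, 0.2, 0.1])
    g, gp = np.array([0.5, 0.5]), np.array([0.7, 0.3])

    print(np.isclose(div(h, hp), div(hp, h)))      # symmetry
    print(div(h, hp) >= 0.0)                       # non-negativity
    # additivity over independent components:
    print(np.isclose(div(np.outer(h, g).ravel(), np.outer(hp, gp).ravel()),
                     div(h, hp) + div(g, gp)))     # True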


Application: fine-coarse diffusion

We consider the image intensities I(x), normalized to unit total intensity, as the statistical distribution.

The fine-coarse transformation of an image can be considered as a diffusion process in which the image intensities diffuse from one pixel to the next. This process is modeled by the heat equation (where t is the scale parameter)

d I(x;t) / dt = D² I(x;t)

Here D² is the 2-dim Laplacian operator.

The fine-coarse transformation is a semidynamical system with a semigroup composition law. It is irreversible: information is lost, and one cannot return from the coarse image to the fine-scale one.
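
A minimal sketch of this diffusion, under the assumption of a 5-point discrete Laplacian with reflecting borders (so no intensity flows across the image boundary) and an explicit Euler step dt, which must stay below 0.25 for stability:

    import numpy as np

    def laplacian(I):
        # 5-point discrete Laplacian; edge padding reflects the border,
        # so the total intensity sum(I) is conserved (no boundary flux)
        Ip = np.pad(I, 1, mode="edge")
        return (Ip[:-2, 1:-1] + Ip[2:, 1:-1]
              + Ip[1:-1, :-2] + Ip[1:-1, 2:] - 4.0 * I)

    def diffuse(I, t, dt=0.2):
        # integrate dI/dt = D^2 I up to scale t with explicit Euler steps
        I = np.asarray(I, float).copy()
        for _ in range(int(round(t / dt))):
            I += dt * laplacian(I)
        return I

With a fixed step dt the scheme also exhibits the semigroup law exactly: diffuse(diffuse(I, 1.0), 2.0) and diffuse(I, 3.0) perform the same sequence of linear steps and return the same image.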

As t grows the image intensity tends toward the uniform (mean) intensity, ie, I'(x) = const. Thus the derivative of the relative entropy is

dH(I | I') / dt = dH(I) / dt

since ∑x I(x;t) = const, so the term ∑x I(x;t) log( I'(x) ) does not depend on t. The entropy production dH/dt measures the loss of information: intuitively, it is a global measure of the rate at which the image loses structure during the fine-coarse transformation.
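
This can be checked numerically, reusing diffuse from the sketch above on a hypothetical random test image normalized to unit total intensity; the entropy indeed grows with the scale t:

    rng = np.random.default_rng(0)
    I = rng.random((64, 64)) + 0.1      # strictly positive test image
    I /= I.sum()                        # normalize to a distribution

    def H(I):
        # Boltzmann-Gibbs entropy of the image intensities
        return -np.sum(I * np.log(I))

    for t in (0.0, 1.0, 4.0, 16.0):
        print(t, H(diffuse(I, t)))      # H increases monotonically in t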

Using the Boltzmann-Gibbs formula and the heat equation, and noting that the term ∑x dI(x;t)/dt drops out because the total intensity is conserved, we can write

dH(I | I') / dt = - ∑x log( I(x;t) ) D² I(x;t)

Applying the Gauss divergence theorem (integration by parts), and assuming that there is no flux across the image boundaries (ie, the total amount of image intensity is conserved), we arrive at

dH(I | I') / dt = ∑x I(x;t) s²(x;t)

where s(x;t) = DI(x;t) / I(x;t) is the gradient of log I(x;t), with s² its squared magnitude. The summand I(x;t) s²(x;t) can be interpreted as the local entropy production rate.
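
The last identity can be checked against a forward difference of H, again reusing diffuse and H from the sketches above. np.gradient supplies central-difference derivatives, so the two sides agree only up to discretization error:

    def entropy_production(I):
        # sum_x I(x) s(x)^2 with s = DI / I, ie the gradient of log I
        gy, gx = np.gradient(I)
        return np.sum((gx**2 + gy**2) / I)

    It = diffuse(I, 1.0)
    dt = 0.2
    dHdt = (H(diffuse(It, dt)) - H(It)) / dt   # numerical dH/dt
    print(dHdt, entropy_production(It))        # approximately equal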


Marco Corvi