The Kullback-Leibler distance measures the difference between two statistical distributions (e.g., histograms of an image's pixel values),
D(h,h') = ∑x h(x) log( h(x) / h'(x) )
The entropy is defined as the negative of the K-L distance; by the Boltzmann-Gibbs formula,
S(h) = - ∑x h(x) log h(x)
The divergence is defined as the symmetrized sum
Div(h,h') = D(h,h') + D(h',h)
Clearly the divergence is symmetric, Div(h,h') = Div(h',h), and positive, Div(h,h') >= 0; it is also additive over independent distributions, Div(h1 h2, h1' h2') = Div(h1,h1') + Div(h2,h2').
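The definitions above can be sketched numerically. A minimal Python version (the normalization of the histograms and the 0 log 0 = 0 convention are implementation choices, not from the text):

```python
import numpy as np

def kl_distance(h, hp):
    """K-L distance D(h, h') = sum_x h(x) log(h(x) / h'(x))."""
    h = np.asarray(h, dtype=float)
    hp = np.asarray(hp, dtype=float)
    h = h / h.sum()        # normalize histograms to probability distributions
    hp = hp / hp.sum()
    mask = h > 0           # convention: 0 * log 0 = 0
    return float(np.sum(h[mask] * np.log(h[mask] / hp[mask])))

def divergence(h, hp):
    """Symmetrized divergence Div(h, h') = D(h, h') + D(h', h)."""
    return kl_distance(h, hp) + kl_distance(hp, h)

h1, h2 = [1, 2, 3, 4], [4, 3, 2, 1]
assert abs(divergence(h1, h2) - divergence(h2, h1)) < 1e-12  # symmetric
assert divergence(h1, h2) >= 0                                # positive
assert divergence(h1, h1) == 0                                # zero iff equal
```

The asserts check the symmetry and positivity properties stated above on a pair of toy histograms.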
Application: fine-coarse diffusion
We consider the image intensities I(x) as the statistical distribution.
The fine-coarse transformation of an image can be considered as a diffusion process in which the image intensities diffuse from one pixel to the next. This process is modeled by the heat equation (where t is the scale parameter)
dI(x;t)/dt = D² I(x;t)
Here D² is the two-dimensional Laplacian operator.
The fine-coarse transformation is a semidynamical system with a semigroup composition law: diffusing to scale t1 and then to scale t2 is the same as diffusing directly to scale t1 + t2. It is irreversible because information is lost, and one cannot return from the coarse image to the fine-scale one.
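A small numerical sketch of this diffusion (an explicit 5-point finite-difference scheme with reflecting boundaries; the step size dt and grid are discretization choices, not from the text) exhibits both the semigroup composition law and the conservation of total intensity:

```python
import numpy as np

def diffuse(I, steps, dt=0.2):
    """Evolve dI/dt = D^2 I with an explicit 5-point Laplacian.
    Edge-replicating padding gives no-flux (reflecting) boundaries."""
    I = np.asarray(I, dtype=float).copy()
    for _ in range(steps):
        P = np.pad(I, 1, mode="edge")             # no flux across the boundary
        lap = (P[:-2, 1:-1] + P[2:, 1:-1]
               + P[1:-1, :-2] + P[1:-1, 2:] - 4.0 * I)
        I += dt * lap
    return I

rng = np.random.default_rng(0)
I0 = rng.random((16, 16)) + 0.1                   # strictly positive test image

# Semigroup composition law: T_{t1+t2} = T_{t2} o T_{t1}
assert np.allclose(diffuse(I0, 30), diffuse(diffuse(I0, 20), 10))

# Total intensity is conserved by the no-flux boundaries
assert np.isclose(diffuse(I0, 30).sum(), I0.sum())
```

Irreversibility shows up here as well: the forward step is a smoothing average, and no choice of dt lets one run it backwards stably to recover the fine image.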
The image intensity tends toward a uniform (mean) intensity, i.e., I(x;t) → const as t → ∞. Thus, the derivative of the K-L entropy is
dS/dt = - ∑x (dI(x;t)/dt) log I(x;t)
since ∑x I(x) = const., so the term - ∑x dI/dt vanishes. The entropy production is the loss of information. Intuitively, it is a global measure of the rate at which the image loses structure during the fine-coarse transformation.
By using the Boltzmann-Gibbs formula and the heat equation, we can write
dS/dt = - ∑x (D² I(x;t)) log I(x;t)
Applying the Gauss divergence theorem (integration by parts), and assuming that there is no flux across the image boundaries (i.e., the total amount of image intensity is conserved), we arrive at
dS/dt = ∑x I(x;t) s(x;t)²
where s(x;t) = DI(x;t) / I(x;t) can be interpreted as the local entropy production rate.
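This identity can be checked numerically in one dimension (a sketch; the test profile, the discretizations of D and D², and the tolerances are assumptions): the entropy S = -∑x I log I grows under diffusion, and its growth rate matches the integrated local production rate ∑x I s².

```python
import numpy as np

x = np.linspace(-3.0, 3.0, 200)
I = np.exp(-x**2) + 0.05           # strictly positive 1-D "image"
dx = x[1] - x[0]
dt = 0.4 * dx**2                   # explicit stability: dt <= dx^2 / 2

def laplacian(I):
    P = np.pad(I, 1, mode="edge")  # no flux at the boundaries
    return (P[:-2] - 2.0 * I + P[2:]) / dx**2

def entropy(I):
    return -np.sum(I * np.log(I)) * dx

# One heat-equation step and the finite-difference entropy rate
I1 = I + dt * laplacian(I)
dS_dt = (entropy(I1) - entropy(I)) / dt

# Local entropy production rate s = DI / I, integrated over the image
grad = np.gradient(I, dx)
s = grad / I
production = np.sum(I * s**2) * dx

assert dS_dt > 0                                  # entropy grows: information is lost
assert abs(dS_dt - production) / production < 0.05
```

The agreement of the two sides confirms the integration by parts: the no-flux boundary kills the surface term, leaving only the positive bulk term ∑x I s².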
Marco Corvi