Labwork 8

Image registration is a commonly applied procedure for transforming different sets of image data into a common coordinate system. Such data may result from different imaging systems (MRI, computed tomography, photographs), acquisition times, or viewpoints. Registration is used in a wide range of applications to be able to compare or integrate the different data. In medical imaging a frequently occurring issue is that the image data have varying contrast, either because the images were acquired with different modalities (e.g. CT versus MRI) or because image contrast does not reproduce well between acquisitions (as in MRI). To cope with this, images are often registered by maximizing the so-called mutual information between them. Mutual information is a metric based on the concept of Shannon’s entropy.

Problem 1: Shannon’s entropy

The concept of information entropy was introduced by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication”. Essentially, this entropy measure expresses the degree of uncertainty in a signal. In other words, the wider the range of signal values, the higher the entropy of the signal; conversely, a constant signal has entropy 0.

Shannon’s entropy (H(.)) of a 2D image f:\mathbb{R}^2\rightarrow\mathbb{R} is expressed as:

H(f) = - \sum_i p(f(x,y) = i)\log(p(f(x,y) = i))

in which p(.) is the probability that a particular value (f(x,y)=i) occurs. The entropy of an image is usually measured based on the normalized histogram. This is constructed by simply dividing the histogram entries by the total number of pixels, so that it can be interpreted as a probability distribution. As an example, consider the 1D signal:

f(x) = [ 0\ 0\ 0\ 3\ 4\ 2\ 5\ 3\ 0\ 0\ 0]

This yields the normalized histogram:

\begin{array}{lcccccccc} & 0 & 1 & 2 & 3 & 4 & 5 & 6 & 7 \\ p(f(x)) = & [6/11 & 0/11 & 1/11 & 2/11 & 1/11 & 1/11 & 0/11 & 0/11] \end{array}

From the histogram the entropy of the signal is calculated (taking the logarithm in base 10, as in all worked values in this labwork) as:

H(f) = -1\left(\frac{6}{11}\log\left(\frac{6}{11}\right)\right) - 3\left(\frac{1}{11}\log\left(\frac{1}{11}\right)\right) - 1\left(\frac{2}{11}\log\left(\frac{2}{11}\right)\right) = 0.56
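
To make this concrete, a minimal Python sketch of the calculation could look as follows (this is an illustration, not part of the labwork material; it assumes NumPy and uses the base-10 logarithm so that the result matches the value above):

    import numpy as np

    def shannon_entropy(signal, n_bins=256, value_range=(0, 256)):
        # Normalized histogram, interpreted as a probability distribution
        counts, _ = np.histogram(signal, bins=n_bins, range=value_range)
        p = counts / counts.sum()
        p = p[p > 0]                      # drop empty bins (0*log(0) is taken as 0)
        return -np.sum(p * np.log10(p))   # base-10 logarithm, as in the worked example

    f = np.array([0, 0, 0, 3, 4, 2, 5, 3, 0, 0, 0])
    print(shannon_entropy(f, n_bins=8, value_range=(0, 8)))   # approximately 0.56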

ToDo 1

  • Create a function to calculate Shannon’s entropy.
  • Create an image in which one half of the pixel values is 0 and the other half is 255.
  • Use your function to calculate the entropy of this image. Is it what you would expect?
  • Use the function also to calculate the entropy of image body1.
  • Apply a smoothing filter to the image (e.g. Gaussian or Kuwahara) and recalculate the entropy. Can you explain the difference? (A sketch of these steps is given after this list.)
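
A possible sketch of these steps, reusing the shannon_entropy function from the example above (the file name body1.png and the use of SciPy’s gaussian_filter are assumptions; adapt them to the files and the filter you actually use):

    import numpy as np
    import imageio.v3 as iio
    from scipy.ndimage import gaussian_filter

    # Test image: left half 0, right half 255 (two equally likely values)
    half = np.zeros((256, 256), dtype=np.uint8)
    half[:, 128:] = 255
    print(shannon_entropy(half))

    # Entropy of body1 before and after Gaussian smoothing
    body1 = iio.imread("body1.png")   # assumed file name and format
    print(shannon_entropy(body1))
    print(shannon_entropy(gaussian_filter(body1, sigma=2)))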

Problem 2: Mutual Information

The Mutual Information (MI) of two signals essentially measures how well the second signal can be predicted from the first, on a pixel-by-pixel level. Intuitively speaking, it reflects the information that the two signals share per pixel. In other words, MI quantifies how much knowing one signal reduces the uncertainty about the other. For example, if the values in the two signals are completely independent, then knowing one signal does not give any information about the other; as such, their mutual information is zero. At the other extreme, if one signal is a deterministic function of the other (so that the one is uniquely determined by the other), then the mutual information is very high.

The mutual information of two 2D images is defined as:

I(f,g) = H(f) + H(g) - H(f,g)

Here H(f) and H(g) are the Shannon entropies of the signals (as above) and H(f,g) is the joint entropy, given by:

H(f,g) = -\sum_{i,j}p(f(x,y) = i,g(x,y) = j) \cdot \log(p(f(x,y) = i,g(x,y) = j))

in which p(f(x,y) = i, g(x,y) = j) is the joint probability distribution, represented by the normalized joint histogram. This histogram is determined by counting the number of times the intensities i and j occur simultaneously (at the same pixel location) in the two images and dividing these counts by the total number of pixels.

As a second example, consider the two (1D) signals:

\begin{array}{lccccccccccc} f(x) = & [0 & 0 & 0 & 3 & 4 & 2 & 5 & 3 & 0 & 0 & 0] \\ g(x) = & [0 & 4 & 3 & 2 & 5 & 4 & 0 & 0 & 0 & 0 & 0] \end{array}

In these signals one combination of intensities, (0,0), occurs four times, so that the associated probability is p(0,0) = 4/11; seven other combinations occur once, so that e.g. p(0,4) = 1/11. The joint entropy calculated from the normalized joint histogram of the given signals therefore equals:

H(f,g) = -\sum_{i,j}p(i,j)\log(p(i,j)) = -\left(\frac{4}{11}\log\left(\frac{4}{11}\right)\right) - 7\left(\frac{1}{11}\log\left(\frac{1}{11}\right)\right) = 0.82
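
As before, a short Python sketch of the joint entropy and the mutual information (an illustration only; it assumes NumPy, base-10 logarithms, and the shannon_entropy function sketched in Problem 1):

    import numpy as np

    def joint_entropy(f, g, n_bins=256, value_range=(0, 256)):
        # Normalized joint histogram as the joint probability distribution
        counts, _, _ = np.histogram2d(f.ravel(), g.ravel(), bins=n_bins,
                                      range=(value_range, value_range))
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log10(p))

    def mutual_information(f, g, n_bins=256, value_range=(0, 256)):
        return (shannon_entropy(f, n_bins, value_range)
                + shannon_entropy(g, n_bins, value_range)
                - joint_entropy(f, g, n_bins, value_range))

    f = np.array([0, 0, 0, 3, 4, 2, 5, 3, 0, 0, 0])
    g = np.array([0, 4, 3, 2, 5, 4, 0, 0, 0, 0, 0])
    print(joint_entropy(f, g, n_bins=8, value_range=(0, 8)))   # approximately 0.82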

ToDo 2

  • Create a function for calculating the joint entropy and subsequently a function for calculating the Mutual Information.
  • Create an image f in which there are four pixel values: 0, 63, 127 and 255, each with an equal number of pixels. Create a second image g by mapping the pixels from f as follows: m(0)=127; m(63)=255; m(127)=63; m(255)=127.
  • Use the function that you created to calculate the Mutual Information of these images. Is it what you would expect?
  • Create an image h with random noise; rotate this image by 90 degrees; now use your function again to calculate the mutual information of h and its rotated version. Is it once more what you would expect? (One possible setup is sketched after this list.)
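
One possible setup for these experiments, reusing the functions sketched above (the 256x256 image size and the noise parameters are arbitrary choices):

    import numpy as np

    # Image f with four equally frequent values, and g obtained via the given mapping
    f = np.repeat(np.array([0, 63, 127, 255], dtype=np.uint8), 64 * 256).reshape(256, 256)
    lut = np.zeros(256, dtype=np.uint8)
    lut[0], lut[63], lut[127], lut[255] = 127, 255, 63, 127
    g = lut[f]
    print(mutual_information(f, g))

    # Random-noise image h and its 90-degree rotation
    h = np.random.randint(0, 256, size=(256, 256), dtype=np.uint8)
    print(mutual_information(h, np.rot90(h)))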

ToDo 3

  • Read the images body1 and body5 and determine the difference image. (To quantify the difference you might determine the sum of squared differences of the two images.)
  • Also, read image body2 and compute the difference with body1.
  • Determine the mutual information of images body1 and body5 as well as of the images body1 and body2. Is it what you would expect?
  • Apply a smoothing filter to all images (e.g. Gaussian or Kuwahara) and recalculate the mutual information. Can you explain the difference? (A sketch of these steps is given after this list.)
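
A sketch of these steps, again reusing the functions above (the file names body1.png, body2.png and body5.png and the choice of a Gaussian filter are assumptions; adapt them as needed):

    import numpy as np
    import imageio.v3 as iio
    from scipy.ndimage import gaussian_filter

    body1 = iio.imread("body1.png").astype(float)   # assumed file names and format
    body2 = iio.imread("body2.png").astype(float)
    body5 = iio.imread("body5.png").astype(float)

    # Difference images, quantified by the sum of squared differences
    print(np.sum((body1 - body5) ** 2))
    print(np.sum((body1 - body2) ** 2))

    # Mutual information before and after Gaussian smoothing
    print(mutual_information(body1, body5), mutual_information(body1, body2))
    s1, s2, s5 = (gaussian_filter(im, sigma=2) for im in (body1, body2, body5))
    print(mutual_information(s1, s5), mutual_information(s1, s2))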

Files

You need to download the images body1, body2 and body5 for this labwork (no script is provided anymore for this last labwork).


Last update: 2023-04-04