
The importance of priors

A useful analogy may be drawn between lossless compression and image restoration. In lossless compression, compression of some signals is achieved only at the expense of making other signals longer: since there are fewer short binary strings than long ones, no decodable code can shorten every input. One has to design the algorithm so that commonly occurring signals are compressed, and uncommon signals are lengthened. Similarly, in image restoration, improvement in some images is obtained only by worsening others. To ensure that typical images are improved, the image restoration scheme must incorporate prior knowledge of the statistical properties of the target class of images.
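The counting argument behind the compression analogy can be made concrete. The following sketch (an illustration, not part of the thesis) counts the distinct codewords shorter than n bits and shows there are too few of them to represent all n-bit messages:

```python
# Counting argument: a lossless code cannot shorten every input.
# There are 2**n distinct n-bit messages, but only
# 2**1 + 2**2 + ... + 2**(n-1) = 2**n - 2 distinct shorter codewords.
n = 8
num_messages = 2 ** n
num_shorter_codewords = sum(2 ** k for k in range(1, n))  # lengths 1..n-1

print(num_messages)           # 256
print(num_shorter_codewords)  # 254

# A decodable (injective) code must therefore leave at least two of
# the 2**n messages at full length or longer.
assert num_shorter_codewords < num_messages
```

The same pigeonhole reasoning carries over to restoration: a filter that improves typical images must, on balance, degrade atypical ones.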

These priors are crucial to achieving a good restoration result. The success of a filter depends primarily on the accuracy of its priors. Thus order-statistic filters, which make wildly unrealistic assumptions about the signal (namely, that it consists entirely of flat regions), have comparatively poor performance. Wiener filters, which implicitly assume Gaussian priors, generally do somewhat better. Lee's filter assumes a mixture of flat and detail regions, which is more realistic still. But by far the best results are achieved by filters which learn the priors rather than assuming them. Such filters are able to exploit very detailed, accurate knowledge of the signal's statistical properties. Examples include Gauss-Markov Random Fields, Vector Quantization, Neural Networks, and the Grid Filters developed in this thesis.
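The flat-region prior of order-statistic filters can be seen in a minimal example. The sketch below (my own illustration; the `median3` helper is hypothetical, not from the thesis) applies a three-point median filter to two signals, showing that it removes an impulse from a flat region but erases a genuine one-sample detail in exactly the same way:

```python
# A 3-point median filter: the simplest order-statistic filter.
# Its implicit prior is that the signal is piecewise flat, so any
# isolated sample is treated as noise.
def median3(x):
    padded = [x[0]] + list(x) + [x[-1]]  # repeat endpoints at the borders
    return [sorted(padded[i:i + 3])[1] for i in range(len(x))]

flat_with_impulse = [5, 5, 5, 99, 5, 5, 5]  # impulse noise on a flat signal
print(median3(flat_with_impulse))           # [5, 5, 5, 5, 5, 5, 5]: noise removed

fine_detail = [5, 5, 5, 9, 5, 5, 5]         # a genuine one-sample detail
print(median3(fine_detail))                 # [5, 5, 5, 5, 5, 5, 5]: detail erased too
```

Because the prior cannot distinguish an impulse from a real detail, the filter improves images matching its flat-region assumption while degrading those that do not.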



Todd Veldhuizen
Fri Jan 16 15:16:31 EST 1998