
Introduction

Many fields of scientific research benefit from or require the analysis of irregularly sampled data, which is more complicated than the analysis of regularly sampled data. It is often necessary to reconstruct the irregularly sampled signal or to resample it onto a regular grid.

This can be done by so-called direct methods, which compute a Fourier transform of the irregularly sampled data and then an inverse discrete Fourier transform in order to obtain a regularly sampled signal.
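As a rough sketch of such a direct method (this example is not from the tutorial; the sample count, band limit `K`, and grid size are illustrative assumptions), one can evaluate the non-uniform discrete Fourier transform naively and then evaluate the inverse transform on a regular grid:

```python
import numpy as np

rng = np.random.default_rng(0)

# Irregular sample positions in [0, 1) and signal values (a single sine).
t = np.sort(rng.uniform(0.0, 1.0, 256))
f = np.sin(2 * np.pi * 3 * t)

# Forward non-uniform DFT: F[k] = (1/N) * sum_n f(t_n) exp(-2*pi*i*k*t_n).
K = 8                                   # assumed band limit (illustrative)
k = np.arange(-K, K + 1)
F = (np.exp(-2j * np.pi * np.outer(k, t)) @ f) / t.size

# Inverse DFT evaluated on a regular grid of 32 points.
u = np.linspace(0.0, 1.0, 32, endpoint=False)
f_reg = np.real(np.exp(2j * np.pi * np.outer(u, k)) @ F)
```

This brute-force evaluation is only an approximation (the forward sum is a Monte Carlo estimate of the Fourier coefficients), and at O(NK) cost it does not scale; it is meant only to show the shape of the direct approach.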

Then there are non-direct methods, which interpolate the irregularly sampled signal in order to obtain a regularly sampled one. The missing values of the signal are estimated by interpolation, after which the signal can be processed with conventional mathematical tools. Interpolation is usually performed by convolution, which can be made more effective by a normalization operation that takes the possibility of missing samples into account.
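A minimal sketch of this normalized interpolation on a regular grid with missing samples (the signal, kernel width, and gap positions are illustrative assumptions): the convolution of the data is divided by the convolution of a mask marking where samples exist, so the output at each point is a weighted average over the samples actually present.

```python
import numpy as np

# Regularly spaced grid with missing samples: c is 1 where a sample
# exists and 0 where it is missing.
f = np.sin(np.linspace(0, 2 * np.pi, 50))
c = np.ones_like(f)
c[[5, 6, 20, 33, 34, 35]] = 0.0         # knock out some samples
f = f * c                               # missing values carry no data

# Gaussian smoothing kernel (the "applicability" function).
x = np.arange(-5, 6)
g = np.exp(-0.5 * (x / 2.0) ** 2)

# Normalized convolution: conv(c * f, g) / conv(c, g).
num = np.convolve(c * f, g, mode="same")
den = np.convolve(c, g, mode="same")
f_filled = num / np.maximum(den, 1e-12)  # guard against empty windows
```

Note that the denominator also compensates for the zero padding at the signal borders, since the mask is padded in exactly the same way as the data.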

In Knutsson and Westin [1], image analysis on irregularly sampled image data is treated within a theory of signal and certainty: both the data and the operator applied to the data are separated into a signal part and a certainty part. Missing samples in irregularly sampled series are handled by setting their certainty to zero. In the case of uncertain data, an estimate of certainty accompanies the data and can be used in a probabilistic framework. The theory they developed from these ideas is called Normalized Convolution.
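The same mechanism extends from missing data (certainty zero) to uncertain data, where the certainty takes continuous values. The following sketch (my own illustration, not from [1]; the inverse-variance choice of certainty and all signal parameters are assumptions) applies certainty-weighted smoothing to measurements whose noise level varies from sample to sample:

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy measurements with a per-sample noise level; low-certainty
# samples are down-weighted rather than discarded outright.
x = np.linspace(0, 2 * np.pi, 100)
clean = np.cos(x)
noise_sigma = 0.05 + 0.95 * rng.uniform(size=x.size)
f = clean + noise_sigma * rng.normal(size=x.size)

# One plausible certainty: inverse noise variance, scaled to [0, 1].
c = 1.0 / noise_sigma**2
c = c / c.max()

# Gaussian applicability.
t = np.arange(-7, 8)
g = np.exp(-0.5 * (t / 2.5) ** 2)

# Normalized averaging: each output is a certainty-weighted local mean.
est = np.convolve(c * f, g, mode="same") / np.convolve(c, g, mode="same")
```

With a binary certainty this reduces exactly to the missing-sample interpolation above, which is why the two cases fit in one framework.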



Bob Fisher
Sun Mar 9 21:02:14 GMT 2003