
The moment generating function

To describe the distribution of a random variable the characteristic function can be used [10]:

\begin{displaymath}
X(w) = \int_{-\infty}^{\infty} f(x) \exp(jwx)~dx = E[\exp(jwx)]
\end{displaymath} (4)

shown here for the signal density $f(x)$, where $j=\sqrt{-1}$ and $w$ is the spatial frequency. This is essentially the Fourier transform of the signal and has a maximum at the origin $w=0$, as $f(x)\geq0$. Figure 1.1 shows an example of $X(w)$ for a zero mean, unit variance Gaussian density $f(x)$.
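The characteristic function of Equation 1.4 can be checked numerically. The sketch below (assuming NumPy; the grid limits and step are illustrative) integrates $f(x)\exp(jwx)$ for the zero mean, unit variance Gaussian of Figure 1.1, whose characteristic function has the known closed form $\exp(-w^2/2)$:

```python
import numpy as np

# Illustrative grid: wide enough that the Gaussian tails are negligible
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # zero mean, unit variance density

def char_fn(w):
    # X(w) = integral of f(x) exp(jwx) dx, evaluated as a discrete sum
    return np.sum(f * np.exp(1j * w * x)) * dx

X0 = char_fn(0.0)   # the maximum, equal to the total area of f(x)
X1 = char_fn(1.0)   # compare with the closed form exp(-1/2)
```

Since $f(x) \geq 0$, the magnitude of `char_fn(w)` is largest at $w = 0$, where it equals the total area of the density.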

Figure 1.1: Characteristic function of a Gaussian density $f(x)$.

If $f(x)$ (a one-dimensional continuous function) is the density of a positive, real-valued random variable $x$, such that $x \in \mathbb{R}$, then a continuous, real-valued exponential transform can be defined. Replacing $jw$ in Equation 1.4 with the real variable $s$ produces a real-valued integral of the form:
$\displaystyle M^x(s) = \int_{-\infty}^{\infty} f(x) \exp(xs)~dx = E[\exp(xs)]$     (5)

where $E[\cdot]$ is the expectation and $M^x(s)$ exists as a real number. $M^x(s)$ is called the moment generating function, shown here for a one-dimensional distribution. It is used to characterise the distribution of an ergodic signal. Expanding the exponential as a Taylor series produces:
\begin{displaymath}
\exp(xs) = \sum_{n=0}^{\infty} \frac{x^ns^n}{n!}= 1 + xs + \frac{1}{2!}x^2s^2 + ... + R_n(x)
\end{displaymath} (6)

where $R_n(x)$ is the error term. The series will only converge and represent $\exp(xs)$ completely if $R_n(x) \rightarrow 0$. Therefore, if the distribution is finite in length, all values outside this length must be zero (or, in terms of an image, all values outside the sampled image plane must be zero). Assuming this and substituting Equation 1.6 into Equation 1.5 produces:
$\displaystyle M^x(s)$ $\textstyle =$ $\displaystyle \int_{-\infty}^{\infty} f(x)\exp(xs)~dx$  
  $\textstyle =$ $\displaystyle \int_{-\infty}^{\infty} (1 + xs + \frac{1}{2!}x^2s^2 + ...) f(x)~dx$  
  $\textstyle =$ $\displaystyle 1 + sm_1 + \frac{1}{2!}s^2m_2 + ...,$ (7)
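The expansion in Equation 1.7 can be verified numerically. The minimal sketch below (assuming NumPy; the standard Gaussian is chosen only because its moment generating function has the known closed form $\exp(s^2/2)$) computes the moments $m_n$ by direct integration and compares the partial sum $1 + sm_1 + \frac{1}{2!}s^2m_2 + \ldots$ against that closed form:

```python
import numpy as np
from math import exp, factorial

# Illustrative grid for a zero mean, unit variance Gaussian density
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Moments about the origin, m_n = E[x^n], by direct integration
m = [float(np.sum(x**n * f) * dx) for n in range(9)]

# Partial sum of Equation 1.7: 1 + s m_1 + s^2 m_2 / 2! + ...
s = 0.3
series = sum(m[n] * s**n / factorial(n) for n in range(9))
closed = exp(s**2 / 2)   # exact M^x(s) for this particular density
```

For small $s$ the truncated series agrees with the exact moment generating function to high precision, as the remainder term vanishes rapidly.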

where $m_n$ is the $n^{th}$ moment about the origin. Differentiating Equation 1.5 $n$ times with respect to $s$ produces:
\begin{displaymath}
M^x_n(s) = E[x^n\exp(xs)]
\end{displaymath} (8)

If $M^x(s)$ is differentiable at zero, then the $n^{th}$ order moments about the origin are given by:
\begin{displaymath}
M^x_n(0) = E[x^n] = m_n
\end{displaymath} (9)

So the first three derivatives of the moment generating function, and the moments they yield at $s=0$, are:

$\displaystyle M^x_0(s)$ $\textstyle =$ $\displaystyle E[\exp(xs)] ~~;~~ M^x_0(0) = 1$  
$\displaystyle M^x_1(s)$ $\textstyle =$ $\displaystyle E[x\exp(xs)] ~~;~~ M^x_1(0) = E[x] = m_1$ (10)
$\displaystyle M^x_2(s)$ $\textstyle =$ $\displaystyle E[x^2\exp(xs)] ~~;~~ M^x_2(0) = E[x^2] = m_2$  
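Equations 1.9 and 1.10 can be illustrated numerically by differentiating $M^x(s)$ at $s=0$ with central finite differences. The sketch below (assuming NumPy; the Gaussian with mean $2$ and unit variance is an arbitrary illustrative choice, giving $m_1 = 2$ and $m_2 = 5$) evaluates $M^x(s)$ by direct integration:

```python
import numpy as np

# Illustrative density: Gaussian with mean 2 and unit variance,
# so m_1 = 2 and m_2 = variance + mean^2 = 5
x = np.linspace(-10.0, 14.0, 24001)
dx = x[1] - x[0]
f = np.exp(-(x - 2.0)**2 / 2) / np.sqrt(2 * np.pi)

def M(s):
    # Moment generating function M^x(s) = E[exp(xs)] by direct integration
    return float(np.sum(f * np.exp(x * s)) * dx)

# Central finite differences approximate the derivatives at s = 0
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)             # M^x_1(0) = E[x]
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2   # M^x_2(0) = E[x^2]
```

Here `M(0.0)` recovers the total area of the density, `m1` its mean, and `m2 - m1**2` its variance, matching the moments listed in Equation 1.10.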

If the distribution of the signal is a Gaussian, then it is completely described by its first two moments: the mean ($M^x_1(0)$) and the variance ( $M^x_2(0)- \left( M^x_1(0) \right)^2$), while the total area ($M^x_0(0)$) is $1$. If the joint moment $M^{xy}(s)$ for two signals is required (i.e. a two-dimensional image) then it is noted that:
\begin{displaymath}
M^{xy}(s) = E[ \exp((x+y)s)] = E[ \exp(xs)\exp(ys)]
\end{displaymath} (11)

and assuming that $x$ and $y$ are independent, then:
\begin{displaymath}
M^{xy}(s) = E[ \exp(xs)] E[ \exp(ys)] = M^x(s)M^y(s)
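The factorisation in Equation 1.12 can be demonstrated numerically. The sketch below (assuming NumPy; two independent standard Gaussian marginals are an illustrative choice) computes $E[\exp((x+y)s)]$ over the product density $f(x)f(y)$ and compares it with $M^x(s)M^y(s)$:

```python
import numpy as np

# Illustrative grid and marginal density (standard Gaussian for both x and y)
t = np.linspace(-8.0, 8.0, 2001)
dt = t[1] - t[0]
g = np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

s = 0.5
Mx = float(np.sum(g * np.exp(t * s)) * dt)   # M^x(s) by direct integration
My = Mx                                      # y has the same marginal density

# E[exp((x+y)s)] over the product density f(x)f(y) of independent x and y
fxy = np.outer(g, g)
Mxy = float(np.sum(fxy * np.exp((t[:, None] + t[None, :]) * s)) * dt * dt)
```

Because the density of independent variables factorises into $f(x)f(y)$, the double integral separates into the product of the two one-dimensional moment generating functions, as Equation 1.12 states.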
\end{displaymath} (12)

In conclusion, the moments of a distribution can be evaluated by two methods: either by direct integration (Equation 1.1), or by use of the moment generating function (Equation 1.5). In practice, the moment generating function is more widely applied to the problem of calculating moment invariants, while direct integration is used to calculate specific moment values.


Jamie Shutler 2002-08-15