- ...sense).
- A
random process is second-order stationary if the expected
value of any quadratic function of the process random variables
is invariant under shifting. This guarantees that the autocorrelation
is well-defined.
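As a quick empirical sketch (my illustration, not the author's): second-order stationarity implies that an autocorrelation estimate at a given lag should not depend on where along the process it is measured. A hypothetical MA(1) process makes this easy to check:

```python
import numpy as np

# For a second-order stationary process, E[x_t x_{t+k}] depends only on
# the lag k, not on t.  Hypothetical example: an MA(1) process built
# from white noise, whose true lag-1 autocorrelation is 0.5.
rng = np.random.default_rng(0)
e = rng.standard_normal(2_000_000)
x = e[1:] + 0.5 * e[:-1]          # stationary MA(1) process

def autocorr(x, t0, k, n=500_000):
    """Estimate E[x_t x_{t+k}] by averaging over a window starting at t0."""
    return np.mean(x[t0:t0 + n] * x[t0 + k:t0 + k + n])

# Estimates at lag k=1 agree regardless of where measurement starts:
a = autocorr(x, t0=0, k=1)
b = autocorr(x, t0=1_000_000, k=1)
print(a, b)   # both near the true value 0.5
```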
- ... .
- This is because
each iteration requires operations, and at least O(N) iterations
are required for information to traverse the length of the image.
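A one-dimensional sketch (my own illustration, not the author's code) of why at least O(N) iterations are needed: a filter with a 3-sample support moves information at most one sample per pass, so an impulse at one end needs about N passes to influence the other end.

```python
import numpy as np

# With a local 3-tap update, the support of an impulse grows by one
# sample per pass, so information crosses a length-N signal in O(N)
# iterations -- a hypothetical stand-in for any 3x3-neighbourhood filter.
N = 64
x = np.zeros(N)
x[0] = 1.0
iters = 0
while x[-1] == 0.0:
    x = np.convolve(x, np.ones(3) / 3.0, mode='same')  # one local pass
    iters += 1
print(iters)  # N - 1 = 63 passes before the impulse reaches the far end
```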
- ... .
- Blurring of course relies on
pixels which lie outside a local window. However, this does not pose
a problem: the dependence on other pixels can be integrated out. If
the blurring depends on a set of pixels which is a
superset of , it is simple to express
in terms of and .
Naturally, any information useful in restoring the central pixel that
originates from pixels outside the 3x3 window is lost.
- ...distance
- Measured
using expected squared difference
- ...equations.
- Note that although it is possible to
create polynomial bases which are orthogonal under the uniform
measure (e.g. Legendre polynomials), creating such bases which
are orthogonal under an unknown density function
is impossible.
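The Legendre claim is easy to verify numerically; a small check (my example, not the author's) using NumPy's numpy.polynomial.legendre module and Gauss-Legendre quadrature, which is exact for these polynomial products:

```python
import numpy as np
from numpy.polynomial import legendre

# Verify orthogonality of Legendre polynomials under the uniform
# measure on [-1, 1].  A 10-point Gauss-Legendre rule integrates
# polynomials up to degree 19 exactly.
nodes, weights = legendre.leggauss(10)

def inner(m, n):
    """Quadrature value of the integral of P_m * P_n over [-1, 1]."""
    Pm = legendre.Legendre.basis(m)(nodes)
    Pn = legendre.Legendre.basis(n)(nodes)
    return np.sum(weights * Pm * Pn)

print(inner(2, 3))  # ~0: distinct degrees are orthogonal
print(inner(3, 3))  # 2/(2*3+1) = 2/7: the standard normalisation
```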
- ...samples.
- A typical image contains
about 0.25 million pixels. It does not take many training images
to get millions of training samples.
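The arithmetic behind this, assuming a hypothetical 512x512 image and a small training set of 8 images (both sizes are my assumptions, not the text's):

```python
# Each pixel (with its neighbourhood) is one training sample, so a
# handful of images already yields millions of samples.
pixels_per_image = 512 * 512          # 262144: about 0.25 million pixels
images = 8                            # a hypothetical small training set
samples = pixels_per_image * images
print(samples)                        # 2097152: over two million samples
```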
- ...price.
- There is a slight decrease in
filtering time as the number of coefficients increases, due to
cache effects.
- ...term.
-
Approximating the number of grid points using the volume of
the hypersphere works well for a small number of dimensions.
However, at a certain critical dimension, hyperspheres stop
growing in volume and start shrinking. This is due to
the term in the denominator of the volume formula. However, the number
of grid points does not shrink, but behaves asymptotically
as , where and L is
the extent of the grid in each dimension. This is still
much better than the full hypercube grid, in which the
number of grid points behaves asymptotically as
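The critical-dimension behaviour is easy to see numerically from the unit-ball volume formula V_d = pi^(d/2) / Gamma(d/2 + 1) (a sketch; taking unit radius is my choice):

```python
from math import pi, gamma

# Volume of a d-dimensional ball of radius r.  The Gamma term in the
# denominator eventually dominates the pi**(d/2) growth, so for unit
# radius the volume peaks at d = 5 and then shrinks toward zero.
def ball_volume(d, r=1.0):
    return pi ** (d / 2) / gamma(d / 2 + 1) * r ** d

critical = max(range(1, 21), key=lambda d: ball_volume(d))
print(critical)          # 5: the critical dimension for unit radius
print(ball_volume(20))   # ~0.026, versus 2**20 for the enclosing hypercube
```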
- ...coefficients.
-
Without the sparse representation and symmetry reductions, this filter
would require 7355827511386641 (about 7x10^15) coefficients.
- ...corners.
- The
keen reader will note that the division of the square into two triangles
(Figure 2.11) can be done in two ways; rather than drawing the
diagonal line from to , it could be drawn from to
. This type of interpolation introduces an anisotropy: the
basis functions have a definite orientation to them. By sacrificing
isotropy, substantial computational gains are made.
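The anisotropy can be seen directly: the two triangulations give different piecewise-linear interpolants. A sketch with hypothetical corner values f00, f10, f11, f01 at (0,0), (1,0), (1,1), (0,1):

```python
# Piecewise-linear interpolation on the unit square, split by each of
# the two possible diagonals.
def interp_diag_00_11(x, y, f00, f10, f11, f01):
    """Diagonal from (0,0) to (1,1)."""
    if y <= x:   # triangle (0,0)-(1,0)-(1,1)
        return f00 + (f10 - f00) * x + (f11 - f10) * y
    return f00 + (f11 - f01) * x + (f01 - f00) * y   # triangle (0,0)-(1,1)-(0,1)

def interp_diag_10_01(x, y, f00, f10, f11, f01):
    """Diagonal from (1,0) to (0,1)."""
    if x + y <= 1:  # triangle (0,0)-(1,0)-(0,1)
        return f00 + (f10 - f00) * x + (f01 - f00) * y
    return f10 + f01 - f11 + (f11 - f01) * x + (f11 - f10) * y

# With corner values 0,1,0,1 the centre interpolates to 0 or 1 depending
# on which diagonal is drawn -- the orientation dependence noted above.
print(interp_diag_00_11(0.5, 0.5, 0, 1, 0, 1))  # 0.0
print(interp_diag_10_01(0.5, 0.5, 0, 1, 0, 1))  # 1.0
```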
- ...satisfy.
- These properties are:
- Associativity: (ab)c = a(bc)
- Identity: There is an identity element e such that ae = ea = a
for all a.
- Inverses: For every element a, there is an inverse a^-1 such
that aa^-1 = a^-1a = e.
- Closure: For every a and b in the group, ab is also in the group.
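As a concrete check (my example), the four axioms can be verified exhaustively for a small group such as Z_4 under addition mod 4:

```python
from itertools import product

# Exhaustive check that Z_4 = {0,1,2,3} under addition mod 4 satisfies
# the four group axioms listed above.
G = range(4)
op = lambda a, b: (a + b) % 4

closure = all(op(a, b) in G for a, b in product(G, G))
assoc = all(op(op(a, b), c) == op(a, op(b, c)) for a, b, c in product(G, G, G))
e = 0
identity = all(op(a, e) == a and op(e, a) == a for a in G)
inverses = all(any(op(a, b) == e and op(b, a) == e for b in G) for a in G)
print(closure, assoc, identity, inverses)  # True True True True
```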
- ...filters,
- Under a mild assumption that holds
for almost all applications.
- ...filters
- For
noise with symmetric distributions only
- ...updates.
- A rank-1 update of a matrix A
has the form A + uv^T, where u and v are column vectors.
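A small sketch (my example; the specific matrix and vectors are hypothetical) of a rank-1 update together with the Sherman-Morrison identity, which is what makes such updates cheap, O(n^2) rather than O(n^3), when the inverse is maintained incrementally:

```python
import numpy as np

# Rank-1 update: A_new = A + u v^T.  Sherman-Morrison gives the updated
# inverse directly from the old one:
#   (A + u v^T)^-1 = A^-1 - (A^-1 u)(v^T A^-1) / (1 + v^T A^-1 u)
A = 5.0 * np.eye(4)                            # a simple invertible matrix
u = np.array([1.0, 2.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 1.0, 1.0])

A_new = A + np.outer(u, v)                     # the rank-1 update

Ainv = np.linalg.inv(A)
denom = 1.0 + v @ Ainv @ u
Ainv_new = Ainv - np.outer(Ainv @ u, v @ Ainv) / denom
print(np.allclose(Ainv_new, np.linalg.inv(A_new)))  # True
```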
- ...costs.
- Note: the system of equations is
typically large enough that storing it as a dense matrix is
impractical. Some of the filters described used 16000 grid points, which
would require about 2 GB of RAM were a dense matrix representation
used. A sparse representation of the same matrix fits comfortably
into 64 MB of RAM.
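The figures can be reproduced with back-of-envelope arithmetic (the 25 nonzeros per row is my assumed sparsity for grid-neighbour coupling, not a number from the text):

```python
# Dense 16000 x 16000 matrix of 8-byte doubles versus a CSR-style sparse
# representation storing only the nonzeros.
n = 16_000
dense_bytes = n * n * 8                   # 2.048e9 bytes
print(dense_bytes / 2**30)                # ~1.9 GiB: the "about 2 GB"

nnz = n * 25                              # assumed ~25 nonzeros per row
sparse_bytes = nnz * 12 + (n + 1) * 4     # 8-byte values + 4-byte column
                                          # indices, plus row pointers
print(sparse_bytes / 2**20)               # ~4.6 MiB: well under 64 MB
```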
- ...unpredictable.
- This is what
happens with polynomial filters and outliers: polynomial approximations
can have wild oscillations outside the region of training samples, and
run off to plus or minus infinity.
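The run-off is easy to demonstrate (my example; the sin(3x) target, noise level, and degree 9 are arbitrary choices): a polynomial fit behaves sensibly inside the training range but explodes just outside it.

```python
import numpy as np

# Fit a 9th-degree polynomial to noisy samples on [-1, 1]; inside the
# range the fit tracks the target, but outside it the high-order terms
# dominate and the value runs off toward plus or minus infinity.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 50)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(50)

coeffs = np.polyfit(x, y, deg=9)
inside = np.polyval(coeffs, 0.5)      # close to sin(1.5) ~ 1.0
outside = np.polyval(coeffs, 3.0)     # far outside the training range
print(inside, outside)
```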
- ...signal.
- This is because a 3x3 window
contains 9 observations; the residual variance is therefore
800/9 ≈ 88.89.
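A quick simulation confirming the figure (my example; I assume Gaussian noise with variance 800, matching the stated residual):

```python
import numpy as np

# Averaging the 9 iid observations in a 3x3 window divides the noise
# variance by 9, so 800 -> 800/9 = 88.89.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, np.sqrt(800.0), size=(200_000, 9))
window_means = noise.mean(axis=1)     # one average per simulated window
print(window_means.var())             # close to 800 / 9 = 88.89
```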