Multi-Resolution Methods and Graduated Non-Convexity

Often a function surface can be very un-smooth, having many sharp local minima, which makes it hard to find the overall global minimum (Figure 4). For instance, the response of a correlation mask over a noisy image tends to be noisy: the only way to be sure of locating the best match is to search through every pixel location. However, in some situations it is much easier to locate the minimum of a smoothed version of the function surface, which then gives a good starting point for locating the minimum of the original function. Notice that it would be very inefficient to generate the full function surface and then smooth it, as far too many function evaluations would be required. Instead we need a new function, based on the original, that generates a smoother surface whose major minima lie in similar locations to those of the original. In the correlation example this can be achieved by smoothing (and possibly sub-sampling) both the mask and the target image. Correlating the smoothed mask over the smoothed image tends to give a cleaner response, with minima close to the main minima of the original response. Care must be taken that the desired global minimum has not been smoothed out of existence, which can happen if the important mask features have a high spatial frequency.
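The correlation example can be sketched in a few lines of Python (a hypothetical illustration, not code from the text: it assumes a Gaussian filter for the smoothing and a sum-of-squared-differences response, so the best match is the minimum of the surface):

import numpy as np
from scipy.ndimage import gaussian_filter

def ssd_surface(image, mask):
    # Sum-of-squared-differences response at every valid placement of
    # the mask; the best match is the location of the minimum.
    mh, mw = mask.shape
    rows = image.shape[0] - mh + 1
    cols = image.shape[1] - mw + 1
    out = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            out[r, c] = np.sum((image[r:r + mh, c:c + mw] - mask) ** 2)
    return out

def smoothed_match(image, mask, sigma=2.0):
    # Smooth both the mask and the target image with the same Gaussian,
    # then search the (much cleaner) response surface for its minimum.
    s_image = gaussian_filter(image.astype(float), sigma)
    s_mask = gaussian_filter(mask.astype(float), sigma)
    surface = ssd_surface(s_image, s_mask)
    return np.unravel_index(np.argmin(surface), surface.shape)

The location returned is then a good seed for a local search of the unsmoothed response.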

Multi-Resolution methods work by applying an algorithm to successively smoothed versions of an image (or data set). The result at one level of smoothing is used to seed the algorithm at the next, less smoothed, level; this repeats until the solution is found on the unsmoothed original image [18].
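A minimal coarse-to-fine sketch, again hypothetical (Gaussian smoothing stands in for an image pyramid, and the SSD response plays the role of the algorithm being seeded): an exhaustive search is done only at the most smoothed level, and each finer level searches a small window around the previous result.

import numpy as np
from scipy.ndimage import gaussian_filter

def ssd(image, mask, r, c):
    # Response of the mask at a single location (sum of squared differences).
    mh, mw = mask.shape
    return float(np.sum((image[r:r + mh, c:c + mw] - mask) ** 2))

def multiresolution_match(image, mask, sigmas=(8.0, 4.0, 2.0, 0.0), radius=4):
    # Search exhaustively only at the most smoothed level; at each finer
    # level, search a small window around the seed from the level above.
    image, mask = image.astype(float), mask.astype(float)
    all_r = range(image.shape[0] - mask.shape[0] + 1)
    all_c = range(image.shape[1] - mask.shape[1] + 1)
    best = None
    for sigma in sigmas:                      # most smoothed first
        s_img = gaussian_filter(image, sigma) if sigma > 0 else image
        s_msk = gaussian_filter(mask, sigma) if sigma > 0 else mask
        if best is None:
            rs, cs = all_r, all_c             # full search at coarsest level
        else:
            r0, c0 = best                     # window around previous seed
            rs = range(max(0, r0 - radius), min(all_r.stop, r0 + radius + 1))
            cs = range(max(0, c0 - radius), min(all_c.stop, c0 + radius + 1))
        best = min(((r, c) for r in rs for c in cs),
                   key=lambda rc: ssd(s_img, s_msk, rc[0], rc[1]))
    return best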

  
Figure 4: Smoothing a function can help find the minima.

Blake and Zisserman have used a similar approach in the Graduated Non-Convexity (GNC) algorithm [19]. A function $f$ is said to be convex over a region $R$ if, for any two points $x_1, x_2 \in R$ and any $\lambda \in [0,1]$, the following holds

$f(\lambda x_1 + (1-\lambda) x_2) \le \lambda f(x_1) + (1-\lambda) f(x_2)$

Geometrically, in the $n+1$ dimensional space the line segment joining the points $(x_1, f(x_1))$ and $(x_2, f(x_2))$ never goes below the function surface $f(x)$.

Such a function will have at most one minimum, which can be found by applying a local minimizer from any starting point in $R$.
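For example, $f(x) = x^2$ is convex over the whole real line, since for any $\lambda \in [0,1]$

$\lambda f(x_1) + (1-\lambda) f(x_2) - f(\lambda x_1 + (1-\lambda) x_2) = \lambda (1-\lambda) (x_1 - x_2)^2 \ge 0,$

so the inequality above always holds.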

In the GNC method a non-convex function $F$ (having many local minima) is approximated by a convex function $F^{(1)}$, which has its single minimum near to the global minimum of $F$. If the minima do not exactly coincide, a series of intermediate functions $F^{(p)}$ with $0 \le p \le 1$ is defined, where $F^{(1)}$ is the convex approximation and $F^{(0)} = F$. A minimum of $F^{(p)}$ is used as the starting point to locate the minimum of $F^{(p')}$, $p' < p$, finishing with $F^{(0)}$. (The name of the algorithm comes from the gradual loss of convexity as $p$ decreases.) Details are given in [19].
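In [19] the family $F^{(p)}$ is constructed by modifying a term of a specific energy function; purely as a hypothetical one-dimensional illustration of the tracking idea, one can instead blend the original function linearly with a convex approximation:

import numpy as np
from scipy.optimize import minimize

def F(x):
    # Non-convex objective with many local minima (a made-up example,
    # not the energy used in [19]).
    return 0.1 * x ** 2 + np.sin(3.0 * x)

def F1(x):
    # Convex approximation F^(1): the broad bowl without the ripples.
    return 0.1 * x ** 2

def gnc_minimize(x0=4.0, steps=11):
    # Track the minimum of F^(p) = p*F^(1) + (1 - p)*F as p goes 1 -> 0,
    # warm-starting each local minimization from the previous minimum.
    x = x0
    for p in np.linspace(1.0, 0.0, steps):
        obj = lambda v, p=p: float(p * F1(v[0]) + (1.0 - p) * F(v[0]))
        x = minimize(obj, [x]).x[0]
    return x

print(gnc_minimize())   # lands near the global minimum of F (about x = -0.51)

The warm start keeps each local search in the basin identified at the previous, more convex, stage, which is the essence of the GNC tracking.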


