Abstract |
---|
While region-based image alignment algorithms that use gradient descent can achieve sub-pixel accuracy when they converge, their convergence depends on the smoothness of the image intensity values. Image smoothness is often enforced through the use of multi-scale approaches in which images are smoothed and downsampled. Yet, these approaches typically use fixed smoothing parameters which may be appropriate for some images but not for others. Even for a particular image, the optimal smoothing parameters may depend on the magnitude of the transformation. When the transformation is large, the image should be smoothed more than when the transformation is small. Further, with gradient-based approaches, the optimal smoothing parameters may change with each iteration as the algorithm proceeds towards convergence. We address convergence issues related to the choice of smoothing parameters by deriving a Gauss-Newton gradient descent algorithm based on distribution fields (DFs) and proposing a method to dynamically select smoothing parameters at each iteration. DF and DF-like representations have previously been used in the context of tracking. In this work we incorporate DFs into a full affine model for region-based alignment and simultaneously search over parameterized sets of geometric and photometric transforms. We use a probabilistic interpretation of DFs to select smoothing parameters at each step in the optimization and show that this results in improved convergence rates. |
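The abstract's core representation, the distribution field, can be illustrated concretely. The sketch below is a minimal, hypothetical construction based on the standard DF idea (explode a grayscale image into per-intensity-bin channels, then smooth spatially and across bins); the function name, bin count, and smoothing parameters are illustrative assumptions, not the paper's actual implementation. The spatial and bin sigmas here are the kind of smoothing parameters the paper proposes to select dynamically at each iteration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def distribution_field(image, num_bins=8, sigma_space=2.0, sigma_bins=1.0):
    """Hypothetical sketch of a distribution field (DF).

    Quantizes a grayscale image (values in [0, 1)) into intensity bins,
    forming a one-hot H x W x K volume, then Gaussian-smooths it spatially
    and across the bin axis. The result assigns each pixel a distribution
    over intensity bins, matching the probabilistic interpretation of DFs.
    """
    # One-hot "explosion": each pixel activates exactly one bin channel.
    bins = np.clip((image * num_bins).astype(int), 0, num_bins - 1)
    df = np.zeros(image.shape + (num_bins,))
    rows, cols = np.indices(image.shape)
    df[rows, cols, bins] = 1.0
    # Smooth within each channel (spatial) and across channels (intensity).
    # sigma_space / sigma_bins stand in for the smoothing parameters that
    # the paper selects adaptively per iteration.
    df = gaussian_filter(df, sigma=(sigma_space, sigma_space, sigma_bins))
    # Renormalize so each pixel's bin distribution sums to one.
    df /= df.sum(axis=2, keepdims=True) + 1e-12
    return df

img = np.random.rand(32, 32)   # stand-in grayscale image in [0, 1)
df = distribution_field(img)
print(df.shape)                # (32, 32, 8)
```

Larger sigmas smooth away fine structure and widen the basin of convergence for gradient descent, which is why the appropriate values depend on the magnitude of the transformation, as the abstract notes.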
Year | DOI | Venue |
---|---|---|
2013 | 10.5244/C.27.17 | Proceedings of the British Machine Vision Conference 2013
Field | DocType | Citations |
---|---|---|
Convergence (routing),Affine transformation,Magnitude (mathematics),Computer vision,Gradient descent,Parameterized complexity,Pattern recognition,Computer science,Smoothing,Artificial intelligence,Probabilistic logic,Smoothness | Conference | 2 |
PageRank | References | Authors |
---|---|---|
0.43 | 16 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Benjamin Mears | 1 | 2 | 0.43 |
Laura Sevilla-Lara | 2 | 141 | 7.06
Erik G. Miller | 3 | 1861 | 126.56 |