Purpose – This paper seeks to develop a reliable and computationally efficient method for estimating and predicting large-amplitude optical flows by taking their coherence along the time dimension into consideration.

Design/methodology/approach – Although differential techniques for estimating optical flow have long been in wide use owing to the relative simplicity of their mathematical description, their applicability is known to be limited to situations where the optical flow has a relatively small norm. To extend such methods to large-amplitude optical flows, the optical flow is modelled as a composition of its time-delayed version and a complementary optical flow. The former is used to predict the current optical flow and, subsequently, to warp the preceding image of the tracking sequence forward, while the latter accounts for the residual displacements, which are estimated using Kalman filtering under the "small norm" assumption.

Findings – The study shows that taking the temporal coherence of optical flows into consideration yields a considerable improvement in estimation quality when the amplitude of the optical flow is relatively large and, hence, the "small norm" assumption does not apply.

Research limitations/implications – In the present work, the algorithm is formulated under the assumption that the optical flow is affine. This assumption may be restrictive in practice, so an important direction for extending this work is to consider more general classes of optical flows.

Originality/value – The main contribution of the present study is the use of multigrid methods and a projection scheme to relate the state equation to the apparent image motion.
Engineering Computations: International Journal for Computer-Aided Engineering and Software – Emerald Publishing
Published: Jul 1, 2006
Keywords: Flow; Motion; Estimation