Real-time camera calibration has been intensively studied in augmented reality. However, for texture-less and texture-repeated scenes, as well as poorly illuminated scenes, obtaining a stable calibration is still an open problem. In this paper, we propose a method for calibrating a live video stream by tracking orthogonal vanishing points. Since vanishing points cannot be observed directly in the image, the tracking is achieved by tracking sets of parallel lines. This is a challenging problem because vanishing points are sensitive to image noise, camera movement, and illumination variation. We tackle these challenges with three optimization procedures and flexible handling of degenerate cases. The three optimizations incorporate several explicit geometric constraints, making the calibration robust to poor illumination and camera movement. A variety of challenging examples demonstrate that the proposed algorithm outperforms state-of-the-art methods on texture-less and texture-repeated scenes.
The Visual Computer – Springer Journals
Published: May 8, 2018
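
The abstract's core step, recovering calibration from orthogonal vanishing points, rests on standard single-view geometry. The sketch below illustrates that geometry only, not the paper's tracking pipeline or optimizations; the function names are hypothetical, and it assumes square pixels, zero skew, and a known principal point.

    import numpy as np

    def focal_from_orthogonal_vps(v1, v2, principal_point):
        # For square pixels, zero skew, and principal point p, the vanishing
        # points of two orthogonal 3D directions satisfy
        # (v1 - p) . (v2 - p) = -f^2, which yields the focal length f.
        p = np.asarray(principal_point, dtype=float)
        d = float(np.dot(np.asarray(v1, dtype=float) - p,
                         np.asarray(v2, dtype=float) - p))
        if d >= 0.0:
            raise ValueError("points inconsistent with orthogonal directions")
        return np.sqrt(-d)

    def rotation_from_vps(vps, f, principal_point):
        # Back-project each vanishing point to its 3D direction in camera
        # coordinates: K^-1 [x, y, 1]^T is proportional to [x-px, y-py, f]^T.
        px, py = principal_point
        cols = [np.array([x - px, y - py, f], dtype=float) for x, y in vps]
        R = np.column_stack([c / np.linalg.norm(c) for c in cols])
        # Noisy detections make R only approximately orthogonal; snap it
        # to the nearest rotation matrix with an SVD.
        U, _, Vt = np.linalg.svd(R)
        return U @ Vt

    # Example with made-up pixel coordinates for three orthogonal
    # vanishing points and a 640x480 image:
    v1, v2, v3 = (900.0, 360.0), (-120.0, 340.0), (400.0, -2000.0)
    f = focal_from_orthogonal_vps(v1, v2, (320.0, 240.0))
    R = rotation_from_vps([v1, v2, v3], f, (320.0, 240.0))

This is why the method tracks families of parallel lines: their intersections in the image define the vanishing points, and three mutually orthogonal vanishing points determine both the focal length and the camera rotation per frame.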