Fast and Stable Tracking for AR fusing Video and Inertial Sensor Data
Date issued
2006
Publisher
Václav Skala - UNION Agency
Abstract
Accurate acquisition of camera position and orientation is crucial for realistic augmentations of camera images. Computer-vision-based tracking algorithms, which use the camera itself as the sensor, are known to be very accurate but also time-consuming. Integrating inertial sensor data provides a camera pose update at 100 Hz and thus gives stability and robustness against rapid motion and occlusion. Using inertial measurements, we obtain precise real-time augmentation at a reduced camera sample rate, which makes the system usable for mobile AR and see-through applications.
This paper presents a flexible run-time system that benefits from sensor fusion using Kalman filtering for pose estimation. The camera, as the main sensor, is aided by an inertial measurement unit (IMU). The system provides an autonomous initialisation as well as a predictive tracking procedure, and switches between the two after successful (re-)initialisation and after tracking failure, respectively. The computer vision part performs 3D model-based tracking of natural features, using different approaches to achieve both high accuracy and robustness. Results on real and synthetic sequences show how inertial measurements improve tracking.
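A minimal sketch of the kind of Kalman-filter fusion described above, assuming a simplified 1-D constant-velocity state in which IMU acceleration drives the high-rate prediction (100 Hz) and a slower camera position measurement supplies the correction. The state layout, noise values, and rates here are illustrative assumptions, not the authors' implementation, which estimates the full 6-DoF camera pose.

import numpy as np

class PoseFilter:
    """Simplified linear Kalman filter: IMU drives prediction, camera corrects."""
    def __init__(self, dt=0.01):                       # 100 Hz IMU interval
        self.F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity model
        self.B = np.array([[0.5 * dt**2], [dt]])       # acceleration as control input
        self.H = np.array([[1.0, 0.0]])                # camera observes position only
        self.Q = 1e-3 * np.eye(2)                      # process noise (assumed value)
        self.R = np.array([[1e-2]])                    # camera noise (assumed value)
        self.x = np.zeros((2, 1))                      # state: [position, velocity]
        self.P = np.eye(2)                             # state covariance

    def predict(self, accel):
        """Propagate the state with one IMU acceleration sample (high rate)."""
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, cam_pos):
        """Correct the state with a camera position measurement (low rate)."""
        y = np.array([[cam_pos]]) - self.H @ self.x    # innovation
        S = self.H @ self.P @ self.H.T + self.R        # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P

# Ten IMU predictions per camera correction (100 Hz vs. 10 Hz).
kf = PoseFilter()
for step in range(100):
    kf.predict(accel=0.1)
    if step % 10 == 9:
        kf.update(cam_pos=float(kf.x[0, 0]))

Between camera frames the filter keeps predicting from IMU samples alone, which is what lets the system bridge rapid motion or short occlusions before the next vision-based correction arrives.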
Subject(s)
augmented reality, sensor fusion, computer vision
Citation
WSCG '2006: Short Papers Proceedings: The 14th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2006, University of West Bohemia, Plzen, Czech Republic, January 31 - February 2, 2006, pp. 109-116.