An efficient reduction of IMU drift for registration error free augmented reality maintenance application
Date issued
2015
Publisher
Václav Skala - UNION Agency
Abstract
Augmented reality (AR) is a technology that overlays virtual 3D content on the real world to enhance a user's
perception. This virtual AR content must be registered properly, with minimal jitter, drift, or lag, to create a more
immersive experience for the user. The object pose can be determined with different pose estimation techniques using
data from sensors such as cameras and inertial measurement units (IMUs). Camera-based vision algorithms detect
features in a given environment to calculate the relative pose of an object with respect to the camera. However,
these algorithms often take longer to calculate the pose and can only operate at lower rates. On the other
hand, an IMU provides data at a fast rate from which an absolute pose can be determined with fewer calculations.
This pose is usually subject to drift, which leads to registration errors. The IMU drift can be substantially
reduced by fusing periodic pose updates from a vision algorithm. This work investigates various factors that affect
the rendering registration error and finds the trade-off between the vision algorithm pose update rate and the
IMU drift that efficiently reduces this registration error. The experimental evaluation details the impact of IMU drift
at different vision algorithm pose update rates. The results show that careful selection of the vision algorithm
pose update rate not only reduces IMU drift but also reduces the registration error. Furthermore, it reduces the
computation required for processing the vision algorithm.
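A minimal sketch of the kind of fusion the abstract describes, assuming a fast but drifting IMU-integrated pose that is periodically corrected toward a slower, drift-free vision (marker-tracking) pose with a simple complementary blend. The rates, drift magnitude, noise level, and the gain ALPHA are illustrative assumptions, not values from the paper, and the 1-D position stands in for the full 6-DoF pose.

import numpy as np

IMU_RATE_HZ = 200          # assumed: fast IMU integration rate, drifts over time
VISION_RATE_HZ = 10        # assumed: slower vision pose update rate, drift-free
ALPHA = 0.8                # assumed: weight kept on the IMU-integrated estimate

def simulate(duration_s=5.0, drift_per_step=0.002):
    dt = 1.0 / IMU_RATE_HZ
    steps = int(duration_s * IMU_RATE_HZ)
    vision_every = IMU_RATE_HZ // VISION_RATE_HZ

    true_pos = 0.0          # 1-D stand-in for the true object pose
    fused_pos = 0.0
    errors = []

    for k in range(steps):
        true_pos += 0.01 * dt                      # object moves slowly
        fused_pos += 0.01 * dt + drift_per_step    # IMU integration accumulates drift

        if k % vision_every == 0:
            # noisy but unbiased vision measurement of the pose
            vision_pos = true_pos + np.random.normal(0.0, 0.001)
            # complementary-style correction pulls the fused pose back toward vision
            fused_pos = ALPHA * fused_pos + (1.0 - ALPHA) * vision_pos

        errors.append(abs(fused_pos - true_pos))   # registration error at this step

    return float(np.mean(errors))

if __name__ == "__main__":
    print("mean registration error:", simulate())

Raising VISION_RATE_HZ in this sketch lowers the mean error but stands in for more vision processing per second, which is the trade-off the paper investigates.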
Subject(s)
augmented reality, pose estimation, inertial measurement unit, marker tracking, sensor fusion, registration error
Citation
WSCG 2015: full papers proceedings: 23rd International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision in co-operation with EUROGRAPHICS Association, p. 211-218.