A Multimodal User Interface Component for an Augmented Reality Mobile User Guidance System
Date issued
2006
Authors
Publisher
Václav Skala - UNION Agency
Abstract
In general, user interfaces should be intuitive, self-explanatory, and adaptive to varying user skills. Especially in augmented reality systems with complex interaction possibilities, simple concepts are necessary to guide the user. We therefore extend the traditional WIMP metaphor (Windows, Icons, Menus, Pointer) with the capabilities of multimodal interfaces, in which multiple "human" communication channels serve as input for navigation, orientation, and interaction. In this work we present the adaptation of the user interface of an existing mobile augmented reality system for cultural heritage to multimodal interaction. We describe the considerations that led us to choose speech input and the capture of hand movements by means of an inertial tracker, as well as the implementation aspects of the initial prototype. User evaluation trials will be used to validate our approach.
Subject(s)
augmented reality, mixed reality, graphical user interfaces, multimodal user interfaces, human-computer interaction, software ergonomics, cultural heritage
Citation
WSCG'2006: Posters Proceedings: The 14th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2006, in co-operation with EUROGRAPHICS, University of West Bohemia, Plzen, Czech Republic, January 31 – February 2, 2006, pp. 37-38.