Consistent categorization of multimodal integration patterns during human–computer interaction



Publisher: Springer Journals
Copyright: © 2017 by SIP
Subject: Computer Science; User Interfaces and Human Computer Interaction; Signal, Image and Speech Processing; Image Processing and Computer Vision
ISSN: 1783-7677
eISSN: 1783-8738
DOI: 10.1007/s12193-017-0243-1

Abstract

Multimodal interaction represents a more natural style of human-computer interaction, allowing users to apply their developed communicative skills when interacting with computer systems. Designing reliable multimodal systems, however, remains a challenging task. Employing advanced methods that provide optimal performance depends on precise modeling of integration patterns, which allows the system to adapt to the preferences and differences of individual users. While the basic foundations of and empirical evidence for these differences have already been described and confirmed in previous research, the measures and classifications introduced there seem oversimplified and insufficiently precise for designing reliable and robust interaction models. In this paper, we present the results of our study of multimodal integration patterns in systems combining speech and gesture input. Important differences in subjects' interaction behavior and their specific multimodal integration patterns were confirmed and complemented with our own findings. Based on the obtained results, a new integration pattern categorization is defined and analyzed. The introduced categorization provides more reliable and consistent results than the classifications presented in related literature. Moreover, its generality makes it applicable to other input modality combinations.
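To make the notion of an integration pattern concrete, the sketch below illustrates the binary simultaneous/sequential classification commonly used in earlier literature on speech-gesture input, i.e. the kind of categorization the abstract describes as oversimplified, not the categorization introduced in this paper. The InputEvent type, the classify_integrator function, and the 0.5 overlap threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class InputEvent:
    """A timestamped unimodal input (times in seconds)."""
    modality: str   # e.g. "speech" or "gesture"
    start: float
    end: float

def overlaps(a: InputEvent, b: InputEvent) -> bool:
    """True if the two input intervals overlap in time."""
    return a.start < b.end and b.start < a.end

def classify_integrator(speech: list[InputEvent],
                        gesture: list[InputEvent],
                        threshold: float = 0.5) -> str:
    """Label a user "simultaneous" if at least `threshold` of their
    speech-gesture command pairs overlap in time, else "sequential".
    Pairs are formed naively by command index, for illustration only."""
    pairs = list(zip(speech, gesture))
    if not pairs:
        return "unknown"
    overlapping = sum(overlaps(s, g) for s, g in pairs)
    return "simultaneous" if overlapping / len(pairs) >= threshold else "sequential"

# Example: in two of three commands the speech and gesture intervals overlap.
speech = [InputEvent("speech", 0.0, 1.2), InputEvent("speech", 3.0, 4.0),
          InputEvent("speech", 6.5, 7.4)]
gesture = [InputEvent("gesture", 0.4, 1.0), InputEvent("gesture", 3.1, 3.8),
           InputEvent("gesture", 8.0, 8.6)]
print(classify_integrator(speech, gesture))  # -> "simultaneous" (2 of 3 pairs overlap)

A single overlap ratio of this kind discards how far apart sequential inputs are and how overlap varies across command types, which is one reason such coarse classifications can be insufficient for building robust interaction models.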

Journal

Journal on Multimodal User Interfaces (Springer Journals)

Published: Mar 22, 2017
