
Auditory Coding of Human Movement Kinematics



References: 39

Publisher: Brill
Copyright: © Koninklijke Brill NV, Leiden, The Netherlands
Subject: Articles
ISSN: 2213-4794
eISSN: 2213-4808
DOI: 10.1163/22134808-00002435

Abstract

Although visual perception is dominant in motor perception, control and learning, auditory information can enhance and modulate perceptual as well as motor processes in a multifaceted manner. During the last decades, new methods of auditory augmentation have been developed, with movement sonification as one of the most recent approaches, extending auditory movement information to usually mute phases of movement. Despite general evidence on the effectiveness of movement sonification in different fields of applied research, there is almost no empirical evidence on how sonification of gross motor human movement should be configured to achieve information-rich sound sequences. This lack of evidence concerns (a) the selection of suitable movement features, (b) effective kinetic–acoustical mapping patterns, and (c) the number of regarded dimensions of sonification. In this study we explore the informational content of artificial acoustical kinematics, in terms of a kinematic movement sonification, using an intermodal discrimination paradigm. In a repeated-measures design we analysed discrimination rates of six everyday upper limb actions to evaluate the effectiveness of seven different kinds of kinematic–acoustical mappings as well as short-term learning effects. The kinematics of the upper limb actions were calculated based on inertial motion sensor data and transformed into seven different sonifications. Sound sequences were randomly presented to participants, and discrimination rates as well as confidence of choice were analysed. The data indicate an instantaneous comprehensibility of the artificial movement acoustics as well as short-term learning effects. No differences between the different dimensional encodings became evident, indicating a high efficiency of intermodal pattern discrimination for the acoustically coded velocity distribution of the actions. Taken together, movement information related to continuous kinematic parameters can be transformed into the auditory domain. Additionally, pattern-based action discrimination is evidently not restricted to the visual modality. Artificial acoustical kinematics might be used to supplement and/or substitute visual motion perception in sports and motor rehabilitation.
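The abstract does not specify how the kinematic parameters were mapped to sound, so the following is only a minimal sketch of one plausible kinematic–acoustical mapping, not the authors' implementation: the speed profile of a recorded movement (derived here from a position trajectory rather than raw inertial sensor data) drives the pitch of a continuous tone, so that faster movement phases sound higher. The function name, sampling rates, and pitch range are illustrative assumptions.

```python
# Illustrative kinematic-acoustical mapping: movement speed -> tone pitch.
# Not the mapping used in the study; rates and pitch range are assumptions.
import numpy as np

def sonify_speed(position, motion_rate=100, audio_rate=44100,
                 f_min=220.0, f_max=880.0):
    """Turn a (T, 3) position trajectory into a mono audio signal whose
    instantaneous pitch follows movement speed (faster -> higher)."""
    # Speed profile from the numerical derivative of the 3-D position.
    velocity = np.gradient(position, 1.0 / motion_rate, axis=0)
    speed = np.linalg.norm(velocity, axis=1)

    # Resample the speed profile to audio rate and normalise to [0, 1].
    t_motion = np.arange(len(speed)) / motion_rate
    t_audio = np.arange(int(t_motion[-1] * audio_rate)) / audio_rate
    speed_a = np.interp(t_audio, t_motion, speed)
    speed_a = (speed_a - speed_a.min()) / (np.ptp(speed_a) + 1e-12)

    # Map normalised speed to frequency and synthesise by phase accumulation.
    freq = f_min + speed_a * (f_max - f_min)
    phase = 2 * np.pi * np.cumsum(freq) / audio_rate
    return 0.5 * np.sin(phase)

# Example with synthetic data: a smooth reach-like trajectory along one axis.
t = np.linspace(0, 2, 200)
pos = np.stack([np.sin(np.pi * t / 2), np.zeros_like(t), np.zeros_like(t)], axis=1)
audio = sonify_speed(pos)   # array of audio samples at 44.1 kHz
```

A single-feature mapping like this corresponds to a one-dimensional sonification; the study compares such encodings against mappings that use several kinematic dimensions at once.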

Journal: Multisensory Research (continuation of Seeing & Perceiving from 2013), Brill

Published: Jan 1, 2013

Keywords: Acoustical kinematics; auditory movement information; kinematic–acoustical mapping; motor learning; movement sonification; multisensory integration; perceptual learning
