If motion sounds: Movement sonification based on inertial sensor data

Heike Brock, Gerd Schmitz, Jan Baumann, and Alfred O. Effenberg
In Proceedings of the 9th Conference of the International Sports Engineering Association (ISEA), Elsevier, Jan. 2012
 

Abstract

In recent years, movement sonification has proven to be an appropriate support for motor perception and motor control that can display physical motion in a very rich and direct way. But how should movement sonification be configured to support motor learning? The appropriate selection of movement parameters and their transformation into characteristic motion features is essential for an auditory display to become effective. In this paper, we introduce a real-time sonification framework for all common MIDI environments based on acceleration and orientation data from inertial sensors. Fundamental processing steps to transform motion information into meaningful sound are discussed. The proposed framework of inertial motion capture, kinematic parameter selection and possible kinematic-acoustic mapping provides a basis for mobile real-time movement sonification, which is a promising training tool for rehabilitation and sports and offers a broad variety of application possibilities.
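To illustrate the kind of kinematic-acoustic mapping the abstract describes, here is a minimal sketch that maps the magnitude of a 3-axis accelerometer sample to a MIDI pitch/velocity pair. The calibration bounds, pitch range, and function name are illustrative assumptions, not taken from the paper's actual framework:

```python
import math

def accel_to_midi(sample, lo=1.0, hi=30.0, pitch_range=(48, 84)):
    """Map one 3-axis accelerometer sample (m/s^2) to a MIDI (pitch, velocity) pair.

    `lo`/`hi` are assumed calibration bounds for the acceleration magnitude;
    the pitch range (C3..C6) is an illustrative choice, not from the paper.
    """
    # Acceleration magnitude across the three axes.
    mag = math.sqrt(sum(a * a for a in sample))
    # Normalize into [0, 1], clamping values outside the calibrated range.
    t = min(max((mag - lo) / (hi - lo), 0.0), 1.0)
    # Linear kinematic-acoustic mapping: louder and higher for faster motion.
    pitch = round(pitch_range[0] + t * (pitch_range[1] - pitch_range[0]))
    velocity = round(t * 127)
    return pitch, velocity

# Example: three samples of increasing movement intensity.
samples = [(0.5, 9.8, 0.2), (5.0, 12.0, 3.0), (20.0, 15.0, 10.0)]
events = [accel_to_midi(s) for s in samples]
```

In a real-time setup, each `(pitch, velocity)` pair would be emitted as a MIDI note-on message to a synthesizer; the mapping function (linear here) and the choice of controlled sound parameters are exactly the design decisions the paper discusses.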

Bibtex

@INPROCEEDINGS{sonify2012,
     author = {Brock, Heike and Schmitz, Gerd and Baumann, Jan and Effenberg, Alfred O.},
      title = {If motion sounds: Movement sonification based on inertial sensor data},
  booktitle = {9th Conference of the International Sports Engineering Association (ISEA)},
       year = {2012},
      month = jan,
  publisher = {Elsevier},
   abstract = {In recent years, movement sonification has proven to be an appropriate support for
               motor perception and motor control that can display physical motion in a very rich
               and direct way. But how should movement sonification be configured to support motor
               learning? The appropriate selection of movement parameters and their transformation
               into characteristic motion features is essential for an auditory display to become
               effective. In this paper, we introduce a real-time sonification framework for all
               common MIDI environments based on acceleration and orientation data from inertial
               sensors. Fundamental processing steps to transform motion information into
               meaningful sound are discussed. The proposed framework of inertial motion capture,
               kinematic parameter selection and possible kinematic-acoustic mapping provides a
               basis for mobile real-time movement sonification, which is a promising training
               tool for rehabilitation and sports and offers a broad variety of application
               possibilities.}
}