A toolkit for gestural sound

Harmonic Motion is a planned toolkit to help musicians and artists access data from sensors such as the Kinect. It is being created at the Centre for Digital Music to provide a simple pathway to creating advanced interaction designs that feel natural and intuitive.


Modular

The toolkit will provide modular building blocks together with a simple but powerful interface for chaining them into intuitive musical interpretations of sensor data. The resulting processing pipelines can output OSC or MIDI data, or be saved to a file for direct use in creative coding platforms such as openFrameworks.
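To make the idea concrete, here is a minimal sketch of what chaining such building blocks might look like. The toolkit is not yet released, so every name here (`Smooth`, `Scale`, `chain`) is illustrative only and not Harmonic Motion's actual API; the example simply shows small, self-contained processing stages composed into one pipeline, with the output range chosen to resemble a MIDI note number.

```python
# Hypothetical sketch of a processing pipeline: small reusable blocks
# chained left-to-right. These class and function names are invented
# for illustration and are not the toolkit's real API.

class Smooth:
    """Exponential moving average over a stream of values."""
    def __init__(self, alpha=0.5):
        self.alpha = alpha   # higher alpha -> less smoothing
        self.state = None

    def __call__(self, x):
        if self.state is None:
            self.state = x
        else:
            self.state = self.alpha * x + (1 - self.alpha) * self.state
        return self.state


class Scale:
    """Linearly map an input range onto an output range."""
    def __init__(self, in_lo, in_hi, out_lo, out_hi):
        self.in_lo, self.in_hi = in_lo, in_hi
        self.out_lo, self.out_hi = out_lo, out_hi

    def __call__(self, x):
        t = (x - self.in_lo) / (self.in_hi - self.in_lo)
        return self.out_lo + t * (self.out_hi - self.out_lo)


def chain(*blocks):
    """Compose blocks into a single pipeline, applied left to right."""
    def pipeline(x):
        for block in blocks:
            x = block(x)
        return x
    return pipeline


# Example: hand height in metres -> smoothed -> MIDI-style pitch number.
height_to_pitch = chain(Smooth(alpha=0.8), Scale(0.0, 2.0, 36, 84))
pitch = height_to_pitch(1.0)   # a hand at 1 m maps to the middle of the range
```

In a real pipeline the final stage would send the value out as an OSC or MIDI message rather than returning a number, but the composition pattern is the same.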

Pluggable

Getting meaningful data from gestures often involves a lot of fiddling with magic numbers, hard work that rarely makes it beyond the project it was created for. We hope to provide an open platform that allows this work to be encapsulated, reused and potentially shared.

Intuitive

Connecting movement to sound in a manner that feels simple and natural can be surprisingly complex to achieve. Our aim is to help users go beyond basic height→pitch style mappings by providing self-contained building blocks that can be chained together.


Email updates

Please provide us with your name and email if you would like to receive news about the release of the toolkit. Emails will be infrequent and only about this project.

Your name and email address will be kept securely.


Authors

Harmonic Motion is being created by Tim Murray-Browne and Mark D. Plumbley at the Centre for Digital Music, Queen Mary University of London. The project is being funded by the UK Engineering and Physical Sciences Research Council (EPSRC) under the Platform Grant (EP/K009559/1) and is based on work completed during Murray-Browne's PhD at the Centre for Digital Music which was funded under an EPSRC Doctoral Training Account (EP/P504031/1).