Harmonic Motion is a planned toolkit for musicians and artists to simplify accessing data from sensors such as the Kinect. It is being created at the Centre for Digital Music to provide a simple pathway to create advanced interaction designs that feel natural and intuitive.
Modular building blocks will be combined through a simple but powerful interface that chains them together to create intuitive musical interpretations of sensor data. These processing pipelines can output OSC or MIDI data, or be saved to a file for direct use in creative coding platforms such as openFrameworks.
Getting meaningful data from gestures often involves a lot of fiddling with magic numbers, and that hard work rarely makes it beyond a single project. We hope to provide an open platform that allows this work to be encapsulated, reused and potentially shared.
Connecting movement to sound in a manner that feels simple and natural can be surprisingly complex to achieve. Our aim is to help users go beyond basic height→pitch style mappings by providing self-contained building blocks that can be chained together.
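Since the toolkit is still planned, the chained building-block idea can only be sketched. The following minimal Python sketch is illustrative, not Harmonic Motion's actual API: `Pipeline`, `make_smoother` and `make_scaler` are hypothetical names. It shows two self-contained blocks, a smoothing filter to tame jittery sensor data and a linear scaler for a height→pitch mapping, chained into one pipeline.

```python
# Hypothetical sketch of chained building blocks; names are illustrative
# only and do not reflect Harmonic Motion's real interface.

class Pipeline:
    """Runs a value through a chain of processing nodes in order."""
    def __init__(self, *nodes):
        self.nodes = nodes

    def process(self, value):
        for node in self.nodes:
            value = node(value)
        return value

def make_smoother(alpha=0.5):
    """Exponential moving average to smooth noisy sensor readings."""
    state = {"y": None}
    def smooth(x):
        state["y"] = x if state["y"] is None else alpha * x + (1 - alpha) * state["y"]
        return state["y"]
    return smooth

def make_scaler(in_lo, in_hi, out_lo, out_hi):
    """Clamped linear map, e.g. hand height in metres -> MIDI pitch."""
    def scale(x):
        t = (x - in_lo) / (in_hi - in_lo)
        t = min(1.0, max(0.0, t))
        return out_lo + t * (out_hi - out_lo)
    return scale

# Chain the blocks: smooth the raw height, then map 0-2 m onto MIDI 36-84.
pipeline = Pipeline(make_smoother(0.8), make_scaler(0.0, 2.0, 36, 84))
pitch = pipeline.process(1.0)  # hand at 1 m -> MIDI note 60.0
```

Because each block is self-contained, swapping the scaler for, say, a velocity estimator changes the mapping without touching the rest of the chain.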