MIDI controller with Kinect V2

June, 2016

Description, Technical means, Video

Description

The main idea of this project was to create a bridge between sound and space: how can we interact with these two elements?
The solution is to arrange sound samples digitally around the user. To play a sample, the user moves a hand into the corresponding area; to control a filter, an intensity, or any other parameter, the user adjusts a virtual knob.
With this idea, I created a MIDI controller using a Kinect V2. The Kinect detects the user's gestures and transmits them to the software, which processes those movements into virtual MIDI messages that you can use like any physical MIDI controller. Just connect it to Ableton, for instance, and voilà.
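To illustrate the virtual-knob idea, a hand's position along one axis can be mapped to a MIDI control-change value in the 0-127 range. Here is a minimal sketch in plain Java; the helper name and the normalized input range are my own assumptions, not the project's actual code:

```java
// Hypothetical mapping from a normalized hand height to a MIDI CC value.
// The real project does this inside a Processing sketch with Kinect joint data.
public class KnobMap {
    // Clamp the input to [0, 1] and scale it to the MIDI range [0, 127].
    public static int toCcValue(double normalizedHeight) {
        double clamped = Math.max(0.0, Math.min(1.0, normalizedHeight));
        return (int) Math.round(clamped * 127);
    }

    public static void main(String[] args) {
        System.out.println(toCcValue(0.0));  // bottom of the range -> 0
        System.out.println(toCcValue(0.5));  // midway -> 64
        System.out.println(toCcValue(1.2));  // out of range, clamped -> 127
    }
}
```

Clamping matters here because a tracked hand can easily drift outside the zone assigned to the knob.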

Technical means

Here are the technical elements I used to create my virtual MIDI controller with the Kinect:

The Kinect captures body movements and passes that information to homemade software written in Processing. The Processing sketch has two purposes: one is to display the virtual buttons in space, along with the body skeleton tracked by the Kinect; the other is to detect when the tracked body position falls inside a button and send the corresponding MIDI signals to the virtual MIDI port (Virtual MIDI).
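The button detection described above boils down to a containment check: each virtual button covers a region of space, and when a tracked joint (such as a hand) enters that region, the button's note fires. A plain-Java sketch of that logic, under my own assumptions about the geometry (the actual sketch uses Processing's drawing calls and KinectPV2's joint data):

```java
// Hypothetical hit-test for virtual buttons, sketched in plain Java.
// Each button covers a rectangle and maps to one MIDI note number.
public class VirtualButton {
    final float x, y, w, h; // top-left corner and size, in screen coordinates
    final int note;         // MIDI note to send when the button is hit

    VirtualButton(float x, float y, float w, float h, int note) {
        this.x = x; this.y = y; this.w = w; this.h = h; this.note = note;
    }

    // True when the tracked joint position (jx, jy) is inside this button.
    boolean contains(float jx, float jy) {
        return jx >= x && jx < x + w && jy >= y && jy < y + h;
    }

    // Return the note of the first button containing the joint, or -1 if none.
    public static int hitTest(VirtualButton[] buttons, float jx, float jy) {
        for (VirtualButton b : buttons) {
            if (b.contains(jx, jy)) return b.note;
        }
        return -1;
    }

    public static void main(String[] args) {
        VirtualButton[] pads = {
            new VirtualButton(0, 0, 100, 100, 60),   // pad for note C4
            new VirtualButton(100, 0, 100, 100, 62), // pad for note D4
        };
        System.out.println(hitTest(pads, 50, 50));   // inside the first pad
        System.out.println(hitTest(pads, 150, 50));  // inside the second pad
        System.out.println(hitTest(pads, 250, 50));  // outside both
    }
}
```

In the real sketch the same check would run every frame against the hand joints reported by KinectPV2, sending a note-on when a hand enters a pad and a note-off when it leaves.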
The virtual MIDI port is seen by other applications as a real MIDI device, so it can send MIDI commands to any of them. In Ableton, I add this virtual port as a regular MIDI controller and listen for the MIDI signals coming from Virtual MIDI. With this technique, I can play any recorded sample in Ableton.
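Because the virtual port behaves like any other MIDI device, what travels through it are ordinary MIDI messages. In the project this is handled by MidiBus; a rough equivalent using Java's standard javax.sound.midi API looks like this (the channel, note, and CC numbers are placeholders, not values from the project):

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

// Build the MIDI messages a virtual controller emits. With MidiBus in a
// Processing sketch this is a one-liner (sendNoteOn / sendControllerChange);
// here the same bytes are built with the standard javax.sound.midi API.
public class MidiOut {
    public static ShortMessage noteOn(int channel, int note, int velocity) {
        try {
            return new ShortMessage(ShortMessage.NOTE_ON, channel, note, velocity);
        } catch (InvalidMidiDataException e) {
            throw new IllegalArgumentException(e); // out-of-range channel or data
        }
    }

    public static ShortMessage controlChange(int channel, int cc, int value) {
        try {
            return new ShortMessage(ShortMessage.CONTROL_CHANGE, channel, cc, value);
        } catch (InvalidMidiDataException e) {
            throw new IllegalArgumentException(e);
        }
    }

    public static void main(String[] args) {
        ShortMessage on = noteOn(0, 60, 100);       // note C4 on channel 1
        System.out.println(on.getCommand());        // 144 (0x90, note-on)
        ShortMessage cc = controlChange(0, 74, 64); // CC 74 at mid value
        System.out.println(cc.getData1());          // 74
    }
}
```

Once such messages reach the virtual port, any DAW that lists the port as a MIDI input, Ableton included, receives them exactly as it would from hardware.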
I use the KinectPV2 library, written by Thomas Sanchez Lengeling, for the Kinect, and the MidiBus library for MIDI communication. Both libraries were essential for the Processing code.

Video

Here is a video showing the result of the project: