
Sonic Motion Project

This project uses a motion detection camera to track the performer's movements and gestures, which in turn trigger various sounds and effects; essentially, the performer becomes the instrument. Using projection mapping, visuals that react to the performer's location and to the sounds produced will also fill the room, creating an immersive experience for both performer and viewer.

Early Days

Below is a video of my first experiment with gesture-controlled sound. The movements are picked up by the Kinect camera and passed to a program called Synapse, which creates a skeleton of the performer. Using a software program called Kinectar, I was able to map the X, Y and Z axes of each hand and assign an effect to each variable. The MIDI data from Kinectar is then sent to Ableton Live, where I was able to assign the sounds and effects.
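For anyone curious about what that chain is doing under the hood, here is a rough Python sketch of the same idea (not the actual Kinectar setup). It assumes Synapse is broadcasting joint positions as OSC messages; the address /righthand_pos_body, the port 12345 and the coordinate ranges are placeholder assumptions, so check Synapse's documentation before relying on them. Each axis is forwarded as a separate MIDI CC that Ableton Live can then MIDI-learn onto an effect parameter.

```python
# Sketch only: Kinect -> Synapse (OSC) -> this script (MIDI CC) -> Ableton Live.
# Requires the python-osc and mido packages; the OSC address, port and coordinate
# ranges below are placeholder assumptions, not Synapse's documented values.
from pythonosc import dispatcher, osc_server
import mido

midi_out = mido.open_output()  # open the virtual MIDI port Ableton listens on

def scale(value, lo, hi):
    """Clamp a joint coordinate into the 0-127 range of a MIDI CC."""
    value = max(lo, min(hi, value))
    return int((value - lo) / (hi - lo) * 127)

def on_right_hand(address, x, y, z):
    # One CC per axis; Ableton's MIDI-learn maps each CC to an effect parameter.
    midi_out.send(mido.Message('control_change', control=20, value=scale(x, -600, 600)))
    midi_out.send(mido.Message('control_change', control=21, value=scale(y, -600, 600)))
    midi_out.send(mido.Message('control_change', control=22, value=scale(z, 1000, 3000)))

disp = dispatcher.Dispatcher()
disp.map("/righthand_pos_body", on_right_hand)  # assumed Synapse OSC address

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 12345), disp)  # assumed port
server.serve_forever()
```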

 

The visual aspect of this project is a secondary goal for me, and I have therefore relied heavily on the programming skills of others to add this element to the project. After scouring the internet for sources, I discovered this Quartz Composer patch, made by M. Oostrik http://kineme.net/Composition/MOostrik/AugmentedRealityinOpenCL, which seemed to be the perfect match for my needs. The benefit of this particular patch is that it uses the camera built into the iMac rather than the Kinect camera, meaning that I can run the two programs simultaneously without worrying about increased lag on the Kinect camera or any conflicts that could arise from using the same camera for different programs.

Below is a video of the visuals in action (additional visual parameters can be utilised, but for now I will stick to the particle effect).

Visuals

I discovered that it was not possible to run two visual inputs simultaneously through one computer (i.e. the webcam and the Kinect), so I have configured the two programs (Synapse and Quartz Composer) to feed off the Kinect camera. This seems to work rather well, although, as is evident in the video below, there is a significant reduction in frame rate (part of this reduction is due to the extra CPU load of screen recording, so it should be less of an issue in the installation itself). I combated the slowdown as best I could by reducing the number of particles per frame.

This instrument will be the first element of the installation. I feel its simplicity will make it a good introduction for the user, helping them learn how to manipulate the sound.

The video below shows the second element of this project, which involves manipulating pre-existing audio. I also incorporated another visual effect into the mix, which should have a much better frame rate when performing live. A low-pass filter is mapped to the Y axis (up/down) of my torso and a beat repeat/glitch effect to the Z axis (forward/backward). On my left hand I added a phaser and on my right a saturation effect. I feel the effects could be more subtle, but it is also important that the user realises how they are affecting the music; slower movements seem to work better and also create smoother visuals.
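The mapping itself lives in Kinectar and Ableton's MIDI-learn, but the idea can be sketched as a simple table of body axes to controller numbers, with a little smoothing to encourage the gradual movements mentioned above. The CC numbers, coordinate ranges and smoothing factor here are arbitrary placeholders, not the values used in the piece.

```python
# Sketch of the mapping concept only; the real mapping is done in Kinectar/Ableton.
# CC numbers, axis ranges and the smoothing factor are placeholder assumptions.
import mido

MAPPINGS = {
    # joint axis : (CC number, low, high)  -> effect assigned in Ableton Live
    "torso_y":     (30, -400, 400),   # low-pass filter (up/down)
    "torso_z":     (31, 1000, 3000),  # beat repeat / glitch (forward/backward)
    "lefthand_x":  (32, -600, 600),   # phaser
    "righthand_x": (33, -600, 600),   # saturation
}

smoothed = {name: 0.0 for name in MAPPINGS}
ALPHA = 0.15  # lower = smoother, slower response

def update(midi_out, name, raw_value):
    """Smooth a joint coordinate and send it as its mapped MIDI CC."""
    cc, lo, hi = MAPPINGS[name]
    # Exponential smoothing stops abrupt gestures producing jumpy audio and visuals.
    smoothed[name] += ALPHA * (raw_value - smoothed[name])
    clamped = max(lo, min(hi, smoothed[name]))
    midi_out.send(mido.Message('control_change', control=cc,
                               value=int((clamped - lo) / (hi - lo) * 127)))
```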

First Instrument

Sound Manipulation

After successfully trialling the visuals and instruments on location a few days before the actual installation, I was confident that the combination of visual and audio effects worked well and was ready to let people interact with it. A last-minute venue change, due to a malfunctioning HDMI cable, meant that the mapping I had done for the Kinect camera and projection was slightly off, but I did my best to reconfigure everything. A technical support assistant helped me set up the new venue and made the room safe from a health and safety point of view by removing trip hazards such as the Kinect camera cables. The video below shows my fellow students interacting with the installation.

 

Each hand controls an evolving pad; the X, Y and Z positioning of those hands affects several different variables such as pitch, reverb, white noise and distortion, along with several effects built into the sound design software used, Spectrasonics' Omnisphere. The audio created from those pads is also subjected to rhythmic processing, a technique developed by sound designer Diego Stocco, which further affects the sound in a variety of ways depending on factors such as the frequency and dynamic range of the incoming signal. I also mapped a low-pass filter to the Y axis of the torso so that ducking down swept away the high-end frequencies.
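As a rough illustration of that torso mapping, a normalised torso height can be scaled exponentially across the audible range so that ducking down sweeps the cutoff towards the low end. The frequency range and the exponential curve are illustrative assumptions, not the settings used in the installation.

```python
# Illustrative only: turning a normalised torso height (0 = crouched, 1 = standing)
# into a low-pass cutoff. The frequency range is a placeholder assumption.
def torso_to_cutoff(torso_y_normalised, f_min=100.0, f_max=18000.0):
    """Exponential scaling feels more even to the ear than a linear sweep."""
    t = max(0.0, min(1.0, torso_y_normalised))
    return f_min * (f_max / f_min) ** t

print(torso_to_cutoff(1.0))   # standing tall: ~18000 Hz, filter wide open
print(torso_to_cutoff(0.25))  # ducking down: ~370 Hz, high end swept away
```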

The Installation
