A community-based site focusing on development and programming for the Microsoft Kinect 3D depth sensor.
If you haven't heard or seen anything relating to Chris Vik's work, now's a good time to start catching up. His YouTube page should give just the right amount of background info, and be warned: your mind may be blown in the process. Go here - http://www.youtube.com/user/synaecide - run, don't walk.
Uses Kinect to control which beat is playing in a looping sample. A sample is chopped up into eighth note slices (or any increment) and arranged like a clock face around you. The currently playing slice is controlled by your left hand's position in space. This allows you to remix a loop by waving your hand around. The red dot indicates your hand's position, the green bar indicates the currently playing slice, and the blue bar moves at the tempo of the song to serve as a guide.
Ryan Challinor, creator of Synapse for Kinect, has developed another great little side project in his free time. This time around he's created a Kinect-controlled BeatWheel that controls a looping sample, using gestures to play the track normally or in reverse. You'll need to download the Max patch and the Synapse for Kinect application for Mac in order to try it out for yourself.
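To make the clock-face idea concrete, here is a minimal Python sketch of that kind of mapping. It is not taken from Ryan Challinor's Max patch; the coordinate convention, the torso reference point, and the eight-slice count are assumptions for illustration only.

import math

def slice_for_hand(hand_x, hand_z, torso_x, torso_z, num_slices=8):
    # Imagine the slices arranged like a clock face around the performer:
    # slice 0 straight ahead, indices increasing clockwise.
    dx = hand_x - torso_x          # left/right offset from the body (assumed axes)
    dz = hand_z - torso_z          # forward/back offset from the body
    angle = math.atan2(dx, dz)     # 0 rad = straight ahead, clockwise positive
    fraction = (angle % (2 * math.pi)) / (2 * math.pi)
    return int(fraction * num_slices) % num_slices

# Example: a hand held out to the right and slightly forward selects slice 1.
print(slice_for_hand(hand_x=0.4, hand_z=0.4, torso_x=0.0, torso_z=0.0))

A real patch would then switch playback to the selected slice, which is what the green bar in the video indicates.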
The PuShy project is a very creative and inspiring project involving Max/MSP and OSC along with a trusty Kinect sensor. The idea behind this Tactile Sonic Interface is to provide a means of touching something that has no business making any sort of sonic response, and having it come to life with a symphony of music once the interaction takes place. The project was first unveiled at the Plektrum Festival 2011 in Tallinn, Estonia.
Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC, and also sends the depth image into Quartz Composer. In a way, this allows you to use your whole body as an instrument.
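Since Synapse speaks plain OSC, almost any language with an OSC library can listen in. Below is a minimal Python sketch using python-osc that simply prints whatever Synapse sends; the port number and the joint address patterns are assumptions based on common Synapse setups, so check the app's own documentation before relying on them.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_joint(address, *args):
    # Joint messages carry an (x, y, z) position for the named joint.
    print(f"{address}: {args}")

def print_any(address, *args):
    # Catch-all handler so hit events and anything unexpected still show up.
    print(f"(unmapped) {address}: {args}")

dispatcher = Dispatcher()
dispatcher.map("/righthand_pos_body", print_joint)  # assumed address pattern
dispatcher.map("/lefthand_pos_body", print_joint)   # assumed address pattern
dispatcher.set_default_handler(print_any)

# Port 12345 is an assumption; verify against the Synapse documentation.
server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
server.serve_forever()

The same stream of joint positions and hit events is what the applications above consume, so a listener like this is mainly useful for exploring what Synapse is sending before wiring it into a patch.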