
A very cool hackathon event is going down on October 5th and 6th at the High Tech Campus (5656 Gestel, Eindhoven, The Netherlands). Over the course of 32 hours, participants will pit themselves against one another by developing a toolkit for 3D design and printing using the Wii remote, Microsoft Kinect, Leapmotion or any other innovative input device that isn't a traditional mouse.

With the fast growth of the 3D printing industry, it is becoming clear that the weakest link in the value chain for getting a 3D product printed is the design tools and the easy conversion to a 3D printing format.

There are many design tools and related hardware solutions available, but most are tied to traditional input methods like the mouse.
With the advent of the Wii, Microsoft Kinect and Leapmotion, new input methods are becoming available for designing 3D objects.

If you're interested in signing up or learning more, be sure to visit the event page. Also, keep in mind that the RSVP deadline is Oct 3, 2012 9:00 AM.

Ableton Suite is a complete software studio. Suite 8 gives you all of the features in Live 8 plus sound, with a radically new Library packed with beautiful new sounds and a wealth of useful resources. Suite 8 contains 11 Ableton instruments and effects including synths, a sampler, electric and acoustic drums, mallets, numerous sampled instruments, the new, reworked Operator and amp modeling effects. Two completely new instruments, Collision and Latin Percussion, round off the set. Ableton Suite 8 is a complete package: the tools and the sounds.

This site was primarily designed as a resource for people interested in developing applications using Microsoft's Kinect sensor. There are tons of resources available online, of course; the only problem is that they're scattered all over the place. If you're interested in a particular subject -- let's say developing a Kinect app using Processing -- you'll more than likely find everything you need in the Processing forums. What this site intends to provide is a home for those interested in more than just one particular programming language or tool to build their dream project using this incredibly affordable 3D depth-sensing camera.

What Develop Kinect will provide you, the visitor, is a wide variety of resources and useful information for developing Kinect applications, all at a glance. All visitors can browse multiple active groups that follow particular subjects, while members can subscribe to said groups and receive notifications when anything new is posted to that group. This is a great way to stay on top of all of the Kinect-related subject matter you are interested in. You'll be able to communicate with other developers or enthusiasts who subscribe to the same groups and discuss subject matter that relates to your particular interests. This format provides the perfect opportunity for you to stumble upon a new SDK or motion capture solution -- whatever the case may be -- that does exactly what you were looking for, purely by coincidence.

You're also able to search and filter through a wide variety of available resources. Everything you'll need to get your project under way will be right here, saving you from bouncing from one svn repo to a forum post to the official site and then back again. People who may be interested in a particular subject but are just starting out will find Develop Kinect especially useful. The majority of the guides submitted to the site relate to the first few steps it will take to get things started on your operating system and programming tool or language of choice. All downloadable resources will be available internally so you won't need to bounce around all over the internet to get what you're looking for. A glossary is also available if you're looking for a definition of a term that's being used frequently on the site. Technical jargon can turn most beginners off, especially early on in the development phase. Develop Kinect aims to remedy that and provide many more features such as personal developer blogs, news articles and more!

Honestly, this site is for anyone interested in the amazing things the Kinect camera is capable of and how it will change the way we interact with digital devices in the future. Whether you're just interested in reading the latest news or plotting the next game changer with colleagues who have similar ideas and interests, Develop Kinect is where it all happens. If you have any suggestions whatsoever, I'd love to hear them. Don't be afraid to send me an email via our contact form. I'd love to hear your thoughts, ideas and concerns. Thanks for visiting and I hope you enjoy your time. Now go on, change the world with that next Kinect-inspired idea!

So today was the day. I broke down and picked up a dedicated Windows laptop. After abandoning all things Windows about 3 years ago -- jamming Ubuntu for a year and a half and then my MacBook Pro until present -- I had two reasons which made me bite the proverbial bullet and make that sad stroll on over to FutureShop.

First off, the Kinect SDK demos, and everything else for that matter, do not work under Parallels on my Mac. I assume it has something to do with the bandwidth allocated to the USB ports. My guess is that Parallels emulates a lot of the devices and pumps the data over the USB bus. Don't quote me on that... just a guess. When running Evoluce in Parallels, the IR point cloud wouldn't initiate either, so I just chalked it up to emulating Windows on a Mac.

Secondly, I'm writing a book and my publisher requires that I use a specific MS Word template for the series of books mine will be released under. I had a ton of work done in XML using Author -- great XML app I might add. Anyway, this template doesn't play nice with Word for Mac, Pages or OpenOffice, so I was left with little choice. As I'm writing this, the blue light representing a charged battery has appeared on my shiny new Gateway PC. The price point was alright for what I was getting: Windows 7 Home with an Intel i5, 8 GB of RAM, Nvidia GeForce GT 520M with 1 GB VRAM, and a 750 GB HDD for around $650. Definitely overkill for what I need it for, but I think I'll be alright if I ever wanted to check out StarCraft 2 or Diablo 3. It has HDMI out as well, so it'll serve as a Netflix machine during its downtime.

Oh yeah, the point of this post was to let you guys know that I'll finally be able to properly check out Evoluce's SDK, a direct competitor to the official Microsoft Kinect SDK. Actually, I'll be able to check out both of these SDKs and give accurate write-ups, guides and other helpful resources relating to each of them.

This is a must have for all you future loving Kinect-o-phile music fans out there. Amulet Voice Kinect allows you to control your entire music catalogue by using Microsoft's Speech ZiraPro library along with a Kinect sensor.

If you're fortunate enough to be running a Windows 7 machine along with a decent sized library of music, this application will definitely be the centre piece at your next get together. It's free to download and needs just a few requirements to be met in order to get things up and running.
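At its core, a voice-controlled player like this maps recognized phrases to player actions. Here's a minimal sketch of that idea in Python -- the phrases and actions below are illustrative assumptions, not Amulet Voice Kinect's actual command grammar:

```python
import re

# Hypothetical command grammar in the spirit of a voice-controlled
# music player; these phrases are made up for illustration.
COMMANDS = [
    (re.compile(r"^play (artist|album) (?P<name>.+)$"), "play"),
    (re.compile(r"^(pause|stop)$"), "pause"),
    (re.compile(r"^volume (?P<dir>up|down)$"), "volume"),
]

def interpret(utterance):
    """Map a recognized phrase to an (action, details) pair, or None."""
    text = utterance.strip().lower()
    for pattern, action in COMMANDS:
        match = pattern.match(text)
        if match:
            return action, match.groupdict()
    return None
```

The real application layers this kind of mapping on top of a speech recognizer; the sketch just shows why a small, fixed grammar keeps recognition reliable.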

Go to the project page to learn more about the voice and gesture commands, along with a full descriptive tutorial on how to get everything set up.

Check out the video below to see Amulet Voice Kinect in action!

ARCADE was designed to give live augmented reality presentations over video. The software generates 3D content in the space around the presenter, allowing him/her to use natural gestures to convey information to the audience. No post-processing is involved, which makes this the perfect solution for live presentations.

The presenter views the live video of himself as if in a mirror. Multiple 3D objects are available to the presenter that can be used during the presentation. OpenNI is used for skeletal tracking. The developers have created their own "Grammar of Gestures" and several demo applications to show off the presentation software.

Features include:

  • One-handed pick & place - When a hand or finger hovers over an object, it becomes selected and can be controlled. When that object touches another object, they become bonded together.
  • Menu Selection & Swiping - By using your finger, you can easily browse through menus and select items by swiping. Items are automatically highlighted, and moving up or down scrolls through the menu.
  • 3D Drawing - The presenter can make 3D drawings by touching fingers together. The first finger to move away is ignored and the remaining finger controls the drawing. Once completed, a single key press can rotate the drawing in 3D space to show different perspectives.
  • Rotation and Uniform Scale - The presenter can manipulate objects arbitrarily. One can rotate the object on the vertical and horizontal axes by moving a finger either up/down or left/right. Scaling can be achieved by manipulating the object using two fingers within the bounding box to increase/decrease the depth or Z-axis of the object.
  • Delete Gesture - The presenter can delete or erase any object or drawing by waving his hand.

This is by far the most robust and feature-rich augmented reality interface to come out for the Kinect. Check out the video to see exactly what I'm talking about.

    NOTE - The video currently restricts embedding but you can still view it on Vimeo by following this link
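To give a feel for the rotation and uniform-scale gestures described above, here's a minimal Python sketch of the underlying math. The pixel-to-degree ratio and the two-finger pinch mapping are my own assumptions for illustration, not ARCADE's actual parameters:

```python
import math

def rotation_from_drag(dx_px, dy_px, degrees_per_px=0.5):
    """Map a one-finger drag (in pixels) to yaw/pitch rotation angles.
    Left/right motion rotates about the vertical axis, up/down about
    the horizontal axis. The sensitivity is an assumed value."""
    yaw = dx_px * degrees_per_px
    pitch = dy_px * degrees_per_px
    return yaw, pitch

def scale_from_pinch(p0_start, p1_start, p0_end, p1_end):
    """Uniform scale factor from the change in distance between two
    tracked fingertips (each point is an (x, y) tuple)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    d_start = dist(p0_start, p1_start)
    d_end = dist(p0_end, p1_end)
    return d_end / d_start if d_start else 1.0  # guard lost tracking
```

Dragging 100 px to the right would yield a 50-degree yaw at this sensitivity, and doubling the fingertip separation yields a scale factor of 2.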

Kinect-related projects that involve Processing and Arduino are always so damn cool! Take for instance this latest demo by P&A LAB. They set out to develop a tracking system that would control a light source by either turning it on/off or dimming the light. They've set up their rig to include some LED lights that represent the amount of dimming that would occur if this product were actually in use. The video shows a linear actuator moving according to the gestures used to control the amount of dimming the light source would receive.

Some very interesting work is at play here using open source tools such as SimpleOpenNI for Processing. Check out the project page for a bit more detail HERE and watch the video below to see what the heck it is I'm talking about!
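The core of a rig like this is mapping a tracked hand position to a dim level. A minimal Python sketch of that mapping follows -- the normalized range and 0-255 output are my own assumptions (chosen to match Arduino's 8-bit PWM), not P&A LAB's actual values:

```python
def dim_level(hand_y, y_min=0.0, y_max=1.0, levels=255):
    """Map a normalized hand height from the tracker to a dimmer
    level in 0..levels. In an Arduino setup, this value could be fed
    to analogWrite() to PWM-dim an LED. Ranges are illustrative."""
    t = (hand_y - y_min) / (y_max - y_min)
    t = max(0.0, min(1.0, t))  # clamp out-of-range tracking noise
    return round(t * levels)
```

In practice, a sketch like this would sit between SimpleOpenNI's hand-tracking callback in Processing and the serial link to the Arduino driving the light.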

In an article posted today, it would appear as though the Kinect sensor may very well be propelling itself from our living rooms into outer space. Astronauts apparently lose a fair amount of body mass due to muscle atrophy -- approximately 15 percent, actually -- and since they're in a zero-gravity environment, traditional scales don't exactly work all that well.

Some good folks over at the Italian Institute of Technology's Center for Human Space Robotics are using the Kinect's depth-sensing capabilities to create a 3D model of the subject, which is then compared to a database of 28,000 body types. The estimates are apparently 97% accurate, with an error of approximately 2.7 kilograms.
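The "compare against a database of body types" step can be pictured as a nearest-neighbor lookup. Here's a toy Python sketch of that idea -- the feature vectors and distance metric are illustrative assumptions, not the researchers' actual method:

```python
def estimate_mass(scan_features, database):
    """Toy nearest-neighbor mass estimate: find the stored body type
    whose feature vector is closest to the one extracted from the
    Kinect's 3D scan, and return its known mass. 'database' is a list
    of (feature_tuple, mass_kg) pairs; features here are made up."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, mass = min(database, key=lambda entry: sq_dist(scan_features, entry[0]))
    return mass
```

With 28,000 stored body types, even a simple lookup like this can land close to the true mass, which is presumably where the reported ~2.7 kg error figure comes from.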

The system isn't being tested in outer space just yet, but it seems like it could be a feasible option down the road, especially since the Kinect is small enough to be integrated into the walls of the shuttle. The system currently in place is relatively big and bulky, using springs to calculate the mass of the astronauts.