Kinect related projects that involve Processing and Arduino are always so damn cool! Take for instance this latest demo by P&A LAB. They set out to develop a tracking system that would control a light source by either turning it on/off or dimming the light. They've set up their rig to include some LED lights that represent the amount of dimming that would occur if this product were actually in use. The video shows a linear actuator moving according to the gestures used to control the amount of dimming the light source would receive.
The OpenNI organization is an industry-led, not-for-profit organization formed to certify and promote the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware.
As a first step towards this goal, the organization has made available an open source framework – the OpenNI framework – which provides an application programming interface (API) for writing applications utilizing natural interaction. This API covers communication with both low-level devices (e.g. vision and audio sensors) and high-level middleware solutions (e.g. for visual tracking using computer vision).
Official Site: openni.org
Fresh out of the gate and right on time for all your holiday hacking needs comes the new OpenNI 2.0 SDK (along with a shiny new website to boot!). The good folks at OpenNI have rebuilt their widely used SDK from the ground up with several new enhancements pertaining to the way you build apps using 3D sensors.
Here's a list of the advantages of using OpenNI 2.0 over the previous version:
The Kinect has this unique ability to really "wow" people while simultaneously creeping them the hell out. Take for example this hack put together by some animatronic wizards over at Disney for their theme park. The goal - juggle with a robot. That's right folks, soon you'll be able to toss a ball to your favourite Disney character and have it sent back before you send the next one over.
No, this isn't an announcement for Vi Insanely Improved (wow, a vim joke, never thought I'd see the day…) but rather an announcement about a new Kinect related SDK to hit the market.
ViiM provides all the functionality of OpenNI, adds new high-level features, and comes packed with an assortment of great tools, including 14 recognizable gestures ranging from a basic wave to double click or backspace. ViiM is also able to track up to 15 skeletons while calculating individual joint positions, rotation angles, matrices and quaternions.
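If you're wondering what you'd actually do with those joint quaternions: the usual first step is converting each one into a 3x3 rotation matrix before applying it to a limb. Here's a generic sketch of that conversion — this is the standard quaternion math, not ViiM's actual API, and the sample joint value below is made up:

```javascript
// Hypothetical sketch: converting a joint orientation quaternion
// (as an SDK like ViiM might report it) into a 3x3 rotation matrix.
// Standard formula for a unit quaternion q = (w, x, y, z).
function quatToMatrix(q) {
  const { w, x, y, z } = q;
  return [
    [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
    [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
    [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
  ];
}

// Made-up example: a joint rotated 90° about the Y axis.
const q = { w: Math.SQRT1_2, x: 0, y: Math.SQRT1_2, z: 0 };
console.log(quatToMatrix(q)); // ≈ [[0,0,1],[0,1,0],[-1,0,0]]
```

Once you have the matrix you can orient a 3D model's bone with it, or read limb angles straight out of the rotation angles the SDK exposes.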
AlexD hooked us all up with this robust and incredibly awesome framework developed as a part of his Computer Science Master Thesis. The framework allows users to become immersed in a whole body virtual reality experience. It's something you definitely need to see to believe. But we'll get to that soon enough.
ARCADE was designed to deliver live augmented reality presentations over video. The software generates 3D content in the space around the presenter, allowing them to use natural gestures to convey information to their audience. No post-processing is involved, which makes this the perfect solution for live presentations.
You learn something new every day, right? If not, you gotta get out more! Apparently, Phantom Limb Pain is a condition that affects people who have lost a limb and still experience a fair amount of pain originating from the missing limb. What this Kinect related project aims to do is alleviate some of that pain by immersing the patient in a VR environment where the limb is present again. Must be some sort of psychosomatic response from the brain, which believes the limb is intact. Very interesting concept, to say the least.
This is what I like to see. A crisp, clean, seamless user interface with accompanying code for us all to try out! If you skip past my boring ass synopsis and go straight to the video, you'll be treated to one of the slickest video manipulation programs using a natural user interface to date. You heard right folks, this demo is a thing of beauty.
The other day I posted an article about a great little game developed by Paul & Syd. They built this easy-to-use and fun-to-play dance competition game that matches up your skeletal positioning with data captured from the Kinect and then pumps it into a browser that supports HTML5 using Jens Alexander Ewald's WebSocketP5 library.
The title pretty much says it all folks. A sort of high tech "Simon Says" in the form of a dance off with the help of a Kinect sensor and a little HTML5 Know-how. I'm really impressed by the polish and responsiveness of this HTML5 Kinect Dance game. I'm a big fan of HTML5 and love what you're able to do with it - especially the canvas tag. There are a lot of great tutorials out there if you're interested in learning more about it. If you're a web developer, it's definitely something to think about.
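For anyone curious how skeleton data ends up on an HTML5 canvas, the core step is projecting each 3D joint into 2D pixel coordinates before drawing. Here's a minimal sketch assuming a simple pinhole camera model — the focal length, canvas size and function name are my own illustrative choices, not taken from Paul & Syd's actual code:

```javascript
// Hypothetical sketch: map a Kinect skeleton joint (metres, camera space)
// to canvas pixel coordinates with a pinhole projection.
// FOCAL is roughly the Kinect depth camera's focal length in pixels;
// WIDTH/HEIGHT match a typical 640x480 canvas. All assumptions.
const FOCAL = 525;
const WIDTH = 640, HEIGHT = 480;

function jointToCanvas(joint) {
  // Perspective divide by depth, then shift the origin to the canvas centre
  // (canvas y grows downward, hence the minus sign).
  return {
    x: WIDTH / 2 + (joint.x / joint.z) * FOCAL,
    y: HEIGHT / 2 - (joint.y / joint.z) * FOCAL,
  };
}

// Made-up example: a joint 2 m from the camera, 0.5 m to the right.
console.log(jointToCanvas({ x: 0.5, y: 0, z: 2 })); // → { x: 451.25, y: 240 }
```

From there, each skeleton frame arriving over the WebSocket can be run through a function like this joint by joint and drawn with ctx.arc() on the canvas — which is essentially all a game like this needs on the browser side.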