The objective of this tutorial is first to explain the differences between OpenKinect and OpenNI, and then to show how to install the OpenNI framework so you can run the first examples on your PC. In a second part, we will see how to use tools such as Processing and Animata to test some of the possibilities offered by the Kinect.
- OpenKinect is an open source project built around the libfreenect driver, which offers an API to control the motor, the LEDs, the cameras and the audio. It also provides a library to analyze the Kinect's output in order to offer an abstraction layer, such as hand or skeleton tracking. Moreover, several wrappers are available so you can develop in your preferred language (Java, C#, C++, ActionScript, Python, etc.).
- OpenNI (Open Natural Interaction) is an open source framework (under the LGPL), partially developed by PrimeSense, one of the creators of the Kinect. It provides standard abstract programming interfaces (APIs), in C++ and C#, that allow developers to write applications based on natural interactions. It is not specific to the Kinect, and that is its main advantage: it can be plugged into other audio or visual sensor devices.
This scheme shows the architecture of OpenNI. Put simply, OpenNI lets you connect sensors that send raw data to middleware components, which analyze it, forward it to other middleware and/or process it, and finally send high-level data (such as push or wave detection events) to the application. This modular architecture allows you to interface any middleware or device that complies with the API defined by OpenNI: in our case, NITE as the middleware and the Kinect as the device.
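To make the modular idea concrete, here is a tiny, purely illustrative Python sketch (not OpenNI code; all names are mine): a "device" layer produces raw frames, a "middleware" layer turns them into high-level events, and the application only ever consumes those events.

```python
def fake_sensor():
    # Stands in for the device layer: emits raw data
    # (here plain numbers instead of depth frames).
    yield from [10, 14, 22]

def middleware(frames, threshold=15):
    # Stands in for a middleware such as NITE: turns raw data
    # into high-level events the application can react to.
    for frame in frames:
        if frame > threshold:
            yield "wave_detected"

# The application only sees high-level events, never raw frames.
events = list(middleware(fake_sensor()))
print(events)  # → ['wave_detected']
```

Swapping in a different sensor or a different middleware only requires that each layer keep the same interface, which is exactly the point of OpenNI's design.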
Now I am going to explain how to install the various components of OpenNI in order to run the Kinect on your PC.
To begin, we are going to install OpenNI:
- Go here first and download the latest OpenNI version available at the time of writing.
- Once the download is finished, run the installer and follow the instructions.
Now we are going to install the Kinect drivers that interface with OpenNI's API:
- To start, download the SensorKinect drivers here (v5.0.0) (If you don’t use a git client click on Downloads > Download.zip).
- Once the download is finished, extract the file and go to the Bin folder.
- Then run the executable and follow the instructions.
- Finally, you need to install an additional driver, located in the folder where you extracted the drivers, under Platform\Win32\Driver. If you are on a 32-bit system, install dpinst-x86.exe; if you are on a 64-bit system, install dpinst-amd64.exe.
Finally, we are going to install the last component needed to run OpenNI: NITE, a middleware that contains all the image-processing algorithms and handles the raw data coming from the sensors:
- First, download the latest version of NITE here.
- Once the download is complete, run the installer and follow the instructions. During the installation it will ask you for a key. Enter the following key: 0KOIk2JeIBYClPWVnMoRKn5cdY4= (NITE is not free, but PrimeSense provides a free key so users can test its middleware).
Everything should now be properly installed on your PC. To verify this, connect the adapter to your Kinect, plug it into the electrical outlet, and then connect it to your PC. If all goes well, the green LED will start flashing and Windows will begin installing the device.
(IR and RGB video)
So far, we have seen how to install OpenNI, NITE and the SensorKinect drivers on Windows. Now we are going to take advantage of the information provided by OpenNI and the Kinect to do skeleton tracking.
Before starting, we will need OSCeleton, an open-source proxy. It collects the skeleton information returned by the Kinect (via the OpenNI interface), formats it into a standard format, and sends it via OSC (Open Sound Control). This makes the data easy to use from any programming language or application that supports the OSC protocol:
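OSCeleton's output is plain OSC over UDP. As a rough sketch of what one of its /joint messages looks like on the wire, assuming the default ",sifff" layout (joint name, user id, normalized x/y/z) described in OSCeleton's README, and following the OSC 1.0 encoding rules:

```python
import struct

def osc_string(s):
    # OSC strings are ASCII, null-terminated, padded to a multiple of 4 bytes.
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

# One hypothetical /joint message: joint name (string), user id (int32),
# then x, y, z as big-endian float32, normalized to [0, 1] for x and y.
packet = (osc_string("/joint")        # address pattern
          + osc_string(",sifff")      # type tag string
          + osc_string("head")        # joint name
          + struct.pack(">i", 1)      # user id
          + struct.pack(">3f", 0.42, 0.38, 1.75))  # x, y, z

print(len(packet))  # → 40 (OSC packets are always a multiple of 4 bytes)
```

Any library that speaks OSC (oscP5 in Processing, for instance) decodes exactly this structure for you, which is why OSCeleton makes the skeleton data language-agnostic.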
- To start, download OSCeleton here (if you don’t have a git client, click on Downloads > Download.zip).
- Extract the contents of the zip file, browse into the folder and open OSCeleton.sln with Visual C++ 2010 Express (if you don’t have it, download it here).
- Once the project is open, build the executable by pressing the F6 key or by clicking Build Solution in the Build menu.
- Then close Visual Studio and browse into the newly created Debug folder to check that OSCeleton.exe has been generated.
- Finally, verify that it compiled properly by running the executable with the -h option to display all the available options: OSCeleton.exe -h
Now that OSCeleton is installed, we are going to download the examples to make some tests:
- First download the examples provided at this address: https://github.com/Sensebloom/OSCeleton-examples (Downloads > Download.zip).
- Once the download is completed, expand the archive.
These examples use two free open-source programs: Animata and Processing. Animata is real-time animation software designed to create animations and interactive backgrounds, similar to a puppet environment. Processing is a development environment and programming language designed to easily create images, animations and interactions.
Let’s start by testing the example with Animata:
- If you don’t have it, download Animata here (version 004) and expand the zip file.
- Then run OSCeleton with these options: OSCeleton.exe -k -mx 640 -my 480 -ox -160. The -k option ("kitchen" mode) formats the OSC messages sent so that they are compatible with Animata.
- Next, run Animata and open the "doll_soft.nmt" example located in the animata folder.
- Stand in front of your Kinect and assume the calibration pose, as shown above.
- Once calibration is done, the Animata puppet should begin to replicate your movements.
(That’s what you should see on your screen)
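The scaling flags passed to OSCeleton above can be read as a simple affine transform: OSCeleton outputs joint coordinates normalized to [0, 1], and, going by its README, -mx/-my multiply x and y while -ox offsets x, mapping the joints into Animata's pixel space. A sketch of that mapping (the function name is mine, not OSCeleton code):

```python
def to_animata(x, y, mx=640, my=480, ox=-160, oy=0):
    # x, y are normalized joint coordinates in [0, 1] as sent by OSCeleton;
    # the defaults mirror the flags used above: -mx 640 -my 480 -ox -160.
    return x * mx + ox, y * my + oy

# A joint at the center of the camera image lands at (160, 240)
# in Animata's coordinate space with these settings.
print(to_animata(0.5, 0.5))  # → (160.0, 240.0)
```

If your puppet ends up off-screen, adjusting the offset values is usually the first thing to try.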
In order to have a little more freedom, we are going to test the examples under Processing:
- If you don’t have Processing, download it here (version 1.2.1). Once downloaded, run it once and close it. This creates the configuration files and a folder named Processing in your Documents.
- To run the examples you will need two libraries:
- OscP5 is a library that implements the OSC protocol. If you don’t have it, download it using SVN here (if you don’t have an SVN client, download TortoiseSVN).
- pBox2d is a physics engine for 2-dimensional worlds. You can download it here (currently v0.03).
- To use these libraries in Processing, copy them into the Processing > libraries folder in your Documents.
- Run OSCeleton without any arguments.
- Restart Processing if it was open; otherwise, start it.
- Select Open from the File menu and open the MotionCapture3D.pde sketch from the examples folder we downloaded earlier.
- Run the example by going to the Sketch>Run menu.
- Stand in front of the Kinect and wait for the calibration to complete. You should then see your skeleton in the Processing window, as shown above.
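Under the hood, the Processing sketch simply receives UDP packets carrying OSC messages (oscP5 does the decoding for it). The following standalone Python sketch imitates that path with only the standard library: it sends one hypothetical /joint message over loopback UDP and parses it back. The ",sifff" layout (joint name, user id, x, y, z) follows OSCeleton's documented default output; the helper names are mine.

```python
import socket
import struct

def pad(b):
    # OSC fields are null-terminated and padded to a multiple of 4 bytes.
    return b + b"\x00" * (4 - len(b) % 4)

def encode_joint(name, user, x, y, z):
    # Build a /joint message the way OSCeleton does by default.
    return (pad(b"/joint") + pad(b",sifff") + pad(name.encode("ascii"))
            + struct.pack(">i", user) + struct.pack(">3f", x, y, z))

def parse(data):
    # Minimal OSC message parser: address, type tags, then arguments.
    def read_str(off):
        end = data.index(b"\x00", off)
        n = end - off
        return data[off:end].decode("ascii"), off + (n // 4 + 1) * 4

    addr, off = read_str(0)
    tags, off = read_str(off)
    args = []
    for t in tags[1:]:  # skip the leading ','
        if t == "i":
            args.append(struct.unpack_from(">i", data, off)[0]); off += 4
        elif t == "f":
            args.append(struct.unpack_from(">f", data, off)[0]); off += 4
        elif t == "s":
            s, off = read_str(off); args.append(s)
    return addr, args

# Loopback round trip: what oscP5 receives when OSCeleton tracks a joint.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))
port = recv.getsockname()[1]
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(encode_joint("head", 1, 0.5, 0.4, 2.0), ("127.0.0.1", port))
data, _ = recv.recvfrom(1024)
addr, args = parse(data)
send.close(); recv.close()
print(addr, args)  # e.g. '/joint' with ['head', 1, x, y, z]
```

A sketch like MotionCapture3D does essentially this for every tracked joint, collecting the latest (x, y, z) per joint name and redrawing the skeleton each frame.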
There is also an OpenNI wrapper for Processing which gives you access to more features such as the depth map, hand tracking, etc. This library is called Simple-OpenNI, and you can download it here. Install it the same way as OscP5 and pBox2d, by expanding the archive into the Processing > libraries folder in your Documents.
Well, now you have all the basics to start developing prototypes with the Kinect while waiting for the release of Microsoft's official SDK.
Tutorial from: http://yannickloriot.com/2011/03/kinect-how-to-install-and-use-openni-on-windows-part-1/