Skeletal Tracking and Facial Tracking and Animation
               Project Documentation
                      Vipul Divyanshu
                        IIL/2012/14
                      Summer Internship
                      Mentor: Imagineer
                      India Innovation Labs
Tasks at hand:
*Skeletal Tracking

*Facial Tracking

*Skeletal animation and integration with the model (additional)

Tools Explored:

       OpenNI
       Microsoft Kinect SDK
       Ogre 3D
       Unity
       Blender

Analysis of the tools and what was explored:

OpenNI: OpenNI (Open Natural Interaction) is a multi-language, cross-platform
framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI
APIs are composed of a set of interfaces for writing NI applications. The main purpose of
OpenNI is to form a standard API that enables communication with both:
    Vision and audio sensors (the devices that ‘see’ and ‘hear’ the figures and their
    surroundings.)
    Vision and audio perception middleware (the software components that analyse the audio
    and visual data that is recorded from the scene, and comprehend it). For example,
    software that receives visual data, such as an image, returns the location of the palm of a
    hand detected within the image.
OpenNI supplies a set of APIs to be implemented by the sensor devices, and a set of APIs to
be implemented by the middleware components.
A major plus point of OpenNI is that skeletal tracking can be run on recorded Kinect
video saved in the .oni format.

Constraints of OpenNI:

       I found that, for an easier start to the project, the Microsoft Kinect SDK would be
       more useful for full-body skeletal tracking and its animation in C#. This is because
       OpenNI cannot directly load a model from Maya or Blender; instead,
       SDKs like Ogre 3D must be used.

Microsoft Kinect SDK: The Microsoft Kinect SDK comes with classes for skeletal tracking,
and it is much easier to initialise the Kinect and get the skeletal joint points just by reading
the skeleton's Joints collection in C#. For developing in C++, the Microsoft Kinect SDK and
OpenNI stand at the same level.

I developed in C# and found XNA and the Microsoft Kinect SDK very useful both for getting the
skeletal data and for interacting with the 3D model in .fbx format.

The Kinect SDK's facial-animation capabilities are still left to explore, as they were not a part
of my work package.

FBX: The .fbx format used with C# is very flexible and has no problems with object collision.

Constraints:

       Working directly with the recorded data from Kinect Studio (.xed files) has not been
       well explored, due to a lack of material online, which can be restricting.
       For facial tracking, whether the eyes are open or closed cannot be detected; the
       rest of the features are detected properly.

Ogre 3D: Ogre can be used for animation, in sync with OpenNI or the Kinect
SDK. I did not explore it much, but it has a vast amount of potential for facial
animation.

Constraints: It requires a fairly detailed study of the model and the class hierarchy to
perform the animation, and is easier for an experienced person.

Things Learnt and coded:
       OpenNI: I learnt the basics of extracting the skeletal joints, how to work
       with recorded data, and the class hierarchy of OpenNI and its
       attributes, and understood the data flow. I wrote and analysed the code
       for getting the skeletal data of the user.
       Microsoft Kinect SDK: I learnt the basics of extracting the skeletal joints,
       how to work with recorded data, and the class hierarchy of the
       Microsoft Kinect SDK and its attributes, and understood the data flow. I
       then learnt C#. Using the Kinect SDK, I wrote a few code blocks extending the
       samples, and three complete programs for skeletal tracking and user
       interface. I wrote the code for controlling the cursor and colouring the
       detected player with a different colour; their screenshots can be found
       below. I understood how to load a 3D model into C#, work with it, and
       control it with user motion, and I locked the model at the knee so it
       doesn't slip and has a more natural motion flow. I learnt and
       understood the sample for facial tracking, and understood the data
       structure for getting the facial points.
       Ogre: I learnt the data flow of Ogre, but explored just the tip of the
       iceberg, as my part was only the tracking.
Initial Software Development Kits (SDKs) suggested: OpenNI, Ogre and Maya

Installation of OpenNI:

On Ubuntu:

First, the following requirements need to be installed before installing OpenNI.

Requirements:

1) GCC 4.x ( http://gcc.gnu.org/releases.html )

sudo apt-get install g++

2) Python 2.6+/3.x ( http://www.python.org/download/ ) This may already be installed, depending
on the Linux distro being used.

sudo apt-get install python

3) LibUSB 1.0.8 ( http://sourceforge.net/projects/libusb/ )

sudo apt-get install libusb-1.0-0-dev

4) FreeGLUT3 ( http://freeglut.sourceforge.net/index.php#download )

sudo apt-get install freeglut3-dev

5) JDK 6.0 ( http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u26-download-
400750.html )

sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"

sudo apt-get update

sudo apt-get install sun-java6-jdk
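After installing the requirements above, a quick way to confirm the toolchain is in place is to check that each command resolves. This is a minimal sketch; extend the list with any other tools you rely on:

```shell
# Verify that the required build tools are on PATH
for tool in g++ python java; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: ok"
  else
    echo "$tool: missing"
  fi
done
```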

Optional Requirements (To build the documentation):
1) Doxygen( http://www.stack.nl/~dimitri/doxygen/download.html#latestsrc )

sudo apt-get install doxygen

2) GraphViz( http://www.graphviz.org/Download_linux_ubuntu.php )

sudo apt-get install graphviz

Optional Requirements (To build the Mono wrapper):

1) Mono ( http://www.go-mono.com/mono-downloads/download.html )

sudo apt-get install mono-complete

Download OpenNI Modules:

Download the OpenNI modules appropriate for your operating system
from http://75.98.78.94/Downloads/OpenNIModules.aspx

Download the latest unstable binaries for these:

   OpenNI binaries
   OpenNI compliant middleware binaries
   OpenNI compliant hardware binaries
For Ubuntu Linux 12.04 64bit, the files will be: (as of 19 June 2012)

   openni-bin-dev-linux-x64-v1.5.4.0.tar
   nite-bin-linux-x64-v1.5.2.21.tar.bz2
  sensor-bin-linux-x64-v5.1.2.1.tar.bz2
Make a new folder called kinect

mkdir kinect

cd kinect

Extract the downloaded files into it. (3 folders are now created).

 1. OpenNI-Bin-Linux64-v1.5.4.0
 2. Sensor-Bin-Linux64-v5.1.2.1
 3. Nite-1.5.2.1
Rename the folders to OpenNI, Sensor and Nite respectively.
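The extract-and-rename steps above can be scripted in one go (assuming the archive and folder names listed earlier; adjust the version strings if your downloads differ):

```shell
# Unpack the three OpenNI archives into ~/kinect and rename the folders
mkdir -p ~/kinect && cd ~/kinect
tar xf  openni-bin-dev-linux-x64-v1.5.4.0.tar
tar xjf nite-bin-linux-x64-v1.5.2.21.tar.bz2
tar xjf sensor-bin-linux-x64-v5.1.2.1.tar.bz2
mv OpenNI-Bin-Linux64-v1.5.4.0 OpenNI
mv Sensor-Bin-Linux64-v5.1.2.1 Sensor
mv Nite-1.5.2.1                Nite
```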

Install OpenNI and Sensor Kinect: (run sudo ./install.sh )

cd kinect

cd OpenNI

sudo ./install.sh

(Every step should now show OK! )

cd ../Sensor

sudo ./install.sh
(Every step should now show OK! )

Install NITE:

cd ../Nite

Enter the kinect/Nite/Data folder and edit each of the three xml files in there changing the key data
from

key=""

to

key="0KOIk2JeIBYClPWVnMoRKn5cdY4="

(use Gedit)
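Instead of editing each file by hand in Gedit, the same key replacement can be done with a one-line sed (assuming the attribute appears literally as key="" in each of the three XML files):

```shell
# Fill in the PrimeSense licence key in all NITE data files at once
cd ~/kinect/Nite/Data
sudo sed -i 's|key=""|key="0KOIk2JeIBYClPWVnMoRKn5cdY4="|' *.xml
```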

sudo ./install.sh

(Every step should now show OK! )

Test if install succeeded:

Test out some samples from OpenNI. Run NiViewer:

cd ~/kinect/OpenNI/Samples/Bin/x64-Release/

./NiViewer

(If a Kinect is connected, this will show depth map and image stream in a window)
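If NiViewer shows nothing, it is worth checking that the Kinect is visible on the USB bus at all. The device usually enumerates with "Xbox NUI" in its descriptor, though the exact string may vary by hardware revision:

```shell
# Look for the Kinect's camera/motor/audio devices on the USB bus
if lsusb | grep -qi "xbox nui"; then
  echo "Kinect detected"
else
  echo "Kinect not found"
fi
```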

Download Sample streams from OpenNI.

If a Kinect is not connected, you can run NiViewer on some pre-recorded .oni files from OpenNI.

ONI Files (OpenNI sample streams recorded onto a file)

http://75.98.78.94/Downloads/OpenNIModules.aspx (extract as skeletonrec.oni and
MultipleHands.oni)

Now run NiViewer from the ~/kinect/OpenNI/Samples/Bin/x64-Release/ folder with the .oni file as
argument.
./NiViewer ~/skeletonrec.oni

(This will show a window with the sample)

(skeletonrec.oni from NiViewer)

Sample Program: (Sample-NiUserTracker)

Run similar to NiViewer

./Sample-NiUserTracker ~/skeletonrec.oni
(skeletonrec.oni from Sample-NiUserTracker)

**********************************************************************************
Installation on Windows:

OpenNI and NITE installation can be painful if not done properly. Let's do it step by step:

Step 0

Uninstall any previous drivers, such as CLNUI. Look at the end of this post if you want to
see how you can have multiple drivers installed.

Step 1

       Download Kinect Drivers and unzip.
       Open the unzipped folder and navigate to Bin folder.
       Run the .msi Windows installer file.

Drivers are now installed in your PC.

Step 2

Download and install the latest stable or unstable OpenNI Binaries from OpenNI website.

Step 3

Download and install the latest stable or unstable OpenNI Compliant Middleware
Binaries (NITE) from OpenNI website.

During installation, provide the following (free) PrimeSense
key: 0KOIk2JeIBYClPWVnMoRKn5cdY4=

Step 4

Download and install the latest stable or unstable OpenNI Compliant Hardware
Binaries from OpenNI website.

Both stable and unstable releases have worked for me. If you have trouble installing the
unstable releases, just try the stable ones.

Step 5

       Plug in your Kinect device and connect it to your PC via USB.
       Wait until the driver software is found and applied.
       Navigate to the Device Manager (Control Panel). You should see something like
       the following:

Step 6

Navigate to C:\Program Files\OpenNI\Samples\Bin\Release (or C:\Program Files
(x86)\OpenNI\Samples\Bin\Release) and try out the existing demo applications.
Try the demos found in C:\Program Files\Prime Sense\NITE\Samples\Bin\Release
(or C:\Program Files (x86)\Prime Sense\NITE\Samples\Bin\Release), too. If they
work properly, then you are done! Congratulations!

Step 7

You have successfully installed Kinect in your Windows PC! Read the documentation and
familiarize yourself with the OpenNI and NITE API. You'll find the proper assemblies in:

        C:\Program Files\OpenNI\Bin (or C:\Program Files (x86)\OpenNI\Bin)
        and
        C:\Program Files\Prime Sense\NITE\Bin (or C:\Program Files
        (x86)\Prime Sense\NITE\Bin)

OpenNI is the primary assembly you'll need when developing Natural User Interface
applications.




Installation of OGRE:
For the installation of Ogre on Linux and Windows, the following link can be very useful.

http://www.ogre3d.org/tikiwiki/Installing+the+Ogre+SDK

With these suggested tools, the Kinect can be exploited very well for skeletal tracking and its
animation. As far as I understood, these tools are limited to C++.

For starters, the Microsoft Kinect SDK is also very useful and a very well-developed tool.

For using the Kinect SDK just follow the link:

http://www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx



In the toolkit, I went through the different examples and understood them and the code flow.

I decided to do the project in C# rather than in C++ because, in C#, it was easily possible to import a
Maya or Blender model in .fbx format, which can then be easily transformed in terms of
orientation and bone movement.

For that, I used Visual Studio with C# to track the skeleton.

The model was imported and animated with the help of the Microsoft XNA Game Studio SDK, which is
basically used for ease in importing the model.

This was done using the SDK library called Microsoft.Kinect.
Other tools and SDKs installed and used:

Blender: http://www.blender.org/download/get-blender/

FbxConverter: http://usa.autodesk.com/adsk/servlet/pc/item?id=10775855&siteID=123112

Unity: http://unity3d.com/unity/download/



The first skeletal tracking code I wrote, after going through the documentation and
understanding the class hierarchy, was for simple head and hand tracking, in C#.

Here is a snapshot of it in action.




Now, using C#, I built on it the code for tracking a full player and colouring them in a specific colour
when detected. Note the different colours for the user and the background.
Using the Avateering sample in the Kinect SDK as a base, I built on it, in C#, the code to track the
model and animate it with the motion of the human in front of the Kinect. Here are a few snapshots of it.

Snapshot of the default model provided in the toolkit.
Note that it has been locked at the knee for a better and more natural flow.




This snapshot shows the model's natural, human-like walking ability with respect to the user.
This depicts the neck and hand movement of the model with the user's movement.




This picture is for the animation of a different character (a zombie).
The picture below is an improved version of my previous simple head and hand tracker. In this
improved code, I can control the cursor with my hand movement, using the Euclidean
distance between the head joint and the right-hand joint to move the cursor left or right.
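The distance test itself is plain 3D Euclidean distance between two joints. As a rough sketch (in shell/awk, with made-up coordinates for illustration; the real values come from the skeleton stream in C#):

```shell
# Euclidean distance between a hypothetical head joint and right-hand joint
# (coordinates in metres; sample values chosen for illustration)
awk 'BEGIN {
  hx = 0; hy = 0; hz = 0      # head
  rx = 3; ry = 4; rz = 0      # right hand
  d = sqrt((hx-rx)^2 + (hy-ry)^2 + (hz-rz)^2)
  printf "distance = %.1f\n", d    # 5.0 for these sample points
}'
```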




For facial tracking, the Kinect SDK has been used. The samples were well studied and understood,
and are quite simple to begin with. Here is the facial tracking snapshot.




For facial animation, the Ogre3D SDK has been studied, and the facial points returned by the function
are theorised to animate the corresponding character in the model.
However, I have still been unable to match the face points with the vector position values of the actual
face. This is something left to work on, and can be done with a bit more work and some time in hand.

        Vipul Divyanshu

        IIL/2012/14

More Related Content

What's hot

Pic programming gettingstarted
Pic programming gettingstartedPic programming gettingstarted
Pic programming gettingstarted
Ajit Padmarajan
 

What's hot (6)

Sikuli
SikuliSikuli
Sikuli
 
Kinectic vision looking deep into depth
Kinectic vision   looking deep into depthKinectic vision   looking deep into depth
Kinectic vision looking deep into depth
 
Getting started with immersive technologies
Getting started with immersive technologiesGetting started with immersive technologies
Getting started with immersive technologies
 
Pic programming gettingstarted
Pic programming gettingstartedPic programming gettingstarted
Pic programming gettingstarted
 
Sikuli
SikuliSikuli
Sikuli
 
Sikuli script
Sikuli scriptSikuli script
Sikuli script
 

Similar to Vipul divyanshu documentation on Kinect and Motion Tracking

Kinect kunkuk final_
Kinect kunkuk final_Kinect kunkuk final_
Kinect kunkuk final_
Yunkyu Choi
 
Kinect installation guide
Kinect installation guideKinect installation guide
Kinect installation guide
gilmsdn
 
Xbox one development kit 2 copy - copy
Xbox one development kit 2   copy - copyXbox one development kit 2   copy - copy
Xbox one development kit 2 copy - copy
rojizo frio
 
Better With Friends: Android+NFC+Arduino
Better With Friends: Android+NFC+ArduinoBetter With Friends: Android+NFC+Arduino
Better With Friends: Android+NFC+Arduino
Pearl Chen
 
Corey.Berry.Portfolio.2016
Corey.Berry.Portfolio.2016Corey.Berry.Portfolio.2016
Corey.Berry.Portfolio.2016
Corey Berry
 

Similar to Vipul divyanshu documentation on Kinect and Motion Tracking (20)

Kinect kunkuk final_
Kinect kunkuk final_Kinect kunkuk final_
Kinect kunkuk final_
 
Kinect installation guide
Kinect installation guideKinect installation guide
Kinect installation guide
 
Kinect
KinectKinect
Kinect
 
Kinect
KinectKinect
Kinect
 
Concerto motionsummer2011week1
Concerto motionsummer2011week1Concerto motionsummer2011week1
Concerto motionsummer2011week1
 
Introduction to TensorFlow and OpenCV libraries
Introduction to TensorFlow and OpenCV librariesIntroduction to TensorFlow and OpenCV libraries
Introduction to TensorFlow and OpenCV libraries
 
2018/03/28 Sony's deep learning software "Neural Network Libraries/Console“ a...
2018/03/28 Sony's deep learning software "Neural Network Libraries/Console“ a...2018/03/28 Sony's deep learning software "Neural Network Libraries/Console“ a...
2018/03/28 Sony's deep learning software "Neural Network Libraries/Console“ a...
 
Kinect Arabic Interfaced Drawing Application
Kinect Arabic Interfaced Drawing ApplicationKinect Arabic Interfaced Drawing Application
Kinect Arabic Interfaced Drawing Application
 
Xbox one development kit 2 copy - copy
Xbox one development kit 2   copy - copyXbox one development kit 2   copy - copy
Xbox one development kit 2 copy - copy
 
Kinect Lab Pt.
Kinect Lab Pt.Kinect Lab Pt.
Kinect Lab Pt.
 
Xbox One Kinect
Xbox One KinectXbox One Kinect
Xbox One Kinect
 
Let’s get real: An introduction to AR, VR, MR, XR and more
Let’s get real: An introduction to AR, VR, MR, XR and moreLet’s get real: An introduction to AR, VR, MR, XR and more
Let’s get real: An introduction to AR, VR, MR, XR and more
 
Microsoft Kinect for Human-Computer Interaction
Microsoft Kinect for  Human-Computer InteractionMicrosoft Kinect for  Human-Computer Interaction
Microsoft Kinect for Human-Computer Interaction
 
Augmenting reality: Bring digital objects into the real world
Augmenting reality: Bring digital objects into the real worldAugmenting reality: Bring digital objects into the real world
Augmenting reality: Bring digital objects into the real world
 
Becoming a kinect hacker innovator v2
Becoming a kinect hacker innovator v2Becoming a kinect hacker innovator v2
Becoming a kinect hacker innovator v2
 
Hacking the Kinect with GAFFTA Day 4
Hacking the Kinect with GAFFTA Day 4Hacking the Kinect with GAFFTA Day 4
Hacking the Kinect with GAFFTA Day 4
 
Better With Friends: Android+NFC+Arduino
Better With Friends: Android+NFC+ArduinoBetter With Friends: Android+NFC+Arduino
Better With Friends: Android+NFC+Arduino
 
The not so short introduction to Kinect
The not so short introduction to KinectThe not so short introduction to Kinect
The not so short introduction to Kinect
 
Corey.Berry.Portfolio.2016
Corey.Berry.Portfolio.2016Corey.Berry.Portfolio.2016
Corey.Berry.Portfolio.2016
 
openGl configuration_in visual studio 2019.pptx
openGl configuration_in visual studio 2019.pptxopenGl configuration_in visual studio 2019.pptx
openGl configuration_in visual studio 2019.pptx
 

Recently uploaded

Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functions
KarakKing
 
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdfVishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
ssuserdda66b
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
heathfieldcps1
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
ciinovamais
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
AnaAcapella
 

Recently uploaded (20)

Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functions
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdfVishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Single or Multiple melodic lines structure
Single or Multiple melodic lines structureSingle or Multiple melodic lines structure
Single or Multiple melodic lines structure
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and Modifications
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 

Vipul divyanshu documentation on Kinect and Motion Tracking

  • 1. Skeletal Tracking and Facial Tracking andAnimation Project Documentation Vipul Divyanshu IIL/2012/14 Summer Internship Mentor: Imagineer India Innovation Labs Tasks at hand: *Skeletal Tracking *Facial Tracking *Skeletal animation and integration with the model(additional) Tools Explored: OpenNI Microsoft Kinect SDK Ogre 3D Unity Blender Analysis of the tools and what was explored: OpenNI:-OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API that enables communication with both: Vision and audio sensors (the devices that ‘see’ and ‘hear’ the figures and their surroundings.) Vision and audio perception middleware (the software components that analyse the audio and visual data that is recorded from the scene, and comprehend it). For example, software that receives visual data, such as an image, returns the location of the palm of a hand detected within the image. OpenNI supplies a set of APIs to be implemented by the sensor devices, and a set of APIs to be implemented by the middleware components.
  • 2. A major plus point of OpenNI is that we can run the skeletal tracking work and work on the recorded Kinect video in .oni format. Constrains of OpenNI:- What I found out was for an easier start of project Microsoft Kinect SDK would be more useful for a full body skeletal tracking and its animation C#. This is because OpenNI cannot easily load a model in OpenNI directly from Maya or Blender, instead SDKs like Ogre 3D must be used. Microsoft Kinect SDK: -Microsoft Kinect come with classes for skeletal tracking and it is much easier to initialise Kinect and get the skeletal joint points just by calling the skeletal.joint in C#. For developing in C++ both Microsoft Kinect and OpenNI stand at same level. I developed in C# and found XNA and Microsoft Kinect very useful in both getting the skeletal data and interacting with the 3D model in .fbx format. Still the exploration of Kinect SDK in the facial animation field is left as it wasn’t a part my work package. FBX :-Fbx format used with C# is very flexible and is has no problem with object collision. Constrains:- Working directly with the recorded data from the Kinect Studio .xed hasn’t been well explored due to lack of material online and that can be restricting. For facial tracking, it cannot be detected if the eyes are open or not, rest other features are detected properly. Ogre 3D: -Ogre can be used in animation and in sync in the OpenNI or Kinect SDK .It wasn’t explored much by me. It has a vast amount of potential for facial animation. Constrains:-Requires a bit of detailed study about the model and the class hierarchy to perform the animation. It is easier for experienced person. Things Learnt and coded: OpenNI: I learnt basic of extracting the skeletal joints and how to work with the recorded data and the class hierarchy of the OpenNI and there attributes, and understood the data flow. Wrote and analysed the code for the getting the skeletal data of the user. 
Microsoft Kinect SDK: I learnt basic of extracting the skeletal joints and how to work with the recorded data and the class hierarchy of the
  • 3. Microsoft Kinect and there attributes, and understood the data flow. And then learnt C# .Using Kinect SDK I wrote few code blocks to add the sample and three complete programs for skeletal tracking and user interface. Wrote the code for controlling the cursorand colouring the player detected with different colour. There screenshots can be found below. Understood how to load and work with 3D model into C# and control it with user motion and locked the default to the knee so the model doesn’t slip and has a more natural motion flow. Learnt and understood the sample for facial tracking and understood the data structure for getting the facial points. Ogre: Learnt the data flow of Ogre and explored just the tip of the ice burg of Ogre as my part was just the tracking. Initial Software Development tools (SDKs) suggested: OpenNI, Ogre and Maya Installation of OpenNI: In Ubuntu:- First the following requirements need to be installed to install OpenNI. Requirements: 1. GCC 4.x ( http://gcc.gnu.org/releases.html ) sudo apt-get install g++ 2) Python 2.6+/3.x ( http://www.python.org/download/ ) This may already be installed, based on the linuxdistro being used. sudo apt-get install python 3) LibUSB 1.0.8 ( http://sourceforge.net/projects/libusb/ ) sudo apt-get install libusb-1.0-0-dev 4) FreeGLUT3 ( http://freeglut.sourceforge.net/index.php#download ) sudo apt-get install freeglut3-dev 5) JDK 6.0 ( http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u26-download- 400750.html ) sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner" sudo apt-get update sudo apt-get install sun-java6-jdk Optional Requirements (To build the documentation):
  • 4. 1) Doxygen( http://www.stack.nl/~dimitri/doxygen/download.html#latestsrc ) sudo apt-get install doxygen 2) GraphViz( http://www.graphviz.org/Download_linux_ubuntu.php ) sudo apt-get install graphviz Optional Requirements (To build the Mono wrapper): 1) Mono ( http://www.go-mono.com/mono-downloads/download.html ) sudo apt-get install mono-complete Download OpenNI Modules: Download the OpenNI modules appropriate for your operating system fromhttp://75.98.78.94/Downloads/OpenNIModules.aspx Download the latest unstable binaries for these: OpenNI binaries OpenNI compliant middleware binaries OpenNI compliant hardware binaries For Ubuntu Linux 12.04 64bit, the files will be: (as of 19 June 2012) openni-bin-dev-linux-x64-v1.5.4.0.tar nite-bin-linux-x64-v1.5.2.21.tar.bz2 sensor-bin-linux-x64-v5.1.2.1.tar.bz2 Make a new folder called kinect mkdirkinect cdkinect Extract the downloaded files into it. (3 folders are now created). 1. OpenNI-Bin-Linux64-v1.5.4.0 2. Sensor-Bin-Linux64-v5.1.2.1 3. Nite-1.5.2.1 Rename the folders as, OpenNI, Sensor, Nite respectively. Install OpenNI and Sensor Kinect: (run sudo ./install.sh ) cdkinect cd OpenNI sudo ./install.sh (Every step should now show OK! ) cd ../Sensor sudo ./install.sh
  • 5. (Every step should now show OK! ) Install NITE: cd ../Nite Enter the kinect/Nite/Data folder and edit each of the three xml files in there changing the key data from key="" to key="0KOIk2JeIBYClPWVnMoRKn5cdY4=" (useGedit) sudo ./install.sh (Every step should now show OK! ) Test if install succeeded: Test out some samples from OpenNI. Run Niviewer cd ~/kinect/OpenNI/Samples/Bin/x64-Release/ ./NiViewer (If a Kinect is connected, this will show depth map and image stream in a window) Download Sample streams from OpenNI. If kinect is not connected, you can run NiViewer on some pre-recorded .oni files from OpenNI. ONI Files (OpenNI sample streams recorded onto a file) http://75.98.78.94/Downloads/OpenNIModules.aspx (extract as skeletonrec.oni and MultipleHands.oni) Now run NiViewer from the ~/kinect/OpenNI/Samples/Bin/x64-Release/ folderwith the oni file as argument.
  • 6. ./NiViewer ~/skeletonrec.oni (This will show a window with the sample) (skeletonrec.oni from NiViewer) Sample Program: (Sample-NiUserTracker) Run similar to NiViewer ./Sample-NiUserTracker ~/skeletonrec.oni
• 7. (skeletonrec.oni in Sample-NiUserTracker) ********************************************************************************** Installation in Windows: OpenNI and NITE installation can be painful if not done properly, so let's do it step by step. Step 0: Uninstall any previous drivers, such as CLNUI. (See the end of this document if you want to have multiple drivers installed.) Step 1: Download the Kinect drivers and unzip them. Open the unzipped folder, navigate to the Bin folder and run the Windows .msi file. The drivers are now installed on your PC. Step 2: Download and install the latest stable or unstable OpenNI binaries from the OpenNI website. Step 3: Download and install the latest stable or unstable OpenNI-compliant middleware binaries (NITE) from the OpenNI website. During installation, provide the following (free) PrimeSense key: 0KOIk2JeIBYClPWVnMoRKn5cdY4= Step 4: Download and install the latest stable or unstable OpenNI-compliant hardware binaries from the OpenNI website. Both stable and unstable releases have worked for me; if you have trouble installing the unstable releases, just try the stable ones. Step 5: Plug your Kinect device's USB cable into your PC. Wait until the driver software is found and applied, then check the Device Manager (Control Panel). You should see something like the following: Step 6: Navigate to C:\Program Files\OpenNI\Samples\Bin\Release (or C:\Program Files
• 8. (x86)\OpenNI\Samples\Bin\Release) and try out the existing demo applications. Also try the demos found in C:\Program Files\Prime Sense\NITE\Samples\Bin\Release (or C:\Program Files (x86)\Prime Sense\NITE\Samples\Bin\Release). If they work properly, then you are done! Congratulations! Step 7: You have successfully installed Kinect on your Windows PC! Read the documentation and familiarise yourself with the OpenNI and NITE APIs. You'll find the relevant assemblies in C:\Program Files\OpenNI\Bin (or C:\Program Files (x86)\OpenNI\Bin) and C:\Program Files\Prime Sense\NITE\Bin (or C:\Program Files (x86)\Prime Sense\NITE\Bin). OpenNI is the primary assembly you'll need when developing natural user interface applications. Installation of OGRE: for installing Ogre on Linux and Windows, the following link is very useful: http://www.ogre3d.org/tikiwiki/Installing+the+Ogre+SDK With these suggested tools the Kinect can be exploited very well for skeletal tracking and animation, though as far as I understood they are limited to C++. For starters, the Microsoft Kinect SDK is also very useful and a very well-developed tool. To use the Kinect SDK, follow this link: http://www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx In the toolkit I went through the different examples and understood them and the code flow. I decided to do the project in C# rather than C++ because in C# it was easy to import a Maya or Blender model in .fbx format and then transform it, in the sense of orientation and bone movement. I used Visual Studio and C# to track the skeleton. The model was imported and animated with the help of the Microsoft XNA Game Studio SDK, which basically eases importing the model. This was done using the Microsoft.Kinect library from the SDK.
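The core of driving a model from tracked joints is converting joint positions into bone rotations. As a hedged illustration (sketched in Python rather than the project's C#, with made-up coordinates), the bend angle at a joint such as the elbow can be recovered from three tracked positions:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c.

    a, b, c are (x, y, z) joint positions, e.g. shoulder, elbow, wrist
    as returned by a skeletal tracker.
    """
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# A straight arm (shoulder, elbow, wrist collinear) gives ~180 degrees:
print(round(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))))  # 180
# A right-angle bend at the elbow gives ~90 degrees:
print(round(joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0))))  # 90
```

Such an angle can then be applied to the matching bone of the imported .fbx model each frame; the real avateering sample works with full joint orientations, but the per-joint geometry is the same idea.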
• 9. Other tools and SDKs installed and used: Blender: http://www.blender.org/download/get-blender/ FBX Converter: http://usa.autodesk.com/adsk/servlet/pc/item?id=10775855&siteID=123112 Unity: http://unity3d.com/unity/download/ The first skeletal tracking code I wrote, after going through the documentation and understanding the class hierarchy, was a simple head and hand tracker in C#. Here is a snapshot of it in action. Building on it in C#, I then wrote code to track a full player and colour him in a specific colour when detected. Note the different colours for the user and the background.
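The player-colouring step can be sketched as follows (in Python rather than the project's C#, with made-up sample values). This assumes the Kinect v1 depth format, where each 16-bit depth sample carries the player index in its low three bits:

```python
def colour_pixels(depth_frame, player_colour=(0, 255, 0), background=(0, 0, 0)):
    """Return one RGB tuple per depth sample: tracked-player pixels get
    player_colour, everything else gets the background colour.

    Each raw sample is assumed to pack the depth in the high 13 bits and
    the player index (0 = no player, 1-6 = tracked player) in the low 3.
    """
    out = []
    for raw in depth_frame:
        player_index = raw & 0x7        # extract the low 3 bits
        out.append(player_colour if player_index else background)
    return out

# Two background samples and one belonging to player 1:
frame = [(800 << 3) | 0, (1200 << 3) | 1, (950 << 3) | 0]
print(colour_pixels(frame))  # [(0, 0, 0), (0, 255, 0), (0, 0, 0)]
```

In the real C# code the same per-pixel test is done over the depth stream before writing the coloured pixels into a bitmap.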
• 10. Using the Avateering sample in the Kinect SDK as a base, I built in C# the code to track the user and animate the model with the motion of the person in front of the Kinect. Here are a few snapshots of it. Snapshot of the default model provided in the toolkit.
• 11. But notice that it has been locked at the knee for a better and more natural flow. This snapshot shows the model's natural-looking walking ability with respect to the user.
• 12. This depicts the neck and hand movement of the model following the user's movements. This picture shows the animation of a different character (a zombie).
• 13. The picture below is of an improved version of my earlier simple head and hand tracker. In this improved code I can control the cursor with my hand movement: I use the Euclidean distance between the head joint and the right-hand joint to decide whether to move the cursor left or right. For facial tracking the Kinect SDK has been used. The samples were well studied and are quite simple to begin with. Here is the facial tracking snapshot. For facial animation the Ogre3D SDK has been studied, and the plan is to use the facial points returned by the tracker to animate the corresponding character in the model.
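The head-to-hand test behind the cursor control above can be sketched like this (Python rather than the project's C#; the dead-zone threshold and coordinates are illustrative values, not taken from the project):

```python
import math

def euclidean(p, q):
    """Euclidean distance between two (x, y, z) joint positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def cursor_direction(head, right_hand, dead_zone=0.15):
    """Decide cursor movement from the hand's offset relative to the head.

    Positions are (x, y, z) in metres. If the hand is within dead_zone of
    the head the gesture is ignored; otherwise the sign of the horizontal
    offset picks the direction.
    """
    if euclidean(head, right_hand) < dead_zone:
        return "idle"
    return "right" if right_hand[0] > head[0] else "left"

print(cursor_direction((0.0, 0.5, 2.0), (0.4, 0.3, 2.0)))   # right
print(cursor_direction((0.0, 0.5, 2.0), (-0.4, 0.3, 2.0)))  # left
```

A dead zone of this kind keeps the cursor from jittering when the hand rests near the head; the actual thresholds would be tuned against the tracker's noise.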
• 14. But I was still unable to match the face points with the vertex position values of the actual face. This remains to be worked on and can be completed with some more work and some time in hand. Vipul Divyanshu IIL/2012/14
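One possible starting point for the unsolved face-point-to-vertex matching is a nearest-neighbour search over the mesh vertices; a minimal sketch in Python, with made-up coordinates, assuming the tracked points and the mesh have already been brought into the same coordinate space (which was the unsolved part):

```python
import math

def nearest_vertex(point, vertices):
    """Index of the mesh vertex closest to a tracked face point.

    Brute-force nearest neighbour; a k-d tree would scale better for
    dense face meshes.
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return min(range(len(vertices)), key=lambda i: dist(point, vertices[i]))

# Three mesh vertices; the tracked point (0.9, 0.1, 0.0) is closest to index 1:
mesh = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(nearest_vertex((0.9, 0.1, 0.0), mesh))  # 1
```

Once each tracked face point is bound to a vertex (or a small neighbourhood of vertices), the per-frame offsets of the tracked points can be applied as vertex displacements to drive the facial animation.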