Raspberry Pi with Kinect

Well, as far as I know there are no success stories about getting images from a Kinect on the Raspberry Pi.

On GitHub there is an issue in the libfreenect repository about this problem. In this comment, user zarvox says that the RPi doesn't have enough power to handle the data coming from the Kinect.

Personally I tried to connect the Kinect to the RPi using OpenNI2 and Sensor, but had no success. And that was not a clever approach, because it's impossible to work with the Microsoft Kinect on Linux using OpenNI2 due to licensing restrictions (well, actually it is not quite impossible: you can use OpenNI2-FreenectDriver + OpenNI2 on Linux to hook up a Kinect. But this workaround is not suitable for the Raspberry Pi, because OpenNI2-FreenectDriver uses libfreenect).

But there are some good tutorials about how to connect the ASUS Xtion Live Pro to the Raspberry Pi: one, two. And how to connect a Kinect to the more powerful ARM-based CubieBoard2: three.


If you manage to plug your Kinect camera into the Raspberry Pi, install guvcview first to see if it works.

sudo apt-get install guvcview

Then type guvcview in the terminal and it should open an options panel and the camera control view. If all of that works and you want to get the raw data to do some image processing, you will need to compile OpenCV (it takes about 4 hours to compile), and after that you can program whatever you want. To compile it, just search on Google; there are lots of tutorials.
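
Once OpenCV is built, a minimal sketch like the one below shows the idea. It assumes the Kinect's color stream is exposed as /dev/video0 through the kernel webcam driver and that the Python bindings (cv2) were built along with OpenCV:

import cv2

# Open the color stream through the V4L2 webcam interface
# (assumption: the Kinect shows up as /dev/video0; adjust the index otherwise).
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("could not open the camera device")

ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read a frame")

# A trivial "image treatment": grayscale conversion and Canny edge detection.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)
cv2.imwrite("frame.png", frame)
cv2.imwrite("edges.png", edges)
cap.release()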


To answer your question: yes, it is possible to get image and depth on the Raspberry Pi!

Here is how.

If you want to use just video (color, not depth), there is already a driver in the kernel! You can load it like this:

modprobe videodev
modprobe gspca_main
modprobe gspca_kinect

You get a new /dev/videoX and can use it like any other webcam!
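
For example, a quick sanity check with OpenCV's Python bindings (assuming the new device is /dev/video0) is just an ordinary webcam capture loop:

import cv2

# The gspca_kinect device behaves like any other webcam, so a normal
# capture loop works (assumption: it was registered as /dev/video0).
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("kinect color", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):   # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()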

If you need depth (which is why you want a Kinect), but have a kernel older than 3.17, you need another driver that can be found here: https://github.com/xxorde/librekinect. If you have 3.17 or newer, then the librekinect functionality is enabled by toggling the gspca_kinect module's depth_mode parameter:

modprobe gspca_kinect depth_mode=1

Both work well on the current Raspbian.
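
If you want to pull those depth frames into your own code, one possible sketch is to read the raw V4L2 buffer and reinterpret it yourself. This is an untested assumption rather than a guaranteed recipe: whether OpenCV can negotiate the driver's depth pixel format depends on your OpenCV build, and libfreenect (see the next answer) is the fallback.

import cv2
import numpy as np

# Sketch only: with depth_mode=1 the same /dev/videoX carries depth frames
# instead of color. We ask OpenCV for the raw buffer and reinterpret it as
# 16-bit samples (assumptions: one sample per pixel, device index 0).
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)   # don't convert to RGB, give us the raw data

ok, raw = cap.read()
if not ok:
    raise RuntimeError("could not read a depth frame from the kernel driver")

depth = np.frombuffer(raw.tobytes(), dtype=np.uint16)
print("received", depth.size, "depth samples, min/max:", depth.min(), depth.max())
cap.release()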


If you intend to do robotics, the simplest thing is to use the Kinect library in ROS: here.

Otherwise you can try OpenKinect. They provide the libfreenect library, which gives you access to the accelerometer, the image & much more.

OpenKinect on GitHub: here

OpenKinect Wiki: here

Here is a good example with code & all the details you need to connect to the Kinect & operate the motors using libfreenect.

You will need a powered USB hub to power the Kinect, and you will need to install libusb.
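
As a rough illustration, assuming the Python wrapper that ships with libfreenect is installed, grabbing a color frame, grabbing a depth frame and driving the tilt motor looks roughly like this (the angle of 10 degrees is just a placeholder):

import freenect

# Drive the tilt motor first (the motor sits on its own USB subdevice).
ctx = freenect.init()
dev = freenect.open_device(ctx, 0)
freenect.set_tilt_degs(dev, 10)        # hypothetical angle; the range is roughly -30..30
freenect.close_device(dev)
freenect.shutdown(ctx)

# Then grab one RGB frame and one depth frame with the synchronous helpers.
rgb, _ = freenect.sync_get_video()     # 640x480x3 uint8 image
depth, _ = freenect.sync_get_depth()   # 640x480 uint16 depth map (11-bit values)
print("rgb:", rgb.shape, "depth:", depth.shape)
freenect.sync_stop()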

A second possibility is to use the OpenNI library, which provides an SDK to develop middleware libraries that interface with your application; there is even an OpenNI lib for Processing here.
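
If you go the OpenNI2 route from Python, the primesense wrapper is one option. A hedged sketch, assuming OpenNI2 plus a driver for your sensor are already installed and the wrapper can find libOpenNI2:

from primesense import openni2

# Load OpenNI2 (you can pass the directory containing libOpenNI2.so if it
# is not on the default library path).
openni2.initialize()

dev = openni2.Device.open_any()
depth_stream = dev.create_depth_stream()
depth_stream.start()

frame = depth_stream.read_frame()
depth = frame.get_buffer_as_uint16()   # flat buffer of depth values
print("got a depth frame with", len(depth), "samples")

depth_stream.stop()
openni2.unload()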