For the last couple of years (!) we have been discussing, on and off, automatically flying the Crazyflie from a PC using OpenCV and a camera. We made an attempt a while ago using a PS3 Eye camera that wasn't very successful. One of the issues was detecting the distance from the Crazyflie to the camera; another was a lag of about one second, which made it impossible to control the Crazyflie using the video as input. Last week this discussion came up again and we finally decided to buy an Xbox 360 Kinect. Its image resolution is higher than the PS3 Eye's, and of course it can measure the distance from the camera to the objects it sees.
The goal was to create a proof-of-concept application that shows that this is possible, as well as providing a stub for anyone interested in doing more development. The application uses the normal image as well as the depth image from the Kinect to estimate where the Crazyflie is. This can easily be done by attaching a small colored ball to the Crazyflie and using a white background for flying. The images are processed and the current X/Y/Z coordinates are given to a control loop. The control loop consists of three PID regulators that correct the roll/pitch/thrust sent to the Crazyflie to reach an X/Y/Z set-point in the image. The X/Y set-point can be selected by clicking the image, or you can hold down the mouse button and drag it around. A cool feature we would like in the future is to draw a geometric shape that the Crazyflie could follow, but there's still a lot of work to be done on the control loop before we can achieve that.
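To make the control-loop idea concrete, here is a minimal sketch of three independent PID regulators, one per axis: X position to roll, Y position to thrust, and depth to pitch. The class, function names, gains, and the 15 FPS time step are all illustrative assumptions, not taken from the actual application.

```python
class PID:
    """Basic PID regulator with a fixed time step."""

    def __init__(self, kp, ki, kd, dt=1.0 / 15):  # ~15 FPS from the Kinect (assumed)
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# One regulator per axis; the gains are made up for this sketch.
roll_pid = PID(kp=0.5, ki=0.0, kd=0.1)     # X position in the image -> roll
thrust_pid = PID(kp=20.0, ki=5.0, kd=8.0)  # Y position in the image -> thrust
pitch_pid = PID(kp=0.5, ki=0.0, kd=0.1)    # depth from the Kinect   -> pitch


def control_step(setpoint_xyz, measured_xyz):
    """One loop iteration: image/depth coordinates in, corrections out."""
    sx, sy, sz = setpoint_xyz
    mx, my, mz = measured_xyz
    roll = roll_pid.update(sx, mx)
    thrust = thrust_pid.update(sy, my)
    pitch = pitch_pid.update(sz, mz)
    return roll, pitch, thrust
```

In the real application each regulator would also need output limiting and integral windup protection before the values are safe to send to the copter; this sketch only shows the structure.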
After seeing the video you might be asking yourself where the James Bond theme music or the inverted pendulum is. Well, we aren't quite there yet and will probably not even get near it. But still, this shows that the concept works. And for anyone who's interested, it's possible to try some basic trajectory planning algorithms at home with a Crazyflie and a Kinect.
For more technical details, have a look at the Kinect page in the hacks section of the wiki.
9 comments on “Autopilot using Kinect and a PC”
Can this also be used for an autonomous system?
The control of the Crazyflie is done via the Crazyflie Python API, but there are also community versions available for C and Ruby. As long as you can interface with any of these libraries to send control commands to the Crazyflie, you can use any library you like for the image processing and autopilot regulation on the PC.
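As a hedged sketch of that interface point: with the Crazyflie Python API (cflib) the control loop's output ends up in a call like `commander.send_setpoint(roll, pitch, yawrate, thrust)` on a connected Crazyflie's commander object. The wrapper below is duck-typed so it runs without a radio or hardware; the `RecordingCommander` stand-in and the thrust limits are assumptions made for illustration.

```python
THRUST_MIN, THRUST_MAX = 10001, 60000  # illustrative thrust limits


def clamp(value, lo, hi):
    """Keep a value inside [lo, hi]."""
    return max(lo, min(hi, value))


def send_correction(commander, roll, pitch, thrust, yawrate=0.0):
    """Forward one control-loop output to the copter.

    `commander` only needs a send_setpoint(roll, pitch, yawrate, thrust)
    method; with cflib this would be the commander of a connected
    Crazyflie object. Yaw rate defaults to 0 to keep the heading fixed.
    """
    commander.send_setpoint(
        roll, pitch, yawrate, int(clamp(thrust, THRUST_MIN, THRUST_MAX))
    )


class RecordingCommander:
    """Stand-in for a real commander so the sketch runs without hardware."""

    def __init__(self):
        self.sent = []

    def send_setpoint(self, roll, pitch, yawrate, thrust):
        self.sent.append((roll, pitch, yawrate, thrust))
```

Because only the `send_setpoint` method is assumed, the same autopilot loop could drive the C or Ruby community libraries through a thin adapter with the same shape.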
What are the specs of the computer in the video?
I get around 15 FPS.
I'm working on a laptop with:
OS: Xubuntu 13.04
CPU: i5 2.40GHz
RAM: 4 GB
We were running it on a computer with the following specs:
OS: Ubuntu 13.04
CPU: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
RAM: 16 GB
15 FPS might be a bit low to get this working properly. There might be some optimizations to be done, so I've added an issue to investigate it.
Thanks for your reply.
My computer is quite a bit slower than yours.
It's just a 1st-gen Intel i5-450M.
I'll try the code on another computer :).
Good work! I'd like to ask how you control the yaw. Manually, trying to keep it always at 0, or in another way?
When flying with the Kinect the target yaw is always 0, so the Crazyflie will keep the yaw at 0. Without more markers than just the red ball it would be hard to detect the rotation of the platform.
So, just setting the yaw to zero (is the Crazyflie controlled by yaw or yaw rate?), does the real yaw keep the initial heading without any drift or error?
Yes, just setting the yaw to zero will keep the Crazyflie from rotating. There's some drift over time, but when we tested it, it wasn't a problem. It's the yaw rate that is controlled.