We’re still busy with administrative work and preparing everything for release, so sorry for the lack of tech posts. Hopefully there will be more time for those later :-)
But we did spend one night this week trying out something that we have talked about forever: using OpenCV to auto-pilot the Crazyflie. Controlling the Crazyflie from a Python script only takes a couple of lines, and then you are ready to go. Add some object tracking to that and you can make an autonomous Crazyflie…or you could make a crashing one, like in the video below… The video was shot using a PlayStation Eye lying on the floor. The camera has good potential for tracking since it’s low-resolution, cheap and can do up to 120 fps. The plan is to use the size of the detection to control the thrust, and its center to control the roll and pitch.
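As a rough sketch of that mapping, here is what such a control step could look like in Python. This is not the actual test code; the function name, gains and thrust values are illustrative assumptions (the detection is assumed to be a blob center in pixels plus an area, with the camera looking straight up).

```python
def detection_to_setpoint(cx, cy, area, frame_w=320, frame_h=240,
                          target_area=1500.0, hover_thrust=38000):
    """Map a blob detection to (roll, pitch, thrust) setpoints.

    cx, cy -- blob center in pixels
    area   -- blob size in pixels^2; grows as the copter gets closer
    All gains here are illustrative placeholders, not tuned values.
    """
    # Offset from the image center, normalised to [-1, 1]
    x_err = (cx - frame_w / 2.0) / (frame_w / 2.0)
    y_err = (cy - frame_h / 2.0) / (frame_h / 2.0)

    # Simple proportional terms: center offset -> roll/pitch (degrees),
    # blob size vs. a target size -> thrust correction
    roll = 10.0 * x_err
    pitch = 10.0 * y_err
    thrust = hover_thrust + int(3000 * (target_area - area) / target_area)

    # Clamp to sane ranges
    roll = max(-15.0, min(15.0, roll))
    pitch = max(-15.0, min(15.0, pitch))
    thrust = max(10001, min(60000, thrust))
    return roll, pitch, thrust
```

In a real loop one would run this once per camera frame and send the resulting setpoint to the copter via the Python API, which is exactly where the camera latency mentioned below becomes a problem.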
Unfortunately the latency was too large for doing a control loop for roll/pitch/thrust so it crashes. But hopefully in the future we, or someone in the community, will have some more time to spend on this. We think that it definitely has potential!
Part of this test was also to have another project where we use the Crazyflie Python API to make sure that it can easily be dropped into other projects.
11 comments on “Crazyflie OpenCV tracking test”
That’s a great try, but it seems unusual to track the vehicle from a stationary camera.
It might be better to track the environment from a camera on the vehicle, like these pals did.
Is it possible to compress such low-resolution images with the onboard STM32 chip? How many spare milliseconds/ticks are left?
Or could the quad only carry a separate camera device with an RF link?
Vicon, http://www.vicon.com/, is a stationary motion tracking system that is commonly used, but you have a good point. Ultimately you want the quad to track itself.
You could probably do some low-resolution image processing on the STM32 chip; there is about 60% CPU free. The difficulty is making a camera add-on light and small enough while still keeping it affordable.
My idea for this would be using IR LEDs on the quadcopter and a Wiimote on the ground.
That would probably be equally good, at least to some extent.
Just out of pure curiosity: any chances for source code? (I don’t mind ugliness)
I’m afraid you will have to wait, but we are getting pretty close to a first release now, just a couple of weeks away (but we’re not promising anything).
If it’s just the motion-tracking code for OpenCV that you are interested in then you can have a look here: http://stackoverflow.com/questions/3374828/how-do-i-track-motion-using-opencv-in-python
@tobias totally understand. Will wait. In the meantime, I’ll go through the link that Marcus posted. Thanks guys.
The code has been released along with the pre-ordering. We have to be better at advertising it. We use Bitbucket: https://bitbucket.org/bitcraze
My guess is that you can do decent control in the X-Y (camera view) plane, but size estimation is going to be too uncertain to pinpoint the distance from the camera well.
For short distances you can simply use a second camera, and triangulate their images. Longer baseline will give you better accuracy, up to the point where you effectively point the cameras at 90 degrees and define a volume to fly within.
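As a rough sketch of the two-camera idea, assuming two identical cameras with parallel optical axes on a common baseline (the function name and numbers are illustrative, not from any actual setup), depth follows directly from the horizontal disparity:

```python
def triangulate_depth(x_left, x_right, focal_px, baseline_m):
    """Depth from horizontal disparity between two parallel cameras.

    x_left, x_right -- pixel x-coordinate of the target in each image
    focal_px        -- focal length expressed in pixels
    baseline_m      -- distance between the two cameras in metres
    """
    disparity = x_left - x_right  # pixels; shrinks as the target moves away
    if disparity <= 0:
        raise ValueError("target must be in front of both cameras")
    # depth = f * B / d : a longer baseline B gives a larger disparity
    # for the same depth, hence better depth resolution
    return focal_px * baseline_m / disparity
```

The `f * B / d` relation also shows why a longer baseline helps, as the comment says: the same depth change produces a larger disparity change, so pixel-level measurement noise matters less.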
On 3-2 you indicated that the code has been released. I downloaded it and got it to work so far, but I cannot find the code for the OpenCV tracking test. Where can I find that code on Bitbucket?