
It’s been a while since we summed up what’s happening in the community, so here are some of the things going on. There’s lots more, so if you think we are missing something, post it in the comments below.

Ralph has been doing some work on a semi-automatic flip feature in the client. There’s more info on the forum and in the video below.

Last week we tested some modifications made by otto for a headfree mode (i.e. yaw only rotates the platform, not the reference direction). It’s a really nice feeling to just rotate without having to keep track of the direction you are going in :-) There’s more information and links to the code on the forum.

The SHERPA project has been working on swarm algorithms using a vision system and the Crazyflie.

Geof from Centeye has been working on optical flow stabilization using the Crazyflie. He has a prototype board working and there’s lots of information about this build in the forum. To see the results, have a look at the video below.

Thanks to Victor, the Deviation firmware for the Devo-7e (custom firmware for Devo RC controllers) now has support for the Crazyflie (a hardware hack is needed). If you would like to give it a try, have a look at the code or grab one of the nightlies.

Researchers at the University of Tokyo have been testing a new concept for a HoverBall using the Crazyflie. Imagine throwing a ball into the air that doesn’t come down (well, not right away at least…). Here’s some more info and a picture.

We have also seen some nice stand-alone controllers for the Crazyflie, one by MidLifeCrisis (more info here) and one by ivandevel (video below). There’s also more info in the forum.

There are also some updates on the work done by Oliver on Kinect tracking of the Crazyflie. A demo video is shown below (it looks great!) and there’s more information on the forum.

And finally, here’s a nice video we found on YouTube showing position control of the Crazyflie using a VICON system.

Continuing on our post from last week, here’s a video showing the Adafruit NeoPixel ring and some of the effects that we did. The client code still needs some cleaning up, but the firmware has been pushed here.

The firmware implementation includes two parameters, ring.neffect and ring.effect. The first parameter reports how many effects are implemented and the second one sets the index of the effect that should be used. In our current implementation we just loop through all the effects with either a joystick button or the thumb together with the Leap Motion. It’s also possible to use these parameters directly from the UI, as seen in this blog post.

Params for NeoPixel
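Since these are ordinary parameters, they can also be driven from a script. Here is a minimal sketch using the cflib Python library; the radio URI, the sleep-based wait for the connection and the effect count are placeholder assumptions, not values from our setup.

```python
# Minimal sketch: stepping through the NeoPixel effects via the param framework.
import time
import cflib.crtp
from cflib.crazyflie import Crazyflie

cflib.crtp.init_drivers()
cf = Crazyflie()
cf.open_link('radio://0/10/250K')   # placeholder URI
time.sleep(3)                       # crude wait for connection and param TOC

# ring.neffect reports how many effects the firmware implements; here we just
# assume a count and cycle through them by writing ring.effect.
ASSUMED_EFFECT_COUNT = 5
for i in range(ASSUMED_EFFECT_COUNT):
    cf.param.set_value('ring.effect', str(i))
    time.sleep(2)

cf.close_link()
```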

 

Some of the effects we implemented are just blinking patterns, but we also wanted the ability to show feedback from the firmware using the ring. This has been enabled by adding an API in the firmware to access variables exposed through the logging framework. Using this API the firmware can access the same variables in the logging as the client. This enables all kinds of fun possibilities. In the video you can see (blurred) our implementation showing which direction the Crazyflie is tilting, but we have also been working on showing thrust (think rocket :-) ) as well as other things. We have done our best to implement a couple of effects, but there are so many more cool effects that could be done. So the firmware allows for custom effects and also for mixing multiple effects together (see this code).
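To illustrate the idea only (the real effects live in the C firmware and read roll/pitch through the log-access API mentioned above), here is how a tilt-feedback effect could map attitude to ring colors, sketched in Python. The LED count, thresholds and scaling are arbitrary example values.

```python
import math

NBR_LEDS = 12  # assumed size of the NeoPixel ring

def tilt_effect(roll_deg, pitch_deg):
    """Return one (r, g, b) tuple per LED; the LED closest to the tilt
    direction is lit, brighter the more the platform is tilted."""
    tilt = math.hypot(roll_deg, pitch_deg)
    direction = math.atan2(roll_deg, pitch_deg)          # 0 rad = straight ahead
    lit = int(round(direction / (2 * math.pi) * NBR_LEDS)) % NBR_LEDS
    brightness = min(255, int(tilt * 10))                # arbitrary scaling
    return [(brightness, 0, 0) if i == lit else (0, 0, 0)
            for i in range(NBR_LEDS)]
```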

A while ago we posted a tutorial on how to modify the firmware to add logging/parameters and to plot/modify them from the client. We have made a continuation of this tutorial that shows how to modify the client to integrate logging and parameters directly into the UI (like we have done on the flight tab). For the tutorial we use our virtual machine to do the development and run the code. Since we build on the concepts and design from the first video, it might be a good idea to see that one first.
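The client UI sits on top of the cflib logging framework, so as a rough illustration (this is not the tutorial code), subscribing to a log variable from a plain Python script could look like the sketch below. The names follow the current cflib API; the URI and log variables are placeholders.

```python
# Sketch: subscribing to the stabilizer roll/pitch log variables with cflib.
import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig

def on_connected(link_uri):
    # The log TOC is available once connected, so set up the block here.
    lg = LogConfig(name='Stabilizer', period_in_ms=100)
    lg.add_variable('stabilizer.roll', 'float')
    lg.add_variable('stabilizer.pitch', 'float')
    cf.log.add_config(lg)
    lg.data_received_cb.add_callback(lambda ts, data, conf: print(ts, data))
    lg.start()

cflib.crtp.init_drivers()
cf = Crazyflie()
cf.connected.add_callback(on_connected)
cf.open_link('radio://0/10/250K')   # placeholder URI
time.sleep(10)                      # log data for a while
cf.close_link()
```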

 

A while ago I started working on a brushless motor control driver for the Crazyflie. I implemented most of it but did not really have time to test it. Recently we have gotten some requests and questions about it, so we took some time to do further testing.

Implementing a brushless motor control driver can be done in many ways. If you have brushless motor controllers that can be controlled over I2C, that would be one way, but usually a brushless motor controller (BLMC) takes a PWM input. This is most commonly a square wave with a period of 20 ms and a pulse width of 1-2 ms, where 1 ms is 0% and 2 ms is 100%. A period of 20 ms means a frequency of 50 Hz. This is most often a high enough update rate for R/C electronics like servos, but when it comes to BLMCs it is not. Therefore many newer BLMCs can read a much higher update rate of up to 400 Hz, where the pulse is still 1-2 ms. That way you can match the BLMC input to the update rate of the stabilization control loop and increase stability. In the code we added a define, BLMC_PERIOD, where this can be set.
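As a rough illustration of the numbers involved (the actual driver is C firmware; the constants here just mirror the timing described above):

```python
# BLMC PWM timing: a shorter period raises the update rate while the
# throttle-carrying pulse stays 1-2 ms wide.
BLMC_PERIOD_US = 2500   # 2500 us period -> 400 Hz update rate (20000 us -> 50 Hz)
PULSE_MIN_US = 1000     # 1 ms pulse -> 0 % throttle
PULSE_MAX_US = 2000     # 2 ms pulse -> 100 % throttle

def pulse_width_us(thrust_ratio):
    """Map a 0.0-1.0 thrust ratio to a pulse width in microseconds."""
    return PULSE_MIN_US + thrust_ratio * (PULSE_MAX_US - PULSE_MIN_US)

print(pulse_width_us(0.5))  # 1500.0 us at half throttle, sent at up to 400 Hz
```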

To test this we wanted a frame that was quick to set up and found this. It is based on a PCB just like the Crazyflie and comes with the four motor controllers, perfect! The built-in BLMCs are based on the Atmel ATmega8 MCU, which is very commonly used in R/C BLMCs, which means it is possible to re-flash them with the SimonK firmware. This is known to be a great firmware and enables a fast PWM update rate etc. So we built and flashed the firmware configured for the tgy6a, which is compatible, and it worked right away, yay!

Now we only had to connect the Crazyflie to the BLMCs on the frame. The BLMC electrical interface for the PWM signal is often a 5V interface, but the Crazyflie runs on 2.8V. 2.8V would in most cases be treated as a high input and could probably be used directly, but there is no simple way to connect this signal on the Crazyflie. Instead, one way is to use the existing motor connectors and the pull-down capability that is already there. It is then also possible to pull this signal to 5V with a resistor to get a 5V interface, so this is what we did. To power the Crazyflie we took the connector from an old battery and soldered it to the 5V output of the frame.

CF to BL Frame connections

Now it was just a matter of testing it! However, as size increases, so does the potential damage it can do, so we took some precautions and tied it down. First we tested the stability on each axis using the stock values and it worked really well, so we decided not to tune it further. The only issue was that one of the BLMC MOSFETs suddenly burnt. We replaced it and it worked again, but we don’t know why it burnt. Later, when we flew it, something still felt strange, so we have to investigate this further.

We will upload the code as soon as it has been cleaned up. Please enjoy a short video of the journey :)

A couple of weeks ago we moved into the Malmö-based business incubator Minc. It’s a great opportunity to get some extra help developing Bitcraze and also to meet people from other start-ups. Below is a picture of a bunch of Crazyflies in our lab (i.e. the table designated as the lab).

Bitcraze Lab

On a more technical note, here’s a video of a blimp that was hacked together with some left-over party balloons and a broken Crazyflie that we had lying around. It’s the day after the party, so there wasn’t that much lift left in the balloons (that’s why we used a bunch of them). After a few hacks to the firmware it actually works pretty well! The motor power has been redistributed and only the yaw regulation is active, which is why the yaw is still pretty stable. We are thinking about pushing the ‘blimp mode’ at some point (first we will need more balloons and helium though :) ).

Before it’s time for the holidays, we thought we would make a new video with a Christmas theme, showing our vision of Christmas gift delivery. For anyone who has seen the Amazon Prime Air video, it’s pretty easy to see where we got the inspiration for ours. Needless to say we didn’t have the same budget as Amazon, but to be fair our quad does cover more ground :-)

Merry Christmas, and don’t forget our holiday challenge!

This Monday post is devoted to community development and we will try to give a short summary of what is going on there. We ourselves haven’t had that much time recently to help out with this development, something we intend to change, so all credit goes to the community!

  • A port of the OpenPilot CC3D firmware to the Crazyflie, done by webbn. Still under development, but the video already shows promising results.
  • Altitude hold functionality, which is being developed in parallel by many: omwdunkley, phiamo, et al. We hope we can soon contribute to this as well.
  • Improved thrust control, which is being discussed a lot; hopefully we will soon see some of the ideas realized.
  • A Ruby cfclient written by hsanjuan.
  • The Android client with a lot of work from derf and sebastian.
  • The FPV implementation, driven mainly by omwdunkley and SuperRoach. Omwdunkley has made an awesome HUD (heads-up display) which we will hopefully see integrated into the cfclient at some point. Check out the video!
We have probably forgotten some of the great development that has been going on recently; if that is the case, please write a comment about it and we will update the post.

For the last couple of years (!) we have been discussing, on and off, automatically flying the Crazyflie from a PC using OpenCV and a camera. We made an attempt a while ago using a PS3 Eye camera that wasn’t very successful. One of the issues we had was detecting the distance from the Crazyflie to the camera. Another was a lag of about 1 second, which made it impossible to control the Crazyflie using the video as input. So last week this discussion came up again and we finally decided to buy an Xbox 360 Kinect. The image resolution is higher than the PS3 Eye’s and of course it has the ability to measure the distance from the camera to the objects it’s seeing.

Crazyflie and Kinect

The goal was to create a proof-of-concept application that shows that it’s possible, as well as providing a stub for anyone interested in doing more development. The application uses the normal image as well as the depth image from the Kinect to estimate where the Crazyflie is. This can easily be done by attaching a small colored ball to the Crazyflie and using a white background for flying. The images are processed and the current X/Y/Z coordinates are given to a control loop. The control loop consists of three PID regulators that correct the roll/pitch/thrust sent to the Crazyflie to reach an X/Y/Z set-point in the image. The X/Y set-point can be selected by clicking the image, or you can hold down the mouse button and drag it around. A cool feature that we would like in the future is to draw a geometric shape that the Crazyflie could follow. But there’s still a lot of work to be done on the control loop before we can achieve that.
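A minimal sketch of the three-PID idea (this is not the actual proof-of-concept code): image-space X/Y/Z errors from the Kinect are turned into roll/pitch/thrust corrections. The gains, hover thrust and axis mapping are placeholder assumptions.

```python
class PID:
    """Basic PID regulator; one instance per controlled axis."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

HOVER_THRUST = 38000                 # rough hover point, platform dependent
roll_pid = PID(0.5, 0.0, 0.1)        # corrects X (left/right in the image)
pitch_pid = PID(0.5, 0.0, 0.1)       # corrects Z (distance from the camera)
thrust_pid = PID(10.0, 1.0, 5.0)     # corrects Y (height in the image)

def control_step(position, setpoint, dt):
    """position and setpoint are (x, y, z) tuples estimated from the Kinect."""
    roll = roll_pid.update(setpoint[0] - position[0], dt)
    pitch = pitch_pid.update(setpoint[2] - position[2], dt)
    thrust = HOVER_THRUST + thrust_pid.update(setpoint[1] - position[1], dt)
    return roll, pitch, thrust       # sent to the Crazyflie each frame
```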

After seeing the video you might be asking yourself where the James Bond theme music or the inverted pendulum is. Well, we aren’t quite there yet and we will probably not even get near it. But still, this shows that the concept works. And for anyone who is interested, it’s possible to experiment with some basic trajectory-planning algorithms at home with a Crazyflie and a Kinect.

For more technical details, have a look at the Kinect page in the hacks section of the wiki.

The first time we saw the video for the Leap Motion we instantly thought “Wow, we have to fly the Crazyflie with this!”. So finally, last week a fresh Leap Motion landed on our desk and we went to work. We were really happy to see that there was an SDK for Linux. Leap Motion listened to the community, which felt that an early release of the code was better than nothing at all. The API is very nice to work with and you can easily get metrics for your hand (like elevation and roll/pitch). So we created a Leap Motion driver in the joystick layer that replaces the normal input device, and it’s flying! We are going to be honest: at first it wasn’t that fun. It was more the concept that was exciting. But as we flew more and more it started getting really great! There’s just something magical about it :-)

You fly the Crazyflie using a single hand. Thrust is calculated from the elevation of your hand above the sensor, so holding your hand higher means more thrust. Pitch/roll is controlled by tilting your hand the same way you want the Crazyflie platform to tilt, so tilting your hand forward tilts the Crazyflie forward. Lastly, the yaw is controlled by rotating your hand in the X/Y plane (around the Z-axis). For thrust/pitch/roll the control values are absolute, but for yaw it’s the rate that you control (just like the normal controls in the Crazyflie client).

As a safety mechanism the Crazyflie will only respond if the Leap Motion detects 4 or more fingers. So holding your fist above the sensor does nothing, but the instant you open up your hand you can start to control the Crazyflie.
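Here is a hedged sketch of this hand-to-control mapping using the Leap SDK v1 Python bindings. The attribute names follow that SDK and the actual driver in our development branch may differ; the scaling constants are placeholders.

```python
import math
import Leap  # Leap Motion SDK v1 Python bindings

MAX_THRUST = 65535
MAX_ANGLE = 30.0   # assumed max roll/pitch in degrees at full hand tilt
HEIGHT_MAX_MM = 300.0  # hand elevation (mm) mapped to full thrust

def read_controls(controller):
    frame = controller.frame()
    hands = list(frame.hands)
    if not hands:
        return None                     # no hand -> no control
    hand = hands[0]

    # Safety mechanism: only respond when at least four fingers are visible,
    # so a closed fist above the sensor does nothing.
    if len(list(hand.fingers)) < 4:
        return None

    # Thrust: absolute hand elevation above the sensor (palm_position.y in mm).
    thrust = int(max(0.0, min(1.0, hand.palm_position.y / HEIGHT_MAX_MM)) * MAX_THRUST)

    # Roll/pitch: absolute hand tilt, clamped to +/- MAX_ANGLE degrees.
    roll = max(-MAX_ANGLE, min(MAX_ANGLE, math.degrees(hand.palm_normal.roll)))
    pitch = max(-MAX_ANGLE, min(MAX_ANGLE, -math.degrees(hand.direction.pitch)))

    # Yaw: hand rotation around the vertical axis, used as a rate.
    yaw_rate = math.degrees(hand.direction.yaw)

    return roll, pitch, yaw_rate, thrust

controller = Leap.Controller()
```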

The code for flying with the Leap Motion is still on a development branch, but if you are eager to try it, have a look at the instructions in the hacks section of the wiki to find out how to get flying! The plan is to merge the Leap Motion code into the main track once we are finished with the development.

Crazyflie and Leap Motion

The community interest in FPV flight made us buy a lightweight analog FPV kit a while ago. We bought the 5.8 GHz micro combo set from fpvhobby to do a first test. It took a little while to solder things together, but it wasn’t that bad. We soldered the camera and transmitter power to the VCOM voltage on the Crazyflie, which is available on the expansion port. Then we taped the camera and transmitter to the Crazyflie and did a test flight. The FPV kit is only about 3 grams, which doesn’t affect the flight performance that much.

It was almost too easy, but as soon as we took off we noticed some vertical lines on the TV, so maybe it wasn’t that easy after all. There seems to be some electrical interference, probably from the motors. Also, the battery voltage drop caused by fast accelerations cuts the video feed after a couple of minutes of flight. There are still some improvements to be made, so hopefully it will work better in the future.

As for production, so far it is still going forward as planned.

Finally, an alpha version of the virtual machine has been posted on the forum; you are welcome to test it and report any problems or suggestions you may have :-).