
We have briefly mentioned the Active marker deck earlier in our blog and in this post we will describe how it works and what it is all about.

The Active marker deck is a result of our collaboration with Qualisys, a Swedish manufacturer of high end optical tracking systems. Optical tracking systems are often referred to as motion capture (mocap) systems and use cameras to track markers on an object. By using multiple cameras it is possible to calculate the 3D position of the markers, and the object they are attached to, with very high precision and accuracy. It is common to use mocap systems in robotics labs to track the position and orientation of robots, for instance quadrotors.

Passive markers

The most common marker type is the passive marker: reflective spheres that are attached to the robot. Infrared flashes on the cameras maximize the visibility of the markers and make it easier for the system to detect and track them. We are selling the Motion capture marker deck to make it easy to attach markers to a Crazyflie.

To get the full pose (position, roll, pitch, yaw) of a robot, the markers must be placed in a configuration that makes it possible for the mocap system to identify the orientation. This means that there must be some asymmetry in the marker positions to understand what is front, back, up, down and so on.

With a swarm of Crazyflies, unique marker configurations make it possible to distinguish one individual from another and track all drones simultaneously. With a larger number of robots it becomes cumbersome, though, to place markers in unique configurations, and one approach to solving this problem is to have known start positions for all individuals and keep track of their motions over time instead. This solution is used in the Crazyswarm for instance, and all Crazyflies can use the same marker configuration in this setup. Another approach is to make it possible to distinguish one marker from another: enter the Active marker deck.

Active markers

It is possible to use infrared LEDs instead of passive markers; these are called active markers. The LEDs are triggered by the flash from the cameras and they are easily detected as strong points of light. Since they emit light, they can be detected further away from the camera than a passive marker, and their smaller physical size also keeps them better separated when they are far away and only a few pixels are available to detect them in the camera.

Furthermore, Qualisys has a technology that makes it possible to assign an ID to each marker, which enables the tracking system to identify individual markers and thus uniquely identify individuals in a swarm. With different IDs on the markers, there is no need to have asymmetrical configurations and the marker layout can be the same on all drones. It also reduces the risk of errors in the estimated pose, since there is more information available.

The deck

The Active marker deck is designed to go on top of the Crazyflie and has four arms with one LED each. The arms are as long as possible to maximize the signal/noise ratio in the cameras, while still short enough to be protected from crashes by the motors. There is an STM32F0 on the deck that takes care of the LEDs and the handling of IDs, so the main Crazyflie CPU does not have to spend any time on this.

The status of the deck is that the hardware is fully functional (we might want to move something around before we produce it though) and that there is a basic implementation of the firmware. IDs are assigned to the markers using parameters in the standard parameter framework in the client or from a script.
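As a rough sketch of what assigning IDs from a script could look like with the python lib (this is not the final API; the parameter group and names used below, "activeMarker.front" and so on, are assumptions and may change before release, so check the parameter list in the client):

    # Sketch: assign one unique marker ID per LED arm via the parameter framework.
    # Parameter names are assumptions for illustration only.
    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    URI = 'radio://0/80/2M'  # placeholder address

    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        for name, marker_id in [('front', 1), ('back', 2), ('left', 3), ('right', 4)]:
            scf.cf.param.set_value('activeMarker.' + name, str(marker_id))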

We will start production of the deck in the near future and it will be available in the store this autumn. Qualisys added support for rigid bodies using active markers in V2019.3 of the QTM tracking software.

Summer is here and temperatures are rising. Since many of us will be on holidays, we will focus this quarter on a special summer clean up! See here what we are working on:

  • Fixing issues: This time we are aiming to close many of the issue tickets in our Github repositories, so that after the summer everything will run much more smoothly (we hope!). Our test rig will definitely come in very handy to sniff out more issues in terms of radio communication as well. You can help too! Everybody who is using and developing with the STM firmware, NRF firmware, python library or anything else and notices issues, please make a ticket in that same Github repository (if you are familiar with the code) or post about it on our forum (if you do not know exactly what is going on). Together we can make the code better.
  • Lighthouse calibration: In March we released our lighthouse deck for positioning with the HTC Vive base stations. We did feel that the setup process could be improved further, since currently the Crazyflies’ firmware must contain hardcoded information about the SteamVR base station positions. We will try to apply the factory calibration directly from the base stations themselves. This will enable us to do 2 additional things: (1) The Crazyflie with the LH deck itself could be used to set up the Lighthouse system, so that SteamVR would not be necessary anymore. (2) Only 1 base station is needed for positioning instead of 2, which will improve robustness in case line-of-sight to one base station is lost.
  • Documentation: We try to provide all the possible information for everybody to be able to do anything they want with their Crazyflie. But with high flexibility comes great responsibility… for proper documentation! We are planning to restructure all of our media outlets and try to improve the flow and level of detail for our users. We hope to make it easier for beginning developers to get started and for more advanced developers to gain a better understanding of the system in order to implement their own awesome ideas. So our very first step is to restructure and clean up the Bitcraze wiki and see where we can add more content.
  • Products: We have a lot of products coming out in the 2nd half of the year
    • AI Deck: We are working hard to get the AI deck all ready for production and we are estimating that they will be available for early access in late autumn. Keep a look out on our forum for regular updates on the progress!
    • Lighthouse breakout board: We made our first working prototype of the lighthouse breakout board, which should make it easier for the lighthouse positioning system to also work on other platforms than the Crazyflie.
    • Active Marker Deck: We are very much on track with the Qualisys active marker deck! It should be available in the Autumn.
    • Crazyflie Bolt: This has been sent off to production for the early access version, which should be available in the Autumn!

We have now come to the point where we will start manufacturing the Crazyflie Bolt, formerly known as the RZR. You might wonder why we changed the name… Well, the name RZR implies it is a racer quad, and it really isn’t. This is mainly because of the power distribution design, which is limited to around 8A per motor. However, by using your own PDB it will work well for that too. But that is not the intention; it is rather intended to have the strengths of the Crazyflie 2.1 but in a slightly bigger package. Therefore we wanted a better name for it, and after a brainstorming session we came up with the name Bolt. Both because it is a Crazyflie building block, a bolt used to fasten things, and because it has the potential to be fast, as in a lightning bolt. Great name, right? :-)

The CF-Bolt development has been pushed back many times because of other, more promising products, but finally it is getting here. If things go according to plan, the Crazyflie Bolt should be in our shop in Aug-Sept. If you want to read up on the history and what it is all about, read about the first flight of the almost-final prototype here.

A quick recap of the features:

  • Fully compatible with the CF2 firmware, expansion decks as well as radio.
  • Connectors to attach motor controllers (it is also possible to solder them directly) so it is easy to build and repair.
  • Power distribution built into the controller board (max ~8A per motor controller) with an XT30 connector.
  • Motor controllers can be switched off by the system (MOSFET) so the system can go into deep sleep and only consume around 50uA.
  • Voltage input 1S-4S (3V to 17V).
  • Standard mounting (M3 mounting holes spaced 30.5mm in a square).
  • External antenna for increased range.
  • SPI-connected IMU (BMI088) for minimum latency.

While running our ICRA demo, we came across a bug in the Crazyflie python-lib radio handling, limiting the number of Crazyflies that could be controlled using one Crazyradio PA. Communication with many Crazyflies is crucial as flying swarms is becoming an increasingly interesting topic for research and education. So we decided to tackle the problem and create a radio test-bench:

To make the test-bench we have attached 10 roadrunner boards to a plank of wood together with USB switches that can provide enough power to the roadrunners. We used the roadrunner because it is mechanically easier to use in this context and it has an identical architecture to the Crazyflie 2.1 when it comes to the radio implementation.

Initially we will use the test-bench to run test scripts that push the communication to its limits and that consistently test the communication stack functionality. This should allow us to find bugs and verify that we solve them, as well as discover and document limitations.
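As a hedged illustration of the kind of test script we have in mind (a sketch, not our actual test code; the URIs, the logged variable and the numbers are placeholders), one simple check is to count how many high-rate log packets actually make it over each link:

    # Open one link at a time through a single Crazyradio PA and count 10 ms
    # log packets; large gaps hint at dropped packets or other radio problems.
    import time
    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.log import LogConfig
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
    from cflib.crazyflie.syncLogger import SyncLogger

    URIS = ['radio://0/80/2M/E7E7E7E701', 'radio://0/80/2M/E7E7E7E702']  # placeholders
    DURATION = 10.0  # seconds per link

    def stress_link(uri):
        lg = LogConfig(name='stress', period_in_ms=10)
        lg.add_variable('stabilizer.roll', 'float')
        with SyncCrazyflie(uri, cf=Crazyflie(rw_cache='./cache')) as scf:
            received = 0
            end_time = time.time() + DURATION
            with SyncLogger(scf, lg) as logger:
                for _ in logger:
                    received += 1
                    if time.time() > end_time:
                        break
            # At a 10 ms period we expect roughly 100 packets per second
            print('%s: %d packets in %.1f s' % (uri, received, DURATION))

    if __name__ == '__main__':
        cflib.crtp.init_drivers()
        for uri in URIS:
            stress_link(uri)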

Eventually we want to connect a Raspberry Pi to the test-bench and run tests for each commit and pull-request to the crazyflie-firmware, crazyflie2-nrf-firmware and crazyflie-lib-python projects. This will guarantee that we do not introduce new limitations in the communication stack. The test-bench will also be very useful when implementing new functionality like direct Crazyflie-to-Crazyflie P2P communication.

As a final note, the Crazyswarm project is not affected by the Crazyflie-lib bug since it is using the C++-implemented crazyflie-ros driver. Hence Crazyswarm can control more Crazyflies per Crazyradio PA, so it is still the preferred way to fly a swarm, especially when using a motion capture system. Though, with the progress made on LPS and Lighthouse positioning, running swarms using the python API directly is probably a more lightweight alternative.

Only a week left until we stand in our ICRA booth in Montreal and give you a glimpse of what we do here at Bitcraze. As we have been writing about earlier, we are aiming to run a fully automated demo. We have been fine-tuning it over the last couple of days and if something unpredictable doesn’t break it, we think it is going to be very enjoyable. For those that are interested in the juicy details, check out this informative ICRA 2019 page, but if you are going to visit, maybe wait a bit so you don’t get spoiled.

Apart from the demo we are also going to show our products as well as some new things we are working on. The brand new things include:

AI-deck, Active marker deck and Lighthouse-4 deck
  • AI-deck: This is a collaborative product between GreenWaves Technologies, ETH Zurich and Bitcraze. It is based on the PULP-shield that the Integrated and System Laboratory has designed. You can read more about it in this blog post. The difference with the PULP-shield is that we have added an ESP32, the NINA-W102 module, so that video can be streamed over WiFi. We hope this will ease development and add more use cases.
  • Active marker deck: Another collaboration, but this time with Qualisys. This will make tracking with their motion capture cameras easier and better. Some more details in this blog post. Qualisys will have the booth just next to us where it will be possible to see it in a live demo!
  • Lighthouse-4 deck: Using the Vive lighthouse positioning system this deck adds sub-millimeter precision to the Crazyflie. This is the deck used in the demo and could become the star of the show.

Adding to the above we will of course also display our recently released products:

  • Crazyflie 2.1: The Crazyflie 2.1 is an improvement of the Crazyflie 2.0 while keeping backward compatibility.
    • Better radio performance and external antenna support: With a new radio power amplifier we’ve improved the link quality and added support for dual antennas (on-board chip antenna and external antenna via u.FL connector)
    • Better power button: We’ve gotten feedback that the power button breaks too easily, so now we’ve replaced it with a more solid alternative.
    • Improved battery cable fastening: To avoid weakening of the cables over time they are now run through a cable relief.
    • Improved sensors: To make the flight performance better we’ve switched out the IMU and pressure sensor. The new Crazyflie uses the drone specialized sensor combo BMI088 and BMP388 by Bosch Sensortech.
  • Flow deck v2: The Flow deck v2 has been upgraded with the new ST VL53L1x which increases the range up to 4 meters
  • Z-ranger deck v2: The Z-ranger v2 deck has been upgraded with the new ST VL53L1x which increases the range up to 4 meters
  • Multi-ranger deck: The Multi-ranger deck adds VL53L1x sensors in all directions for mapping and obstacle avoidance.
  • MoCap marker deck: The motion capture deck with support for easy attachment of passive markers for motion capture camera tracking.
  • Roadrunner: The Roadrunner is released as early access and the hardware is basically a Crazyflie 2.1 without motors and with up to 12V input power. This enables other robots or systems to use the loco positioning system.

You can find us in booth 101 at ICRA 2019 (in Montreal, Canada), May 20 – 22. Drop by and say hi, check out the products & demo and tell us what you are working on. We love to hear about all the interesting projects that are going on. See you there!

Hi everyone, here at the Integrated and System Laboratory of the ETH Zürich, we have been working on an exciting project: PULP-DroNet.
Our vision is to enable artificial intelligence-based autonomous navigation on small size flying robots, like the Crazyflie 2.0 (CF) nano-drone.
In this post, we will give you the basic ideas to make the CF able to fly fully autonomously, relying only on onboard computational resources, that means no human operator, no ad-hoc external signals, and no remote base-station!
Our prototype can follow a street or a corridor and at the same time avoid collisions with unexpected obstacles even when flying at high speed.


PULP-DroNet is based on the Parallel Ultra Low Power (PULP) project envisioned by the ETH Zürich and the University of Bologna.
In the PULP project, we aim to develop an open-source, scalable hardware and software platform to enable energy-efficient complex computation where the available power envelope is of only a few milliwatts, such as advanced Internet-of-Things nodes, smart sensors — and of course, nano-UAVs. In particular, we address the computational demands of applications that require flexible and advanced processing of data streams generated by sensors such as cameras, which is beyond the capabilities of typical microcontrollers. The PULP project has its roots in the RISC-V instruction set architecture, an innovative open-source academic and research architecture alternative to ARM.

The first step to make the CF autonomous was the design and development of what we called the PULP-Shield, a small form factor pluggable deck for the CF, featuring two off-chip memories (Flash and RAM), a QVGA ultra-low-power grey-scale camera and the PULP GAP8 System-on-Chip (SoC). The GAP8, produced by GreenWaves Technologies, is the first commercially available embodiment of our PULP vision. This SoC features nine general purpose RISC-V-based cores organised in an on-chip microcontroller (1 core, called Fabric Ctrl) and a cluster accelerator of 8 cores, with 64 kB of local L1 memory accessible at high bandwidth from the cluster cores. The SoC also hosts 512kB of L2 memory.

Then, we selected as the algorithmic heart of our autonomous navigation engine an advanced artificial intelligence algorithm based on DroNet, a Convolutional Neural Network (CNN) that was originally developed by our friends at the Robotic and Perception Group (RPG) of the University of Zürich.
To enable the execution of DroNet on our resource-constrained system, we developed a complete methodology to map computationally-intense deep neural networks on the PULP-Shield and the GAP8 SoC.
The network outputs two pieces of information, a probability of collision and a steering angle, that are translated into the commands used to control the drone: respectively, a forward velocity and an angular yaw rate. The layout of the network is the following:
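As a hedged illustration of this output mapping (not the code running onboard PULP-DroNet; the gains, limits and filter constant below are made up), the translation could look roughly like this:

    # Illustrative sketch of mapping the CNN outputs to drone commands, in the
    # spirit of the original DroNet approach. All constants are placeholders.
    V_MAX = 1.0      # maximum forward velocity [m/s]
    YAW_GAIN = 90.0  # maps normalized steering [-1, 1] to a yaw rate [deg/s]
    ALPHA = 0.7      # low-pass filter constant for smoother commands

    v_prev, yaw_prev = 0.0, 0.0

    def dronet_to_setpoint(p_collision, steering):
        """Convert (collision probability, steering angle) to (velocity, yaw rate)."""
        global v_prev, yaw_prev
        v_target = (1.0 - p_collision) * V_MAX  # slow down when a collision looks likely
        yaw_target = steering * YAW_GAIN        # steer to follow the street or corridor
        # Exponential smoothing to avoid jerky commands
        v_prev = ALPHA * v_prev + (1.0 - ALPHA) * v_target
        yaw_prev = ALPHA * yaw_prev + (1.0 - ALPHA) * yaw_target
        return v_prev, yaw_prev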

Therefore, our mission was to deploy all the required computation onboard our PULP-Shield mounted on the CF, enabling fully autonomous navigation. To put the problem into perspective, in the original work by the RPG, the DroNet CNN enabled autonomous navigation of big-size drones (e.g., the Parrot Bebop). In the original use case, the computational power and memory were not a problem thanks to the streaming of images to a remote base-station, typically a laptop consuming 30-100 Watt or more. So our mission required running a similar workload within 1/1000 of the original power.
To make this work, we combined fixed-point arithmetic (instead of “traditional” floating point), some minimal modifications to the original topology, and optimised memory and computation usage. This allowed us to squeeze DroNet into the ultra-small power budget available onboard. Our most energy-efficient configuration delivers 6 frames-per-second (fps) within only 64 mW (including all the electronics on the PULP-Shield), and when we push the PULP platform to its limit, we achieve an impressive 18 fps within just 3.5% of the total CF’s power envelope — the original DroNet was running at 20 fps on an Intel i7.
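If you are not familiar with fixed-point arithmetic, here is a toy illustration of the idea (this is neither the PULP toolchain nor the actual DroNet quantisation): values are stored as integers in a Q-format and multiplied with integer operations, with a shift to renormalise the result.

    # Toy fixed-point example using a Q-format with 8 fractional bits (illustrative).
    FRAC_BITS = 8

    def to_fixed(x):
        return int(round(x * (1 << FRAC_BITS)))

    def fixed_mul(a, b):
        # Integer multiply, then shift back to keep the fractional scaling
        return (a * b) >> FRAC_BITS

    w, x = to_fixed(0.37), to_fixed(-1.25)
    print(fixed_mul(w, x) / (1 << FRAC_BITS))  # approximately -0.46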

Do you want to check for yourself? All our hardware and software designs, including our code, schematics, datasets, and trained networks have been released and made available for everyone as open source and open hardware on Github. We look forward to other enthusiasts’ contributions, both in hardware enhancements and in software (e.g., smarter networks), to create a great community of people interested in working together on smart nano-drones.
Last but not least, the piece of information you were all waiting for. Yes, soon Bitcraze will allow you to enjoy our PULP-Shield; actually, even better, you will get to play with its evolution! Stay tuned as more information about the “code-name” AI-deck will be released in upcoming posts :-).

If you want to know more about our work:

Questions? Drop us an email (dpalossi at iis.ee.ethz.ch and fconti at iis.ee.ethz.ch)

As part of our collaboration with Qualisys we are helping them develop an active marker deck for their motion capture cameras. One of the major benefits of an active marker deck is that it can have an ID, thus it is much easier to track each Crazyflie in e.g. a swarm. Another benefit is an increased range compared to passive markers, thanks to high-power IR-emitting LEDs.

Active marker deck mounted on a Crazyflie

We are currently only in the prototype stage but we have already managed to do initial flight tests, so hopefully we can release it within a couple of months.

We will bring some prototypes to ICRA 2019, come and visit us and Qualisys to check the deck out.

Last week we blogged about the early release version of the lighthouse deck and showed a nice push-around demo of the Crazyflies using the Vive controller. Now we wanted to push the system even further, by making a Lighthouse Painting!

We started by adding an LED-ring deck on the bottom of the Crazyflie 2.1 with the lighthouse deck attached to the top. We were able to access the input of the trackpad of the Vive controller and link it to a specific color / hue value. The LED ring can display any color possible in the RGB range, so in theory you could paint in whatever color you like. For now the brightness was fixed, but this could easily be added to the demo script as well.
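A rough sketch of the idea (not the exact demo script we used; the URI, the tracked-device index and the effect/parameter values are assumptions, so check the parameter list for your firmware) could look like this:

    # Read the Vive controller trackpad through pyopenvr, map its x position to
    # a hue, and push the colour to the LED-ring deck via Crazyflie parameters.
    import time
    import colorsys
    import openvr
    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    URI = 'radio://0/80/2M'  # placeholder address
    CONTROLLER_INDEX = 1     # placeholder tracked-device index of the controller

    cflib.crtp.init_drivers()
    vr = openvr.init(openvr.VRApplication_Other)

    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        scf.cf.param.set_value('ring.effect', '7')  # solid-colour effect (assumed id)
        while True:
            ok, state = vr.getControllerState(CONTROLLER_INDEX)
            if ok:
                hue = (state.rAxis[0].x + 1.0) / 2.0  # trackpad x in [-1, 1] -> hue in [0, 1]
                r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
                scf.cf.param.set_value('ring.solidRed', str(int(r * 255)))
                scf.cf.param.set_value('ring.solidGreen', str(int(g * 255)))
                scf.cf.param.set_value('ring.solidBlue', str(int(b * 255)))
            time.sleep(0.05)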

To capture the light trace, we needed to make a long-exposure image; therefore, the flight arena needed to stay completely dark. Luckily, this was easy to do for us since we do not have any windows in our new testing arena. Our camera is the Canon D5600 with a manually controlled shutter time (press to open the shutter and press again to close it). The aperture was set at F22. Nevertheless, this is very dependent on the environment, so we had to do some trial and error in order to get this parameter right.

Aperture too wide… perfect!

Once we had the setup finished, we made several long-exposure photo paintings with one person controlling the camera and another painting the picture into thin air. Of course, the artist needed to imagine the creation, as we were not able to see the result until after the picture was taken. Also, big gestures were required in order to complete the painting, as the Crazyflie’s and the Vive controller’s movements were synced 1:1, so adding some multiplication factor would come in handy. Nonetheless, the results were amazing.

Some nice examples of a single Crazyflie flying based on the Vive’s position, changing color based on the trackpad

We took it even further by making the Crazyflie fly a predefined trajectory and planned color scheme without the Vive controller. First, it flew three concentric circles in green, red and blue using the high-level commander with the PID controller setting. The circles would probably have closed more neatly with the Mellinger controller setting, though. We were also able to reproduce the Bitcraze logo in the same fashion. In both long-exposure photos it is still possible to see the Crazyflie, as it is traceable due to its regular LED activity, so you can easily observe where it took off and where it flew in between shapes.
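A hedged sketch of what such a pre-programmed flight could look like (this is not the actual demo script; the URI, radii, heights and parameter values are placeholders):

    # Fly three concentric circles with the high-level commander, changing the
    # LED-ring colour before each circle. All numbers are illustrative.
    import math
    import time
    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    URI = 'radio://0/80/2M'  # placeholder address

    def fly_circle(cf, radius, height, rgb, points=24, leg_time=0.5):
        cf.param.set_value('ring.solidRed', str(rgb[0]))
        cf.param.set_value('ring.solidGreen', str(rgb[1]))
        cf.param.set_value('ring.solidBlue', str(rgb[2]))
        for i in range(points + 1):
            a = 2 * math.pi * i / points
            cf.high_level_commander.go_to(radius * math.cos(a), radius * math.sin(a),
                                          height, 0.0, leg_time)
            time.sleep(leg_time)

    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        cf = scf.cf
        cf.param.set_value('commander.enHighLevel', '1')
        cf.param.set_value('ring.effect', '7')  # solid-colour effect (assumed id)
        cf.high_level_commander.takeoff(0.5, 2.0)
        time.sleep(2.0)
        for radius, colour in [(0.3, (0, 255, 0)), (0.5, (255, 0, 0)), (0.7, (0, 0, 255))]:
            fly_circle(cf, radius, 0.5, colour)
        cf.high_level_commander.land(0.0, 2.0)
        time.sleep(2.0)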

The Crazyflie flying a predefined trajectory in several shapes

The demo python scripts of the above flights can be found here:

And we also took a video of the Bitcraze logo being drawn. The mobile phone camera had some problems focusing in the dark, but it gives a good idea of how things work:

The new Crazyflie 2.1

The Crazyflie 2.0 was released almost 4 years ago now. When we released it we wanted to avoid limiting our users in hardware. We over-designed it with lots of features and power we weren’t using at the time of release. We also put in the deck connector so we could keep users updated with new hardware without having to replace their Crazyflies.

Over the years there have been thousands of users and lots of feedback on the product. Most of it great, but there have of course also been issues that needed to be addressed. The original design concept is still working, with new decks coming out and still free CPU cycles, flash and RAM. So instead of major updates we decided to focus on fixing the issues we’ve seen while keeping backwards compatibility for our users.

So today we’re really excited to announce we’ve released the Crazyflie 2.1! The updated version of the Crazyflie brings improved flight performance, better durability and improved radio stability.

Here’s a list of the updates:

  • Better radio performance and external antenna support: With a new radio power amplifier we’ve improved the link quality and added support for dual antennas (on-board chip antenna and external antenna via u.FL connector)
  • Better power button: We’ve gotten feedback that the power button breaks too easily, so now we’ve replaced it with a more sturdy alternative.
  • Improved battery cable fastening: To avoid weakening of the cables over time they now run through a cable relief.
  • Improved sensors: To make the flight performance better we’ve upgraded the IMU and pressure sensor. The new Crazyflie uses the drone specialized sensor combo BMI088 and BMP388 by Bosch Sensortech. It lowers drift and avoids accelerometer saturation, which makes the IMU more “trustable”.

It’s important to note that the Crazyflie 2.1 is a drop-in replacement for the Crazyflie 2.0. All spare parts and decks are compatible with both the Crazyflie 2.0 and the 2.1.

We even took it so far that the same binary can be flashed on the Crazyflie 2.0 and 2.1 without any special care. The binary will automatically activate the right drivers which means working with mixed groups of 2.0 and 2.1 isn’t a hassle.

When releasing the Crazyflie 2.1 we’ve also updated all the bundles to contain the new version. But even though you can’t get the bundles with the Crazyflie 2.0, there are still some Crazyflie 2.0 units left from the last batch that can be purchased in the E-store.

We are glad to announce that we have manufactured the first batch of Lighthouse positioning decks and hopefully it will be ready to ship by the end of the month!

The Lighthouse positioning deck is a Crazyflie 2 deck capable of receiving IR signals from the HTC Vive tracking base stations (i.e. lighthouses). The base stations work by sweeping IR laser beams that are received by the deck to measure the angle at which each base station sees the receiver. This allows the Crazyflie to estimate its position with great accuracy and thus to fly autonomously.
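As a simplified, hedged illustration of this principle (this is not the code running in the Crazyflie, which fuses the angles in its onboard state estimator; the base-station poses and angles below are made up), each pair of sweep angles defines a ray from a base station, and with two base stations the position can be approximated as the point closest to both rays:

    # Intersect two measured rays from two base stations with known poses.
    import numpy as np

    def ray_direction(azimuth, elevation):
        # Unit vector, in the base-station frame, defined by the two sweep angles
        return np.array([np.cos(elevation) * np.cos(azimuth),
                         np.cos(elevation) * np.sin(azimuth),
                         np.sin(elevation)])

    def closest_point(p1, d1, p2, d2):
        # Midpoint of the shortest segment between the rays p1 + t1*d1 and p2 + t2*d2
        A = np.array([[d1 @ d1, -(d1 @ d2)],
                      [d1 @ d2, -(d2 @ d2)]])
        b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
        t1, t2 = np.linalg.solve(A, b)
        return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0

    # Made-up example: two base stations 4 m apart, 2 m up, axes aligned with the room
    p1, R1 = np.array([0.0, 0.0, 2.0]), np.eye(3)
    p2, R2 = np.array([4.0, 0.0, 2.0]), np.eye(3)
    d1 = R1 @ ray_direction(np.radians(30), np.radians(-40))
    d2 = R2 @ ray_direction(np.radians(150), np.radians(-40))
    print(closest_point(p1, d1, p2, d2))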

The board we produced is very similar architecture-wise to the prototype we showed in previous blog posts. The main physical difference is that we now only have horizontal receivers. This change was made because we do not yet have a satisfactory mechanical solution to mount vertical IR receivers, and we judged that horizontal-only sensors already provide great performance for autonomous flight. Functionally it means that the Crazyflie should fly below the base stations to be able to position itself; we found that flying ~40cm below the base stations gave good flying performance. We will continue looking at solutions to make a deck with more receivers to increase the flight space in the future.

The lighthouse deck acquires the IR pulses transmitted by the lighthouses, and the Crazyflie can then interpret these pulses to estimate its position. We also added soldering pads for a 2.54mm pin header which would allow other microcontroller boards to be interfaced to the deck:

Lighthouse deck architecture

HTC has released 2 versions of the base stations that are incompatible with each other. Version 1 supports 2 base stations per system, and version 2 can support more than 2. We have good initial support for version 1, both in the deck and in the Crazyflie. Version 2 is currently being worked on, but early work shows that the deck should be compatible with version 2 with only a firmware update.

This leads to the current state of the product. The boards have been manufactured and we have received them, but they are currently programmed with a test firmware. As previously stated, the basic functionality is there but we still don’t have a finished bootloader. As soon as this is finished and tested we will start flashing all the boards. After that it is just a matter of adding them to the web-store stock and they will be ready to ship!

For now we consider this deck as early access, which means that we will document it in the wiki and that the software will still be heavily developed. For example, an early limitation that will be worked on is that it is currently required to run SteamVR on a computer to set up the system; this means that you need to have a full Vive VR setup, or at least a Vive gamepad or tracker, to set up your flight space. Eventually we want to make it possible to set up the system with only base stations and a Crazyflie, without using SteamVR.

We have added the deck to our web store so that you can subscribe to get notified as soon as it is in stock, and we will of course post on the blog with more information when this happens. In the meantime we can share again the video we did for the holidays, which was made with 3 Crazyflie 2.1s equipped with the lighthouse deck using 2 V1 base stations: