Author: Kimberly McGuire

Last Wednesday we had our first live tutorial event, explaining our Spiral Swarm Demo that we usually show at conferences. About 60 people signed up, and it seems that about 40-50 people were able to join from all parts of the world. There were even several Crazyflie users from Asia who stayed up late especially for this, so we definitely appreciated the dedication!

For those who missed it, you can find the recordings and slides on this event page.

The Tutorial

The first hour we mostly talked about the Lighthouse positioning system, in particular focusing on the base station V2. We had live hands-on sessions where we showed how we set up the system, how to retrieve the calibration data and how to obtain the geometry. The hour ended with showing a Crazyflie flying in the lighthouse system itself.

After the break, we focused on how to achieve more autonomy in the swarm, talking about the limitations of communication, the high-level commander and the app layer. This was also shown hands-on with multiple flying Crazyflies, ending with the fully automatic demo. We were able to keep the demo running for another 30 minutes at the end while we rested up with a drink :)

We used Discord and Mozilla Hubs simultaneously to stream the tutorial. Discord worked out nicely since we could have one channel for the stream and one channel for the chat, which one of us was able to watch continuously. Mozilla Hubs was a nice add-on, however it definitely had some hiccups and streaming quality issues, which is not ideal for following a tutorial. Also, we heard from headset-using participants that being in Virtual Reality for 2 hours is very exhausting.

What next?

We really enjoyed doing the tutorial and speaking one-on-one with our users, so we are likely to organize one again. We are not sure at what frequency yet, but of course we will announce it first. We already have some requests for topics, so we will look into those first. Next time it will probably be a shorter tutorial on Discord only. Mozilla Hubs might still be used, but as a virtual gallery where we put 3D visualizations of what we are working on (like how the base station sweeps work, for instance), so that people can get a better understanding. If you have any requests for topics, please leave a comment below.

We will also try out using our new Discord server as a digital ‘watering hole’ for our users. Here everybody will have the opportunity to chat with each other, share awesome projects and maybe help each other out with certain questions. However, we will not be on Discord ourselves all the time, so we still advise using forum.bitcraze.io as the main place to ask questions and seek support.

Click here to join our Discord server

During the summer we discussed at the office what would be a good substitute for not being able to go to conferences or fairs anymore (see this blogpost). We sparred with a few ideas, ranging from organizing an online competition to a seminar. Although we were initially quite enthusiastic about organizing the competition, the user questionnaire from the previous blog post showed us that many of you are more interested in online tutorials. Based on that, we actually started making some more step-by-step guides, but we would definitely agree that this is not the same as meeting each other face-to-face!

So now we are planning to organize one for real! Our first online live tutorial will be on:

Wednesday 4th of November, 18:00 (CET, Malmö Sweden)

Register for the first session here to indicate your interest and to receive up-to-date information. There is of course no cost involved!

First topic: Spiraling Swarm Demo (Live!)

The last couple of years we have been showing our demo at many robotics conferences and fairs, such as ICRA, IMAV and IROS. Since we do not have the opportunity to do that anymore (at least for the foreseeable future), we thought that a suitable first topic for the online tutorial would be the Spiraling Swarm demo! We will go through the different elements of the demo, including the implementation details on the Crazyflie and the Lighthouse positioning system. We hope to explain all of it in about 20-30 minutes, which should enable you to set the demo up yourself if you want.

We have been thinking about just doing a prerecorded tutorial, however we also really like to talk with our users about their needs and research topics. That is why we think it is important to do it live, where we can answer your questions on the go or after the tutorial. This also means that we will be demonstrating the demo live as well! Afterwards we will wrap up with some social time and a friendly chat :)

Mozilla Hubs and Discord

There are so many options for how exactly to host this event, as there are a gazillion alternatives for video conferencing. Currently we are looking at Mozilla Hubs, which fits nicely with our interest in the Lighthouse positioning system with the HTC Vive base stations. The nice aspect of Hubs is that you don’t need a fancy headset to join, since it is possible to join via your browser or your phone. I (Kimberly) joined a Virtual Reality seminar at the beginning of the pandemic, organized by Roland Meertens of pinchofintelligence.com, and it was definitely a very interesting and fun experience. When giving a presentation, it really felt like people were paying attention and were engaged. So, we recently recreated our own flight lab in VR (using Hubs’ environment creator Spoke) and tested it out ourselves. This way you will be able to see our workplace as well!

Of course, we can imagine not everybody is eager to go full VR. That is why we will combine the online tutorial with Discord, where we will create a video channel to stream the live demo and tutorial. It will also be possible to send messages that are visible in both the VR space and the Discord chat channel, using Hubs’ Discord bot. You can choose where to follow the tutorial: fully in VR, or first on Discord and afterwards socializing in VR. That is totally up to you.

We still need to figure out the specifics, but if you register with your email we will send all the necessary information for the first session to you directly.

IOT conference Malmö

Now something else: tomorrow, namely Tuesday the 5th of October, we will also present at the IOT conference 2020 in Malmö. It is free for participants and it is still possible to register! Come and join if you cannot wait until the 4th of November to see us.

Now that we are all back from our summer holidays, we are back to what we set out to do a while ago: fixing issues and stabilizing code. In the last two weeks we have been focusing on fixing existing issues with the Flowdeck and the LPS positioning system. It is still work in progress, and even though we fixed some problems, we still have some way to go! At least we can give you an update on our work of the last few weeks.

Flow-deck Kalman Improvements

When we started working on the motion commander tutorials (see this blogpost), which are mostly based on flying with the flowdeck, we were also hit by an error that probably many of you know: the Crazyflie flies over a low-texture area, wobbles, flips and crashes. This won’t happen as long as you are flying over high-texture areas (like a children’s play mat, for instance), but in the occasional situation that you are not, it should not crash like it does now. The expected behavior is that the Crazyflie glides away until it flies over something with sufficient texture again (that is the behavior you see when you are flying manually with a controller and just let go of the controls). So we decided to investigate this further.

First we thought that it might have had something to do with the rotation compensation by the gyroscopes, which is part of the measurement model of the flowdeck, since maybe it was overcompensating or something like that. But if you remove that part, the Crazyflie starts wobbling right away, even over high-texture areas… so that was not it for sure. We still think it causes the actual wobbling itself (compensating for flow that is not detected), but we had to dig a bit deeper into the issue.

Eventually we did a couple of measurements. We let the Crazyflie fly an 8 shape over a low- and a high-texture area and logged a couple of important values. These were the detected flow, the ground truth position, and a couple of quality measurements that PixArt’s PMW3901 flow sensor provides itself, namely the number of features (motion.squal) and the automatic shutter time (motion.shutter). With the ground truth position we can compute the ground truth flow that the flowdeck is supposed to measure. From that we can determine the standard deviation of the measured flow versus the ground truth flow, and see if we can find a relation between the error’s STD and the quality values, which resulted in a couple of nice graphs:

Three major improvements were added to the code based on these results:

  • The standard deviation of the flow measurement was increased from 0.25 to 2.0 pixels, since this is a more accurate depiction of the measurement noise to be expected by the Kalman filter.
  • An adaptive STD based on motion.shutter has been implemented (since there is a stronger correlation there than with motion.squal), which can be activated by setting the parameter motion.adaptive to True or 1. It is set to False or 0 by default, since the increased STD of the first improvement already improved the quality of flight significantly.
  • If the flow sensor indicates that no motion is detected (log motion.motion), no measurement value will be sent to the Kalman filter. It will also adjust the time difference (dt) between samples based on the last measurement received.

Now when the Crazyflie flies over low-texture areas with the Flowdeck alone, it will not flip anymore but simply glide away! Check out this closed issue to learn more about the exact implementation; it should be part of the next release.
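If you want to experiment with the adaptive standard deviation yourself, the parameter can be toggled from the python library. The snippet below is only a minimal sketch, assuming a reachable Crazyflie at the example URI; the parameter name motion.adaptive is the one mentioned above.

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # example address, adjust to your setup

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Enable the adaptive flow-measurement STD (False/0 is the default)
    scf.cf.param.set_value('motion.adaptive', '1')
    time.sleep(1)  # give the parameter write a moment to complete
```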

The LPS and Flowdeck

Kalman filter conflicts

The previous fix of the flowdeck also took care of this issue, which caused the Crazyflie to flip in the LPS system as well if it did not detect any flow. This happened because the Kalman filter trusted the flow measurement much more than the UWB distance measurements in the previous firmware version, but not anymore! If the Flowdeck is out of range or can’t detect motion, the state estimation will trust the LPS system more. However, once the Flowdeck detects motion again, it will help out with the accuracy of the position estimate.

Moreover, it is now possible to make the Crazyflie fly in and out of the LPS system area with the Flowdeck! However, be sure that it flies using velocity commands, since there are situations where the position estimate can jump:

  • 1) The LPS system is off, 2) the Crazyflie takes off with only the Flowdeck, 3) the LPS nodes are turned on.
  • 1) Take off in the LPS system, 2) fly out of the LPS system’s reach for a while (the position estimate will drift a bit), 3) fly back into the LPS system with the position estimate drift from the Flowdeck-only flight.

As long as you are flying with velocity commands, like with the assist modes with a controller in the cfclient, this should not be a problem.
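To make ‘flying with velocity commands’ a bit more concrete from the script side, here is a minimal sketch using the low-level commander of the python library. It assumes scf is an already connected SyncCrazyflie (as in the cflib examples) and that the Crazyflie is already airborne; the values are just examples.

```python
import time

def nudge_forward(scf, velocity=0.3, duration=2.0):
    """Fly forward using velocity setpoints only; no absolute positions involved."""
    commander = scf.cf.commander
    for _ in range(int(duration / 0.1)):
        # vx, vy, vz in m/s and a yaw rate in deg/s
        commander.send_velocity_world_setpoint(velocity, 0.0, 0.0, 0.0)
        time.sleep(0.1)  # setpoints have to keep streaming
    # keep sending setpoints (or land) after this, otherwise the Crazyflie
    # will stop getting commands and eventually cut its motors
```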

Deck compatibility problems

The previous fixes only work with the LPS methods TDOA2 and TDOA3. Unfortunately, there is still some work to be done on the deck incompatibility between the TWR method and the Flowdeck. The deck stops working shortly after the Crazyflie is turned on, and this seems to be related to the SPI bus that is shared by the LPS deck and the Flowdeck. Reading the flow sensor takes some time, which blocks the TWR algorithm for a while, making it miss an event. Since the TWR algorithm relies on a continuous stream of events from the DWM1000 chip, it simply stops working if it does not get them… or at least that is our current theory…

Please check out this issue to follow the ongoing discussion. If you have an idea of what is going on, drop a comment and let’s see if we can work together to iron out this issue once and for all!

As you probably already know, we have been wondering how to best handle our documentation and how to provide information to new Crazyflie starters as easily as possible, as you can read in our blog post of two weeks ago. In the meantime, we also had a chance to think about the results of the poll we held when we discussed new ways to meet our users. We got about 30 responses, and it became clear that many of you would like to gain more knowledge about working with the Crazyflie. The majority voted for online tutorials, and although it might be difficult to do those during the summer holidays, we have already started making step-by-step guides for various parts of the Crazyflie ecosystem.

Poll result of alternative events.

Python Library Tutorials

We have currently started with step-by-step tutorials for the CFLIB (the python library of the Crazyflie). Usually we refer to the example pages of the CFLIB; however, we feel that many users copy-paste parts of these scripts for their own purposes without understanding what is actually going on. Therefore, in order to move Crazyflie beginners towards the starting-developer phase, we have made these guides to teach exactly what is going on in each module, step by step.

These tutorials can be found in the python library documentation. The first tutorial focuses on connecting, logging and parameters: it guides you through connecting to the Crazyflie from a python script, starting logging configurations in two ways (asynchronous and synchronous), and reading and setting parameters.
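As a taste of what the first tutorial covers, here is a condensed sketch (not the tutorial code itself) that connects to a Crazyflie and reads a few log values synchronously; the URI is just an example and needs to match your Crazyflie.

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # example address

cflib.crtp.init_drivers()

# Log the attitude at 10 Hz
lg_stab = LogConfig(name='Stabilizer', period_in_ms=100)
lg_stab.add_variable('stabilizer.roll', 'float')
lg_stab.add_variable('stabilizer.pitch', 'float')
lg_stab.add_variable('stabilizer.yaw', 'float')

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, lg_stab) as logger:
        for timestamp, data, _ in logger:
            print(timestamp, data)
            break  # only print the first sample for this sketch
```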

The second tutorial is about the motion commander and is a logical continuation of the first. A nice thing is that we also show how to build some protection into your script: you can check whether the Flowdeck is attached and prevent the Crazyflie from taking off altogether if it is not detected. This will be a life saver in your future endeavors, and trust me, I know from experience ;). Afterwards it goes into how to take off, fly forward and go back to the initial position. In the application at the end of the motion commander tutorial, we also use the logging functionality to get the actual estimated position of the Crazyflie. With this information, we show how to write an application that creates a virtual bounding box that the Crazyflie can bounce around in (like the old Windows screensaver).
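A minimal sketch of the take off / fly forward / go back part is shown below; it assumes scf is a connected SyncCrazyflie with a Flowdeck mounted (the real tutorial also checks for the deck before flying).

```python
from cflib.positioning.motion_commander import MotionCommander

def fly_forward_and_back(scf):
    # Entering the 'with' block takes off to the default height,
    # leaving it lands the Crazyflie again.
    with MotionCommander(scf, default_height=0.5) as mc:
        mc.forward(0.5)  # fly 0.5 m forward...
        mc.back(0.5)     # ...and return to roughly the starting point
```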

The results of the motion commander step by step guide.

We are planning to finish this step-by-step guide by adding the multi-ranger to the mix, continuing on the bouncing ball example. After that we will probably start some tutorials on how to use the swarming functionality before moving on to the firmware or the client.

Work in Progress

The tutorials are still a work in progress, so let us know on the forum, in the python library GitHub repository, or as a comment on this blog post if you see anything wrong or if something is not very clear. This will improve the quality further so that other users can benefit as well. Also, once these step-by-step tutorials are finished, we can start working on video-based tutorials as well.

Remember, it is possible to contribute your own fixes (or tutorials) to our repositories if you want to. It’s an open source project after all ;)

It is apparently a recurring theme within Bitcraze: new people come into the office, claim that the documentation is a bit of a mess, make it their personal mission to try to fix it (because ‘how hard can it be?’), and come close to a mini depression when it turns out that it ain’t so easy at all.

And yes, I absolutely fell into that trap too. During my PhD I did not really work on documentation like this (with the exception of papers), so I made quite ambitious plans last year, as you can read in this blogpost. We have already managed to cross a couple of things off: we moved wiki pages to GitHub and host them on our website, and we created datasheets for the products, which should make it possible to close the wiki product pages.

However, we still have not managed to completely close down the wiki, because some pages cannot really be split up or contain information that might not be very future proof. But there are definitely many things to improve, so we are just writing some of our thoughts down.

Beginner – ? – Developer

One of the things we noticed is missing, also judging from your comments on the forum and by mail, are the means to bring Crazyflie starters quickly to the developer phase. There are some tutorials to be found on our website, but the general feeling is that they do not elevate the general understanding of how everything works. Even the tutorials that cover autonomous flight with a flowdeck do not go further than giving install instructions and handing over the full python script, without explaining which element does what.

Of course, there are already user manuals to be found in the GitHub docs, however those are maybe too big a step and take for granted that the reader knows every ‘in between’ step. It would be much better, for any level, to have step-by-step guides on how to set things up and what each element’s role is in the code. That would probably work much more effectively as a start for beginning developers.

So we have some tutorials in mind that could bring first starters closer to the developer stage:

  • CFCLIENT: How to work with the logging / parameter framework and the plotting tab
  • CFCLIENT: How to interpret the debug console output
  • CFLIB: How to connect to the Crazyflie and read out logs and parameters
  • CFLIB: How to send set points and the commander framework
  • CFLIB: How to build up the Multiranger push demo step by step
  • CF FIRMWARE: How to work with the app layer (adding your own modules or code)

If there are more tutorials that you would like to see, please let us know!

Doc closer to the code

The general consensus here is that we would like to have the documentation as close to the code as possible. We have at least taken a step in the right direction by importing the docs into the GitHub repos. This means that with every new feature added, the person responsible can add documentation directly in the same commit/pull request.

However, if the description is part of a function’s or class’s docstring, it is as close as it can get! The contributor does not need to change a separate markdown file but can change the information directly. Moreover, this also lets us auto-generate documentation, as you can see here from one of our try-outs with Sphinx and our crazyflie-lib-python repo:

Part of the class and module overview of CFLIB with some auto doc of the high level commander.

Maybe for a beginner such documentation would not be great as a start, but for a more experienced developer this could be very useful. My personal problem with most auto-generated documentation is that I find it difficult to read and to find the functions that I need. However, it would be possible to change the layout to make it a bit more readable, since we will host it on our website. And since we mostly use C and Python in our repos, the most logical tools would be Doxygen and Sphinx. There are probably other possibilities out there, but if we want to integrate this into our framework, we would like to go with tools that are future proof.
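To give an idea of what such a try-out involves, here is a minimal Sphinx conf.py fragment; the path and the extension choices are assumptions for the sketch, not our actual configuration.

```python
# conf.py (fragment): auto-generate documentation from docstrings with Sphinx
import os
import sys

# Make the package importable so autodoc can load the modules
sys.path.insert(0, os.path.abspath('..'))

project = 'crazyflie-lib-python'
extensions = [
    'sphinx.ext.autodoc',    # pull documentation out of the docstrings
    'sphinx.ext.napoleon',   # accept Google/NumPy style docstrings
    'sphinx.ext.viewcode',   # link the generated pages back to the source
]
```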

The whole picture

The problem with auto-generated docs is that they mostly show the nitty-gritty details of a library or firmware, so users tend to get lost and cannot see the whole picture. Also, we maintain a lot of libraries and firmware repositories (as you can see in this list), depending on which hardware they are meant for. This means that we have separate documentation pages for almost all of those.

And then comes the decision of where to place information. For instance, CRTP (the Crazy RealTime Protocol) is documented in the Crazyflie firmware documentation, since that is indeed where it is implemented, but CRTP does not only affect the Crazyflie firmware. It goes from the STM32F4 to the nRF to the Crazyradio, over USB to your computer, and through the cflib, which is the backbone of the cfclient. This is a topic that users would like an overview of if they want to develop something with CRTP.

Step-by-step guides are maybe still too detailed to explain the whole picture, so perhaps we should present this overview in some other way. Maybe through an online lecture or a more lesson-like medium?

There are a lot of sources out there (like Write the Docs) that have tips on how to maintain information sources for users, but of course we need a documentation structure that is useful and readable for many types of users and maintainable from our side. Let us know if you want to share any insights from your own experiences!

It has been about a month since the AI-deck became available in Early Access, and by now quite a few of you own an AI-deck yourselves. We have a new development we would like to share: we previously thought we had selected a gray-scale image sensor. However, it came to our attention that the camera actually contains a color image sensor, which, on second viewing of the video presented in this blogpost, is pretty obvious in hindsight (thanks to the PULP project at ETH Zurich for letting us know!).

A color image from the AI-deck

This came as a little surprise, but a color camera also adds some new possibilities, like making the Crazyflie follow an orange ball, or training CNNs that incorporate color in their classification. The only catch is that it requires an extra preprocessing step to retrieve the color image, which is explained in the next section.

Demosaicing

Essentially all CMOS image sensors are gray-scale by definition. In order to retrieve color from a scene, manufacturers add a Bayer filter on top of the image sensor, so that each pixel only receives red, green or blue light. Such a color filter array does not need to be RGB (all kinds of color combinations are used), but here we will only talk about the Bayer filter. If the pattern of the filter is known, the pixels related to a certain color are interpolated with each other in order to fill in the gaps in between. This process is called demosaicing, and it creates the RGB channels that are combined into a color image.

Process of demosaicing with a Bayer filter

Currently we have only implemented a simple nearest-neighbor interpolation scheme for demosaicing, which is fine for demonstration purposes but is not the best technique out there. Such a simple interpolation is not very ‘edge and detail’ aware and can therefore cause artifacts, like the Moiré effects seen below. Anyway, we are still experimenting with how to get a better image and how to translate that to all the examples in the AI-deck example repository (see this issue if you would like to follow or take part in the discussion).
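To make the idea concrete, here is a rough numpy sketch of such a nearest-neighbor demosaic. The RGGB pattern used here is just an assumption for illustration; the actual pattern of the AI-deck sensor should be taken from its datasheet.

```python
import numpy as np

def demosaic_nearest(raw):
    """Naive nearest-neighbor demosaicing of a raw Bayer image.

    'raw' is a (H, W) array straight from the sensor, assumed to have an
    RGGB pattern and even H and W. Each 2x2 Bayer cell simply copies one
    R, one G and one B sample to all four output pixels of that cell.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=raw.dtype)
    # (row, column) offset of the sample used per channel inside each 2x2 cell
    offsets = {0: (0, 0), 1: (0, 1), 2: (1, 1)}  # R, G, B for an RGGB pattern
    for channel, (dy, dx) in offsets.items():
        plane = raw[dy::2, dx::2]
        rgb[:, :, channel] = np.repeat(np.repeat(plane, 2, axis=0), 2, axis=1)
    return rgb
```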

Moiré effect

So technically, once we have the color image, it can be converted to a gray-scale image which can be used for the examples as is. However, there is a reduction in quality, since the full pixel resolution is not used for obtaining the gray-scale image. We are currently discussing whether it would be useful to get the gray-scale version of this camera and make that available as well, so let us know if you would be interested!

Feedback and Early Access

Like we said before, there are now quite a few of you out there who have an AI-deck in your possession. As it is in Early Access, the software part is still in full development. However, since we have not received any negative feedback from you, we believe that everything is fine and peachy!

Just kidding ;) we know that the AI-deck is quite a challenging deck to work with, and we know for sure that many of you probably have questions or something to say about working with it. Buying an Early Access product also comes with a little bit of responsibility. The more feedback we get from you, the more we can tailor the software and support to help you and others, thereby moving the product forward and getting it out of the Early Access phase.

So please let us know if you are having any trouble starting up, by posting a thread on the forum (we have a special AI-deck group!), or if there are any issues with the examples or the documentation of the AI-deck repo. We, and also our collaborators at GreenWaves Technologies (makers of the GAP8 chip), are more than happy to help out. That is what we are here for :)

It has been a few months since the Covid-19 crisis started, but it feels like almost a year ago that we all decided to stay and work from home. Considering the circumstances, we managed to handle ourselves pretty well. We set up home labs in our kitchens and/or living rooms and managed to do a lot of development. Even though this situation did not come easy, as you can see from our experiences here, we were able to pull ourselves through it in one piece. Now we also have to consider that Covid-19 is here to stay, and we need to deal with the complications until at least a vaccine is finished and distributed. Until then, we might have to think about alternative ways to do things, including how we go to events and meet/talk to you all!

Every year we try to go to at least two conferences, with last year being a particularly busy one in which we went to three big events (ICRA 2019, IMAV 2019 and IROS 2019). Before going to those conferences, we usually crunch to make an awesome demo. This also enables us to add new features to the firmware or fix problems that we find during the crunch. Moreover, we really like to meet our users face-to-face, so that we can hear about how you use the Crazyflie in your research or classroom!

Since going to conferences and in-person events will be difficult this year and maybe the next, we have been thinking about events that we could organize as an alternative. We have a couple of options in mind on which we would like your opinion. For instance, we could do a remote tutorial or lecture, like we did here for EPFL. Or maybe we could organize an online seminar where we invite users to give a talk about their work (I personally took part in a VR seminar in Mozilla Hubs, which was pretty awesome). We could also consider inviting users to an online meetup to talk about the direction of the Crazyflie and its firmware. Another idea we had recently is to organize an online Crazyflie competition, where users can control the Crazyflie remotely or upload custom firmware so that it can fly autonomously through an obstacle field.

We have set up a poll with these ideas, so we can find out what you like best! Also, please comment below if you have further ideas about this, or start a thread on the forum!



The AI-deck is now available in our online store! Super-edge-computing is now possible on your Crazyflie thanks to the GAP8 IoT application processor from GreenWaves Technologies. GAP8 delivers over 10 GOPS of compute power at exceptionally low power consumption, enabling complex tasks such as path-finding and target following on the Crazyflie while consuming less than 0.1% of the total energy.

The AI-deck can host artificial intelligence-based workloads like Convolutional Neural Networks onboard. This will open up many research areas focusing on fully onboard autonomous navigation of tiny MAVs, like ETH Zurich’s PULP-Dronet. Moreover, there is also ESP32 WiFi connectivity with the possibility to stream the images to your personal computer.

We are happy that we managed to get everything ready so soon after our last update. The AI-deck is in early access, which means the hardware design is now finalized and full support for building, running and debugging applications (including GreenWaves’ GAPflow tools for porting neural networks to GAP from TensorFlow) is available; however, only limited examples of specific AI-deck applications have been developed so far. Read more about early access here. Even though there is still some work to be done, there are already some examples you can try out, which we will explain in this blog post. Also, we aim to have all AI-decks pre-flashed with the WiFi streamer example so that you can check right away that your AI-deck is working.

Beware that you need a JTAG-enabled programmer/debugger in order to develop for the AI-deck!

Technical specifications

The AI-deck comes with two elongated male pin headers, which enable the user to connect it to the Crazyflie together with an additional deck. There are two 10-pin JTAG connectors soldered on, which enable connection with a JTAG-enabled programmer. This will be the main way to program the GAP8 chip and the ESP-based NINA module while the deck is still in early access.

Getting started

When you first receive your AI-deck, it should be flashed with a WiFi streamer example of the camera image stream. Once the AI-deck is powered up by the Crazyflie, it will automatically create a hotspot called ‘Bitcraze AI-deck Example’. In this repo in the folder named ‘NINA’ you will find a file called viewer.py. If you run this with python (preferably version 3), you will be able to see the camera image stream on your computer. This will confirm that your AI-deck is working.

The next step is to go to the docs folder of the AI-deck examples repository. Try out the WiFi demo and set up your development environment with the getting-started guide. This guide contains links to the GAP-SDK documentation from GreenWaves Technologies. You can read more about the face detector example that we demonstrated in this blog post.

It has been a while since we last updated you all on the AI deck. The last full blog post was in October, with some small updates here and there. It is not that we have not focused on it at all; on the contrary, this has been a high-priority project for a while now. It is just quite a complex board with a lot of bells and whistles, which can be challenging to work with this early in development, something our previous intern can definitely agree with. We therefore wanted to wait until we were able to make sufficient progress before giving you an update… and so we have!

A Crazyflie 2.1 with the AI deck

Together with GreenWaves Technologies we have been trying to get the SDK for the GAP8 chip on the AI deck stable enough for an early release. The latest release of the SDK (version 3.4) has proved to work with relative ease on the AI deck after extensive testing. Currently it is possible to use OpenOCD for flashing and debugging, and it supports most commonly available debuggers with a JTAG connector. In the upcoming weeks both Bitcraze and GreenWaves will test all examples of the SDK on the AI deck to make sure that everything is still compatible. The documentation will be extended as well. As there is so much to document, it might be difficult to catch all of it; however, if you notify us and GreenWaves about anything that is missing once the AI deck is out, that will help us close the knowledge gaps.

The AI deck also contains the ESP-based NINA module for establishing a WiFi connection. This enables users to stream the video feed of the AI deck to their computers, which will be quite an essential tool if they would like to generate their own image database for training CNNs for the GAP8 (and it also happens to be quite practical for debugging, by the way!). Currently it is required to set the credentials of your local WiFi network and reflash the AI deck to be able to connect and stream images, but we are working on turning the NINA into an access point instead, so no reflashing would be required. We hope to be able to implement this before the release.

Top view of the AI deck

We are also trying to adjust applications to make them suitable for the AI deck. For instance, we have adapted GreenWaves’ face-detector example to use the image streamer instead of the display available on the GAPuino boards. You can see a video of the result below. Beware that this face detector is not based on a CNN but on HOG descriptors, so it only works in good conditions where the face is well lit. However, it is possible to train a CNN to detect faces in TensorFlow and flash it onto the AI deck with the GAPflow framework developed by GreenWaves. At Bitcraze we haven’t managed to try that out ourselves (we are close though!), but at least this example is a nice demonstration of the AI deck’s abilities together with the WiFi streamer. This example and more testing code can be found in our experimental repo here. For examples of GAPflow, please check out the examples/NNtool section of the GAP8 SDK.

For some reason WordPress has difficulty embedding the video that was supposed to be here, so please check https://youtu.be/0sHh2V6Cq-Q

Seeing how the development has been progressing, we feel comfortable saying that the AI deck could be ready for early release sometime in the next month, so please keep an eye on our website! We will continue to test the GAP SDK’s stability, and we are very thankful to GreenWaves Technologies for their help so far. We will also work on getting-started guides to help you get acquainted with the AI deck, supplementing the already existing documentation about the GAP8 chip.

Even though the AI deck will soon be ready for early release, this piece of hardware is not for the faint-hearted, and embedded programming experience is a must. But keep in mind that the possibilities with the AI deck are huge, as it means that super-edge-computing on a 30-gram flying platform will be available to anyone. It will all be worth it when you have your Crazyflie flying autonomously while recognizing its surroundings :)

Here is another blog post where we try to explain parts of the stabilizer framework of the Crazyflie. Last time, we talked about the controllers and state estimators as part of the stabilizer.c module, which was introduced in this blog post back in 2016. Today we will go into the commander framework, which handles the setpoints of the desired states that the controllers will try to steer the estimated state towards.

The Commander module

The commander module handles the incoming setpoints from several sources (src/modules/src/commander.c in the firmware). A setpoint can be set directly, either through a python script using the cflib, through the cfclient, or via the app layer (blue pathways in the figure), or by the high-level commander module (purple pathway). The high-level commander, in turn, can be controlled remotely from the python library or from within the Crazyflie.

General framework of the stabilization structure of the Crazyflie with setpoint handling. * This part takes place on the computer through the CFlib for python, so there is also a communication protocol in between. It is left out of this schematic for easier understanding.

It is important to realize that the commander module also checks how long ago a setpoint was received. If it has been a little while (defined by the threshold COMMANDER_WDT_TIMEOUT_STABILIZE in commander.c), it will set the attitude angles to 0 in order to keep the Crazyflie stabilized. If this takes longer than COMMANDER_WDT_TIMEOUT_SHUTDOWN, a null setpoint will be given, which will result in the Crazyflie shutting down its motors and falling from the sky. This won’t happen if you are using the high-level commander.
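In practice this means that, when using the low-level commander from the python library, setpoints have to be streamed continuously to stay ahead of this watchdog. Below is a minimal sketch, assuming cf is a connected Crazyflie object and a height sensor such as the Flowdeck is present (the hover setpoint needs one).

```python
import time

def hover_in_place(cf, seconds, height=0.5):
    """Keep the commander watchdog happy by streaming hover setpoints."""
    end_time = time.time() + seconds
    while time.time() < end_time:
        # vx, vy in m/s, yaw rate in deg/s, target height in m
        cf.commander.send_hover_setpoint(0.0, 0.0, 0.0, height)
        time.sleep(0.1)  # 10 Hz is well within the watchdog thresholds
    cf.commander.send_stop_setpoint()  # cut the motors when done
```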

Setpoint structure

In order to understand the commander module, you must be able to comprehend the setpoint structure. The specific implementation can be found in src/modules/interface/stabilizer_types.h as setpoint_t in the Crazyflie firmware.

There are two levels of control:

  • Position (X, Y, Z)
  • Attitude (pitch, roll, yaw or in quaternions)

These can be controlled in different modes, namely:

  • Absolute mode (modeAbs)
  • Velocity mode (modeVelocity)
  • Disabled (modeDisable)
Setpoint structures per controller level

So if absolute position control is desired (go to point (1, 0, 1) in x, y, z), the controller will obey the values given in setpoint.position.xyz if setpoint.mode.xyz is set to modeAbs. If you rather want to control velocity (go 0.5 m/s in the x-direction), the controller will listen to the values given in setpoint.velocity.xyz if setpoint.mode.xyz is set to modeVelocity. All the attitude setpoint modes will then be set to disabled (modeDisable). If only the attitude should be controlled, then all the position modes are set to modeDisable. This happens, for instance, when you are controlling the Crazyflie with a controller through the cfclient in attitude mode.
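From the python library side, these two modes are typically reached through the corresponding commander calls. A small sketch (cf is a connected Crazyflie object and the numbers are just example values):

```python
# Absolute position mode: x, y, z in meters plus a yaw angle in degrees
cf.commander.send_position_setpoint(1.0, 0.0, 1.0, 0.0)

# Velocity mode: vx, vy, vz in m/s plus a yaw rate in deg/s
cf.commander.send_velocity_world_setpoint(0.5, 0.0, 0.0, 0.0)
```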

High level commander

Structure of the high level commander

As already explained before, the high-level commander handles setpoints from within the firmware based on a predefined trajectory. This was merged as part of the Crazyswarm project of the USC ACT Lab (see this blogpost). The high-level commander uses a planner to generate smooth trajectories based on actions like ‘take off’, ‘go to’ or ‘land’, using 7th-order polynomials. The planner generates a group of setpoints, which are handled by the high-level commander and sent one by one to the commander framework.

It is also possible to upload your own custom trajectory to the memory of the Crazyflie, which you can try out with the script examples/autonomous_sequence_high_level.py of the Crazyflie python library repository. Please see this blogpost to learn more.
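As a small sketch of how the high-level commander can be driven from the python library: cf is assumed to be a connected Crazyflie object, a positioning system is assumed to be in place, and depending on the firmware version the commander.enHighLevel parameter may need to be enabled first.

```python
import time

cf.param.set_value('commander.enHighLevel', '1')  # may be needed on older firmware

hlc = cf.high_level_commander
hlc.takeoff(0.5, 2.0)               # go to 0.5 m altitude in 2 s
time.sleep(2.0)
hlc.go_to(1.0, 0.0, 0.5, 0.0, 3.0)  # x, y, z, yaw, duration
time.sleep(3.0)
hlc.land(0.0, 2.0)
time.sleep(2.0)
hlc.stop()
```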

Support in the python lib (CFLib)

There are four main ways to interact with the commander framework from the python library.

  1. Send setpoints directly using the Commander class from the Crazyflie object; this can be seen in the autonomousSequence.py example, for instance.
  2. Use the MotionCommander class, as in motion_commander_demo.py. The MotionCommander class exposes a simplified API and sends velocity setpoints continuously based on the methods called.
  3. Use the high level commander directly using the HighLevelCommander class on the Crazyflie object, see autonomous_sequence_high_level.py.
  4. Use the PositionHlCommander class for a simplified API to send commands to the high-level commander; see position_commander_demo.py.

Documentation

We are busy documenting the stabilizer framework in the Crazyflie firmware documentation, including the content of this blog post. If you feel that anything is missing or not explained clearly enough about the stabilizer framework, please drop a comment below or post on the forum.