
A big part of our work is to provide examples, getting-started guides and other documentation to get users started quickly. Documentation should be up to date, understandable and detailed but, at the same time, not overwhelming. Examples should cover common applications and, most importantly, teach how to create your own projects. This is a never-ending task as our ecosystem constantly evolves.

In recent weeks we have updated many parts of the AI-deck documentation and examples. This process is not finished (and never will be), but we thought it would be interesting to give you an overview of what we think most people struggle with, as well as what we have updated – especially as we see many AI-deck related questions in the discussions.
We saw that many struggle with understanding the whole communication chain and the importance of updating all the microcontrollers in it – so we will first give an overview of how everything is connected, and then dive into where to find documentation, which examples already exist and how to get started with your own project on GAP8. Note that this post is centered around the GAP8; we do not go into detail about the NINA WiFi module.

Here we go:

How does the AI-deck fit into the Crazyflie Eco-System? How does it communicate?

As with all other decks, the AI-deck is connected over expansion headers. It gets power directly from the battery (VCOM), and both microcontrollers (GAP8 and the NINA WiFi module) are connected to the STM32 via UART.

AI-deck expansion header connections.

To send messages between all those microcontrollers, the CPX protocol was introduced. As the NINA and GAP8 are also connected to each other (via SPI), we have redundant information paths – so by definition, we always route over the NINA.

Now how can we send code to the GAP8 for it to run?

GAP8 always executes code from L2 (second-level RAM), as it has no internal flash. However, on startup it can load code into L2 over a HyperBus interface from external flash memory (it does this if a fuse is blown – this is already done on your AI-deck and out of scope here). As GAP8 has only volatile memory, it must always load code from exactly the same flash address. To make it easy to update applications, we implemented a bootloader, a minimal program which is the first thing to run on startup. The bootloader can either update the application code in flash or copy it into L2 and, if the code is valid, run it. Why is this easier? First, you don’t need to connect a programmer, as the bootloader can read data over other peripherals (in our case SPI from the NINA module). Second, it is safer – if the update fails (and you, for some reason, end up with random code where your application should be), the firmware will not be valid (the hash computation will fail) and GAP8 will not jump to the corrupt application code but instead safely stay in the bootloader.
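
To make the validity check concrete, here is a minimal sketch of the boot decision in Python – note that the hash algorithm and the helper names are our own, for illustration only; the real bootloader is written in C and its exact checks may differ:

import hashlib

# Illustrative simulation of the boot decision, not the real bootloader code.
# The application image is stored in flash together with its expected hash.
def is_valid(image: bytes, expected_hash: bytes) -> bool:
    return hashlib.sha256(image).digest() == expected_hash

app_image = b"\x00" * 1024  # pretend this was read from external flash over HyperBus
stored_hash = hashlib.sha256(app_image).digest()

if is_valid(app_image, stored_hash):
    print("hash OK -> copy image to L2 and jump to the application")
else:
    print("hash mismatch -> stay in the bootloader and wait for an update over SPI")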

As the chain for the over-the-air update with the bootloader is rather complex, we illustrate the ways to flash GAP8 in the image below.

  • The blue path illustrates how you can program over JTAG – you can either write code directly into L2 to run it (this is volatile memory, so the code will disappear when you power cycle) or you can write it into flash (via GAP8), such that it is loaded on startup, either in place of the bootloader (not recommended) or through the bootloader.
  • The red path uses the cfloader: it sends your code over the Crazyradio to the nRF, then further to the STM32, from there to the ESP32 (the NINA WiFi module) and from there to the GAP8. This path uses CPX messages; you can read more about it in the CPX documentation.
System Architecture of a Crazyflie with an AI-deck connected.

Where is the documentation?

We have tutorials as well as repository documentation. Tutorials guide you through all the steps needed to run a specific example, while the repository documentation aims to document the general infrastructure and the examples in more detail (but without all the steps that are not directly related, such as flashing a bootloader, updating other firmware, etc.).

So when you use your AI-deck for the first time, you should start with the getting started guide.
Then you are most likely interested in a more detailed explanation of the infrastructure used, such as the GAP8 and its SDK, how to flash, which examples exist and how to run them, etc. So now you should check out the repository documentation.

As we are mostly speaking about GAP8 here, we should also mention that there is of course documentation for it outside of Bitcraze. GAP8 is produced by GreenWaves Technologies, who provide references and a public SDK on GitHub – meaning you can actually look up the code for all the drivers, look at open issues or even contribute with pull requests.

Which examples exist? What are they there for? What did we update?

The hello world example is there to get you started with your own applications – it provides a minimal implementation of how to send something to the cfclient console from the GAP8 and is explained in detail in the next section of this post.

The camera test is, as the name says, there to test the camera – however, it sends the image over JTAG, so if you don’t have a debugger and/or don’t want to overwrite the bootloader, this is not an example for you.

The WiFi streaming example is the most classic one – it is used in the getting started guide and streams video over WiFi.

The face detection example uses filters to find faces in images – be a bit careful, though, as it is very sensitive to noisy backgrounds and, for example, blonde hair. Along with nice image processing, it now also implements streaming of the images in configurable resolution, a fun feature we recently added!

The classification demo is our AI demo which recognizes parcels. Here we recently fixed the CPX initialization, so it can again send results to the console in the cfclient!

The send character over UART example was neither updated nor tested with the newest docker (yet).

How do I write my own code on GAP8?


Now we’ll walk you through a minimal example of how to send Hello World from GAP8 to the cfclient console.

C Code

We start with the main file (which we called hello_world_gap8.c and which can be found here) by including some dependencies:

#include "pmsis.h"   // the drivers
#include "bsp/bsp.h" // configuration parameters (pad configurations for connecting to memory, camera, etc.)
#include "cpx.h"     // the CPX functions we use to send our hello world to the console

Then we have to write our main function:

int main(void)
{
  return pmsis_kickoff((void *)start_example);
}

We call pmsis_kickoff() to start the scheduler and an event kernel, giving it a pointer to the function we want to execute.
This function is what we write next (insert it above the main function, so that it is declared before main uses it).

void start_example(void)
{
  pi_bsp_init();
  cpxInit();

  while (1)
  {
      cpxPrintToConsole(LOG_TO_CRTP, "Hello World\n");
      pi_time_wait_us(1000*1000);
  }

}

First we need to initialize the pads according to our configuration with pi_bsp_init() (the configuration is chosen automatically by sourcing ai_deck.sh, which is done for you in the docker).
Then we need to initialize CPX (which, for example, sets up the SPI connection to the NINA WiFi module) so that we can send CPX packets. You can find more information in the CPX documentation.
Now we are ready for our while loop, in which we send “Hello World” to the console (the LOG_TO_CRTP target). To avoid keeping the buses overly busy, we only send it once per second, so we wait before we repeat.

Makefile

The Makefile is hierarchical – hidden files do most of the work, and we only need to include $(RULES_DIR)/pmsis_rules.mk as the last line of our Makefile.

We start by defining where the io should go – the possible values are host and uart (this is actually not used in this example, but if you were to add a printf, this would define where its output goes).
Then we define the operating system we want to use – we can choose between pulpos and freertos. We chose freertos as it is far more advanced (we pay for this with some overhead, but in most cases it is worth it).

io=uart
PMSIS_OS = freertos

In the next step we need to set the name of our application (this defines the file names of the build output), include the sources (meaning our main C file as well as the two C files CPX needs) and include the header file directory (header files in the same directory as the Makefile should be found automatically, but our CPX header files are in a library directory). Make sure all the relative paths are correct for your folder structure.

APP = hello_world_gap8
APP_SRCS += hello_world_gap8.c ../../../lib/cpx/src/com.c ../../../lib/cpx/src/cpx.c
APP_INC=../../../lib/cpx/inc

As a last step, we want to set some compiler flags. Firstly, we want the code to be optimized, so we add -O3. Then we add -g to embed debug information.
As we use timers for CPX, we also need to add two additional defines to ensure all the functions we need are included: configUSE_TIMERS=1 and INCLUDE_xTimerPendFunctionCall=1.

APP_CFLAGS += -O3 -g
APP_CFLAGS += -DconfigUSE_TIMERS=1 -DINCLUDE_xTimerPendFunctionCall=1

Compile and Flash


For this section we assume you are in the ai-deck-examples directory.

Now we want to compile using docker:

docker run --rm -v ${PWD}:/module --privileged bitcraze/aideck tools/build/make-example SRC_DIR clean all

Where SRC_DIR is the location of your code, here it is examples/other/hello_world_gap8.

Note: You don’t always need to “clean” if you don’t modify the Makefile. It should be fine to save some time by skipping this.

And finally we want to flash using the OTA updater:

cfloader flash [binary] deck-bcAI:gap8-fw -w CRAZYFLIE_URI

Where in this example, [binary] has to be replaced with examples/other/hello_world_gap8/BUILD/GAP8_V2/GCC_RISCV_FREERTOS/target.board.devices.flash.img, and the CRAZYFLIE_URI is something like radio://0/80/2M/E7E7E7E7E7.

Now you can connect to your drone with the cfclient and should see a CPX: GAP8: Hello World print every second.
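
If you’d rather watch the console from a script than from the cfclient, a minimal sketch with the Crazyflie Python library could look like this (the URI is a placeholder for your own):

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

cflib.crtp.init_drivers()

# Print every console character the Crazyflie forwards, including the
# CPX-routed "CPX: GAP8: Hello World" lines from our example.
cf = Crazyflie()
cf.console.receivedChar.add_callback(lambda text: print(text, end=''))

with SyncCrazyflie('radio://0/80/2M/E7E7E7E7E7', cf=cf) as scf:
    time.sleep(10)  # listen for ten seconds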

Note: The LED will not blink as in most other examples, as we did not implement a task which does this.

We hope this blog post helps you get started with your own awesome applications faster!

My name is Hanna, and I just started as an intern at Bitcraze. However, it is not my first time working with a drone or even the Crazyflie, so I’ll tell you a bit about how I ended up here.

The first time I used a drone, actually even a Crazyflie, was in a semester thesis at ETH Zurich in 2017, where my task was to extend a Crazyflie with a Parallel Ultra Low-Power (PULP) System-on-Chip (SoC) connected to a camera and external memory. This was the first prototype of the AI-deck you can buy here nowadays (as used here) :)

My next drone adventure was an internship at a company building tethered drones for firefighters – a much bigger system than the Crazyflie. I was in charge of the update system, so more on the firmware side this time. It was a very interesting experience, but I swore never to build a system with more than three microcontrollers in it again.

This, and a liking for tiny and restricted embedded systems, brought me back to smaller drones again. I did my master thesis back at ETH on developing a PULP-based nano-drone (nano-drones are just tiny drones that fit approximately in the palm of your hand and use only around 10 watts of power – the category the Crazyflie fits in) and some onboard intelligence for it. As a starting point, we used the Crazyflie, both for the hardware and the software. It turned out to be a very hard task to port the firmware to a processor which, at that time, had only a very basic operating system. Still, eventually I knew almost every last detail of the Crazyflie firmware, and it actually flew.

However, for this to happen, I needed some more time than the master thesis allowed – in the meantime, I started to pursue a PhD at ETH Zurich. I am working towards autonomous miniaturized drones – so besides the part with the tiny PULP-based drone I already told you about, I also work on the “autonomous” part. Contrary to many other labs, our focus is not only on novel algorithms; we also work with novel sensors and processors. Two very interesting recent developments for us are a multi-zone Time-of-Flight sensor and the novel GAP9 processor, which both fit on a Crazyflie in terms of power, size and weight. This enables new possibilities in obstacle avoidance, localization, mapping and much more. Last year my colleagues and I posted a blog post about our newest advances in obstacle avoidance (here, with videos!). More recently, we worked on onboard localization, using novel multi-zone Time-of-Flight sensors and the very new GAP9 processor to execute Monte Carlo localization onboard a Crazyflie (arxiv).

On the left you see an example of a multi-zone Time-of-Flight image (the background is a picture from the AI-deck), from here. On the right you see our prototype for localization in action – from our DATE23 paper (arxiv).

For me, localizing with a given map is a fascinating topic and one of the reasons I ended up in Sweden. It is one of the most basic skills of robots or even humans to navigate from A to B as fast as possible, and the basis of my favourite sport. The sport is called “orienteering” and is about running as fast as possible to some checkpoints on a map, usually through a forest. It is a very common sport in Sweden, which is the reason I started learning Swedish some years ago. So when the opportunity to go to Malmö for some months to join Bitcraze presented itself, I was happy to take it – not only because I like the company philosophy, but also because I just like to run around in Swedish forests :)

Now I am looking forward to my time here, I hope to learn lots about drones, firmware, new sensors, production, testing, company organization and to meet a lot of new nice people!

Greetings from Malmö – it can be a bit cold and rainy, but the sea and landscape are beautiful!

Hanna

This week’s guest blogpost is from Rik Bouwmeester from the Micro Air Vehicle lab, Faculty of Aerospace Engineering at the Delft University of Technology.

Tiny quadcopters like the Crazyflie can be operated in narrow, cluttered environments and in proximity to humans, making them the perfect candidate for search-and-rescue operations, monitoring of crops in a greenhouse, or performing inspections where other flying robots cannot reach. All these applications benefit from autonomy, allowing deployment without proximity to a base station or human operator and permitting swarming behavior.

Achieving autonomous navigation on nano quadcopters is challenging given the highly constrained payload and computational power of the platform. Most attention has been given to monocular solutions; the camera is a lightweight and energy-efficient passive sensor that captures rich information of the environment. One of the most important monocular visual cues is optical flow, which has been exploited on MAVs with higher payload for obstacle avoidance [1], depth estimation [2] and several bio-inspired methods for autonomous navigation [3–7].

Optical flow describes the apparent visual variations caused by relative motion between an observer and their surroundings. This rich visual cue contains tangled information about velocity and depth. However, calculating optical flow is expensive. The field of optical flow estimation has for several years been dominated by convolutional neural networks (CNNs). Despite efforts to find architectures of reduced size and latency [8-10], these methods are still highly computationally expensive, running at only several to tens of FPS on modern desktop GPUs and requiring millions of parameters, rendering them incompatible with edge hardware.

To this end, we present “NanoFlowNet: Real-Time Dense Optical Flow on a Nano Quadcopter”, submitted to an international robotics conference, which introduces NanoFlowNet, a CNN architecture designed for real-time, fully on-board, dense optical flow estimation on the AI-deck.

CNN architecture

We adopt the semantic segmentation CNN STDC-Seg [11] and modify it for optical flow estimation. The resulting CNN architecture may be considered “real-time” on desktop hardware, but for deployment on edge devices such as a nano quadcopter the network must be shrunk significantly. We improve the latency of the architecture in three ways.

First, we redesign the key convolutional module of the architecture, the Short-Term Dense Concatenate (STDC) module. By reordering the operations within the strided variant of the module, we save, depending on the location of the module within the architecture, from over 10% to over 50% of the MAC operations per module, while increasing the number of output filters with a large receptive field size. A large receptive field size is desirable for optical flow estimation.

Second, inspired by MobileNets [12], we globally replace ‘regular’ convolutions with depthwise separable convolutions. Depthwise separable convolutions factorize a convolution into a depthwise and a pointwise convolution, reducing the computational expense at a cost in representational capacity; for a k×k kernel and N output channels, this cuts the multiply-accumulate cost by roughly a factor of 1/N + 1/k² [12].

Third, we reduce the input dimensionality. We train and run the network on grayscale input images, reducing the required on-board memory for storing images by two thirds. Any memory saved in the AI-deck’s L2 memory can be handed to AutoTiler for storing the CNN architecture, speeding up the on-board execution. Since we need more of a speed-up, we also run the CNN on-board at a reduced input resolution of 160×112 pixels. Besides the speed-up through the saved L2, reducing the input resolution makes all operations throughout the network cheaper. We downscale the training data to closely match the target resolution. Both these changes come at a loss of input information: we miss out on small objects and small displacements that are not captured at this resolution.

To give some intuition of the available memory: Estimating optical flow requires two input images. Storing two color input images at full resolution requires (2 x 324x324x3=) 630 kB. The AI-deck has 512 kB of L2 memory available.
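
As a quick sanity check of those numbers, and of what the grayscale and resolution reductions buy us:

# Two full-resolution color frames vs. two reduced grayscale frames (in bytes)
full_color = 2 * 324 * 324 * 3    # 629,856 B, ~630 kB -> does not fit in 512 kB of L2
reduced_gray = 2 * 160 * 112 * 1  #  35,840 B, ~36 kB -> leaves most of L2 for the CNN
print(full_color, reduced_gray)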

Motion boundary detail guidance

Inspired by STDC-Seg, we guide the training of optical flow with a train-time-only auxiliary task to promote the encoding of spatial information in the early layers. Specifically, we introduce a motion boundary prediction task to the network. The motion boundary ground truth can be found in the optical flow datasets. This improves performance by 0.5 EPE on the MPI Sintel clean (train) benchmark, at zero cost to inference latency.

Performance on MPI Sintel

Given the scaling and conversion to grayscale of the input data, our network is not directly comparable with results reported by other works. For comparison, we retrain one of the fastest networks in the literature, FlowNet2-s [13], on the same data. Given the reduction in resolution, we drop the deepest two layers to maintain a reasonable feature size. We name this model FlowNet2-xs.

We benchmark the performance of the architecture on the optical flow dataset MPI Sintel. NanoFlowNet performs better than FlowNet2-xs, despite using less than 10% of the parameters. NanoFlowNet achieves 5.57 FPS on the AI-deck. FlowNet2-xs does not fit on the AI-deck due to the network size. To put the achieved latency of NanoFlowNet in perspective, we execute FlowNet2-xs’ first two convolutions and the final prediction layer on the GAP8. The three-layer architecture achieves 4.96 FPS, which is slower than running the entire NanoFlowNet. On a laptop GPU, the two architectures accomplish similar latency.

Method      | MPI Sintel clean (train) [EPE] | MPI Sintel final (train) [EPE] | Frame rate GPU [FPS] | Frame rate GAP8 [FPS] | Parameters
FlowNet2-xs | 9.054                          | 9.458                          | 150                  | –                     | 1,978,250
NanoFlowNet | 7.122                          | 7.979                          | 141                  | 5.57                  | 170,881
Performance on MPI Sintel (train subset). The (average) End Point Error (EPE) describes how far off the estimated flow vectors are on average; lower is better.

Obstacle avoidance implementation

We demonstrate the effectiveness of NanoFlowNet by implementing it in a simple, proof-of-concept obstacle avoidance application on an AI-deck equipped Crazyflie. We let the quadcopter fly forward at constant velocity and implement the horizontal balance strategy [14], [15], where the quadcopter balances the optical flow in the left and right half plane by yawing.

We equip a Crazyflie with the Flow deck for positioning only. The total flight platform weighs 34 grams.

We augment the balance strategy by implementing active oscillations (a cyclic up-down movement), resulting in additional optical flow generated across the field of view. This is particularly helpful for avoiding obstacles in the direction of horizontal travel, since no optical flow is generated at the focus of expansion.
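
As a rough sketch of what the balance strategy boils down to (not the authors’ actual implementation): assuming a helper get_flow_magnitudes() that returns the summed optical flow magnitude in the left and right half of the NanoFlowNet output, and a cflib version whose MotionCommander.start_linear_motion() accepts a rate_yaw argument, a control step could look like this:

from cflib.positioning.motion_commander import MotionCommander

GAIN = 0.5  # illustrative yaw-rate gain (deg/s per unit of flow difference)

def get_flow_magnitudes():
    # Placeholder: in the real system these values come from NanoFlowNet on the GAP8
    return 1.0, 1.2

def balance_step(mc: MotionCommander):
    flow_left, flow_right = get_flow_magnitudes()
    # More flow on the right means the scene is closer on the right -> yaw left (positive rate)
    yaw_rate = GAIN * (flow_right - flow_left)
    mc.start_linear_motion(0.5, 0.0, 0.0, rate_yaw=yaw_rate)  # keep flying forward while yawing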

The obstacle avoidance implementation is demonstrated in an open and a cluttered environment in ‘the Cyber Zoo’, an indoor flight arena at the faculty of Aerospace Engineering at the Delft University of Technology. The control algorithm is most robust in the open environment, with the quadcopter managing to drain a full battery without crashing. In the cluttered environment, performance is more variable. Especially on occasions where obstacles are close to one another, the quadcopter tends to avoid the first obstacle successfully, only to turn straight into the second and crash into it. Adding head-on collision detection based on FOE detection and divergence estimation (e.g., [7]) should help avoid obstacles in these cases.

Successful run in a cluttered environment in the Cyber Zoo. The Crazyflie manages to avoid collision until the battery is drained.

All in all, we consider the result a successful demonstration of the optical flow CNN. In future work, we expect to see applications that take more advantage of the resolution of the flow information.

Citation

Bouwmeester, R. J., Paredes-Vallés, F., De Croon, G. C. H. E. (2022). NanoFlowNet: Real-time Dense Optical Flow on a Nano Quadcopter. arXiv. https://doi.org/10.48550/arXiv.2209.06918

References

[1] Gao, P., Zhang, D., Fang, Q., & Jin, S. (2017). Obstacle avoidance for micro quadrotor based on optical flow. Proceedings of the 29th Chinese Control and Decision Conference, CCDC 2017, 4033–4037. https://doi.org/10.1109/CCDC.2017.7979206

[2] Sanket, N. J., Singh, C. D., Ganguly, K., Fermuller, C., & Aloimonos, Y. (2018). GapFlyt: Active vision based minimalist structure-less gap detection for quadrotor flight. IEEE Robotics and Automation Letters, 3(4), 2799–2806. https://doi.org/10.1109/LRA.2018.2843445

[3] Conroy, J., Gremillion, G., Ranganathan, B., & Humbert, J. S. (2009). Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Autonomous Robots, 27(3), 189–198. https://doi.org/10.1007/s10514-009-9140-0

[4] Zingg, S., Scaramuzza, D., Weiss, S., & Siegwart, R. (2010). MAV navigation through indoor corridors using optical flow. Proceedings – IEEE International Conference on Robotics and Automation, 3361–3368. https://doi.org/10.1109/ROBOT.2010.5509777

[5] De Croon, G. C. H. E. (2016). Monocular distance estimation with optical flow maneuvers and efference copies: A stability-based strategy. Bioinspiration and Biomimetics, 11(1). https://doi.org/10.1088/1748-3190/11/1/016004

[6] Serres, J. R., & Ruffier, F. (2017). Optic flow-based collision-free strategies: From insects to robots. Arthropod Structure and Development, 46(5), 703–717. https://doi.org/10.1016/j.asd.2017.06.003

[7] De Croon, G. C. H. E., De Wagter, C., & Seidl, T. (2021). Enhancing optical-flow-based control by learning visual appearance cues for flying robots. Nature Machine Intelligence, 3(1), 33–41. https://doi.org/10.1038/s42256-020-00279-7

[8] Ranjan, A., & Black, M. J. (2017). Optical flow estimation using a spatial pyramid network. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 2720–2729. https://doi.org/10.1109/CVPR.2017.291

[9] Hui, T. W., Tang, X., & Loy, C. C. (2018). LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8981–8989. https://doi.org/10.1109/CVPR.2018.00936

[10] Sun, D., Yang, X., Liu, M. Y., & Kautz, J. (2017). PWC-Net: CNNs for Optical Flow Using Pyramid, Warping, and Cost Volume. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8934–8943. https://doi.org/10.1109/CVPR.2018.00931

[11] Fan, M., Lai, S., Huang, J., Wei, X., Chai, Z., Luo, J., & Wei, X. (2021). Rethinking BiSeNet For Real-time Semantic Segmentation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 9711–9720. https://doi.org/10.1109/CVPR46437.2021.00959

[12] Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., & Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. In arXiv. arXiv. http://arxiv.org/abs/1704.04861

[13] Ilg, E., Mayer, N., Saikia, T., Keuper, M., Dosovitskiy, A., & Brox, T. (2017). FlowNet 2.0: Evolution of optical flow estimation with deep networks. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 1647–1655. https://doi.org/10.1109/CVPR.2017.179

[14] Souhila, K., & Karim, A. (2007). Optical flow based robot obstacle avoidance. International Journal of Advanced Robotic Systems, 4(1), 2. https://doi.org/10.5772/5715

[15] Cho, G., Kim, J., & Oh, H. (2019). Vision-based obstacle avoidance strategies for MAVs using optical flows in 3-D textured environments. Sensors, 19(11), 2523. https://doi.org/10.3390/s19112523

Last week we went on a nice trip to Delft, the Netherlands, to attend the International Micro Air Vehicle Conference and Competition (IMAV 2022), this time organized by the MAVlab of TU Delft. Me (Kim), Barbara and Kristoffer went there by train in line with our CO2 policy, although the Dutch train strikes did make it a bit difficult for us! Luckily we all made it in one piece and we had a great time, so we will tell you about our experiences… with a lot of videos!

First Conference day

For the conference days we were placed in the main aula building, so that everybody could drop by during the coffee breaks, right next to one of our collaborators, Matěj Karásek from Flapper Drones (also see this blog post)! In the big lecture hall, paper talks were going on, along with interesting keynote speeches by Yiannis Aloimonos from the University of Maryland and Antonio Franchi from the University of Twente.

In between the talks and coffee breaks, we took the chance to hack around with tiny demos – the IMAV competition is quite a good occasion for that. Here you see a video of 4 Crazyflies flying around a Flapper drone, with all platforms using the lighthouse positioning system.

The Nanoquadcopter challenge

On the evening of the first day, the first competition took place: the nanoquadcopter challenge! The goal was to autonomously fly a Crazyflie with an AI-deck and Flow deck as far as possible through an obstacle field. 8 teams participated, and although most did off-board processing of the AI-deck’s camera stream, the PULP team (first place) and Equipe Skyrats (third place) did all the processing onboard. The most exciting run was by the brave CVAR-UPM team, which managed to pass through 4 gates while avoiding obstacles, for which they won a Special Achievement Award.

During the challenge, Barbara also gave a presentation about the Crazyflie, while Kristoffer built up the lighthouse positioning system in the background in a record-breaking 5 minutes to show a little demo. After the challenge, there were bites and drinks where we could talk with all the participating teams.

Here is an overview video of the competition. There was also an excellent stream during the event; if you would like to see all the runs in detail, plus presentations by the teams, you’ll have a full 3 hours of content, complemented by exciting commentary from Christophe de Wagter and Guido de Croon of the MAVlab. Thanks to all the teams for participating and giving such a nice show :)

The overview video
The full stream of the nanoquadcopter challenge

The Green House Challenge

On Wednesday, we were brought to Tomatoworld, a special greenhouse for technology development in horticulture. This is where the Greenhouse challenge, the second indoor drone competition, took place. The teams had to use a drone of their choice to navigate through rows of tomato plants and find the sick variant. Unfortunately we could not be as up close and personal as with the nanoquadcopter challenge, but yet again there was a great stream available, so we were able to follow every step of the way, along with some great presentations by Flapper Drones and PATS! drones, among others. By the latter we were actually challenged to an autonomous drone fight! Their PATS-x system is made to detect pest insects that are harmful to greenhouse crops, so they wanted to see if it could catch a Crazyflie. You can see in the video here that they managed to do just that, and although the Crazyflie survived, we are pretty sure that a real fly or moth wouldn’t have. Luckily it was a friendly match, so we all had fun!

PATS versus Crazyflie battle

Here is an overview of the Greenhouse challenge. At the end you can also see a special demo by the PULP team, successfully trying out their obstacle-avoiding Crazyflie in between the tomato plants. Very impressive!

Last days and final notes

Due to the planned (but later cancelled) train strikes in the Netherlands, not all of us were unfortunately able to attend the full event. In the end, Barbara and I were able to experience the outdoor challenge, where much bigger drones had to carry packages into a large field outdoors. I myself was able to catch the first part of the last conference day, which included a keynote by Richard Bomphrey (Royal Veterinary College), whose lab contributed to the mosquito-inspired Crazyflie flight paper published in Science two years ago.

We were happy to be at IMAV this year, which marks our first conference attendance as Bitcraze since the pandemic. It was quite amazing to see the teams trying to overcome the challenges of these competitions, especially the nanoquadcopter challenge. We would like to thank Guido de Croon and Christophe de Wagter of the MAVlab again for inviting us!

IMAV website: https://2022.imavs.org/

Crazyflie IMAV papers:

  • ‘Handling Pitch Variations for Visual Perception in MAVs: Synthetic Augmentation and State Fusion’ Cereda et al. (2022) [pdf]
  • ‘Seeing with sound; surface detection and avoidance by sensing self-generated noise‘, Wilshin et al. (2022)

Before the summer vacations, I had the opportunity to spend some time working on AI deck improvements (blog post). One of the goals I set was to get CRTP over WiFi working, and to fix issues along the way. The idea was to put together a small example where you could fly the Crazyflie using the keyboard and see the streamed image along the way. This requires both CRTP to the Crazyflie (logging and commands) as well as CPX to the GAP8 for the images. Just before heading off on vacation I managed to get the demo working, so this post is about the results and some of the things that changed.

Link drivers

When using the Crazyflie Python library you connect to a Crazyflie using a URI. The first part of the URI (i.e. radio or usb) selects which link driver to use for the connection. For example, radio://0/80/2M/E7E7E7E7E7 selects the radio link driver with USB dongle 0, channel 80, communication at 2 Mbit and the address E7E7E7E7E7.

While working on this demo, two major things changed in the link drivers. The first was the implementation of the serial link (serial://), which now uses CPX for CRTP to the Crazyflie. The use case for this link driver is connecting the Crazyflie to a Raspberry Pi via a serial port on a larger platform.

The second change was a new link driver for connecting to the Crazyflie via TCP. Using this link driver it’s possible to connect to the Crazyflie over the network. It’s also possible to get hold of the underlying protocol, the CPX object, for using CPX directly. This is used, for example, for communicating with the GAP8 to get images.

In the new TCP link driver, the URI starts with tcp:// followed by either an IP address or a host name, and then the port. Here are two examples (a connection sketch follows after the list):

  • tcp://aideck-AABBCCDD.local:5000
  • tcp://196.168.0.100:5000
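
As a small usage sketch (the host name is a placeholder), connecting over the new TCP link driver from the library looks just like any other connection:

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

cflib.crtp.init_drivers()

# Only the URI changes compared to a radio:// connection
with SyncCrazyflie('tcp://aideck-AABBCCDD.local:5000', cf=Crazyflie()) as scf:
    print('Connected over WiFi to', scf.cf.link_uri)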

Comparison with the Crazyradio PA

So can WiFi be used instead of the Crazyradio PA now? Well, it depends. WiFi will give you higher throughput, but you trade this for latency. In our tests the latency is both larger and very random. In the demo I fly with the Flow v2 deck, which means latency isn’t that much of an issue. But if you were to fly without positioning and just use a joystick, this would not work out.

The Demo!

Below is a video of some flying at our office; to try it out yourself, have a look at the example code here. Although the demo was mostly intended for improving CPX, we’ve made use of it at the office to collect training data for the AI deck.

The Crazyflie with the AI-deck during WiFi-controlled flight.

Improvements

Unfortunately I was a bit short on time and the changes for mDNS discovery never made it in. Because of this there’s no way to “scan” for or discover AI-decks, so to connect you will need to know the IP address or the host name. For now you can retrieve it by connecting to your AI-deck equipped Crazyflie with the cfclient and looking at the console tab.

Apart from that, there are more improvements to be made, with a better structure for using CPX in the library (more like the CRTP stack, with functions) and more examples. There are also still a few bugs to iron out, for example the FPS and WiFi throughput issues.

IMAV 2022

Next week, from the 13th to the 16th of September, Barbara, Kristoffer and Kimberly will be present at the International Micro Air Vehicle Conference and Competition (IMAV), hosted by the MAVlab of TU Delft in the Netherlands. One of the competitions is called the nanoquadcopter challenge, where teams will program a Crazyflie + AI deck combo to navigate through an obstacle field, so we are excited to see what solutions will come out of that. If any of you happens to be at the conference/competition, drop by our table to say hello!

Last week we had the first ever Bitcraze DEV meeting! With about 10 participants, we covered a range of topics. The meeting was mostly focused on how to handle support and what the DEV meetings should be about. We also had a chance to get some feedback, and one of the points was to share a bit more of what we’re currently working on and what we might work on in the future. So in light of that, this blogpost is about the CPX (Crazyflie Packet eXchange) protocol. We’ve mentioned CPX before (1, 2), but with this blogpost I want to share the current status and some thoughts on why we need something new.

As summer is approaching and things are winding down, I’m taking the opportunity to get back to the AI deck and CPX. The AI deck was officially released out of early access last month, but there’s still more work to be done with porting examples, adding some more functionality and increasing stability and performance.

For the AI deck we’re only supplying examples; there’s no fixed functionality that comes with the platform (except maybe the WiFi connection). This is in contrast to, for instance, the Flow deck, where there’s a specified functionality the user can use and that should work. So in order to move forward I came up with a little demo that I want to get working during the summer. The goal is to make an application where I can fly the Crazyflie around with the keyboard and get a video stream back. To achieve this I’m using the Flow deck together with the AI deck, and WiFi for both CPX and CRTP (to send commands and to get images and logging).

Why we need something new

I’ve written a post about CPX in the past (link) where I detailed the issues we are trying to solve. But in short, what we needed was a protocol that …

  • … could be routed through intermediaries to reach its destination
  • … could handle high transfer rates with large amounts of data as well as small messages
  • … could handle different memory budgets
  • … doesn’t drop data along the way if some parts of the system are loaded

As the Crazyflie ecosystem grows and becomes more complex, we need new tools to work with it. When CRTP was implemented many years ago, the complexity we have today wasn’t something we could imagine. The Crazyflie had the only MCU, and the hardware on the decks was used directly from it. Now we have multiple decks with more complex systems on them: the AI deck (2 MCUs), the Active marker deck (1 MCU) and the Lighthouse deck (1 FPGA). Looking forward, the number of such complex decks might well increase. And with more and more functionality in the Crazyflie and resources occupied, like DMA channels and pins, some functionality might need to move further out onto the decks.

For each deck a new protocol is implemented and specific code is needed in the Crazyflie to handle it. Some things also become complex, like getting printouts from the different MCUs on the decks. So for the AI deck we wanted to test something new and more generic, to see if it would be something we could use more in the future to talk directly to the different MCUs in the system.

Will CPX replace CRTP? Probably not. We’re not sure what solution we will land on, but I think CPX is a good step in the right direction.

Current status

Back to my little demo. To reach the goal there’s a few things which needs to be fixed:

  • crazyflie-firmware/#1065: When starting to run CRTP over CPX (via WiFi) I noticed that the UART2 driver was too slow, loading the system too heavily and creating problems down the line. So this is being worked on, and at the same time the old syslink-over-UART2 implementation is being moved to CPX instead.
  • aideck-esp-firmware/#12: We’ve had reports of intermittent performance issues for WiFi, which also affect this work.

Aside from the issues there’s also a few other features that are being added:

  • CRTP over CPX: Since I already have a connection for the images, I also want to use it for controlling the Crazyflie. The latency is too high for controlling roll/pitch/yaw in real time, but in my case I have the Flow deck for position control.
  • CPX over CRTP: Although not part of the demo, this is interesting to look at for the future. One example: right now the Crazyflie firmware has a special implementation for the WiFi credentials. If we want to set them from the ground, we first have to send them via CRTP to the Crazyflie, re-package them and then send them via CPX to the ESP32 on the AI deck. Instead I would like to send them via CPX directly from the ground, saving us extra work and complexity in the Crazyflie.
  • Using Zeroconf/mDNS for finding AI decks: With these changes it will be possible to connect to the Crazyflie via the client, so we need a way to find the AI decks. For this, Zeroconf/mDNS has been added, so AI decks will be automatically discovered on the local network.

The current status can be seen in the following draft PRs: crazyflie-firmware/#1068 and crazyflie-lib-python/#342. Note that until these are real PRs (not drafts) they are not useful, so don’t try to use them yet.

CPX documentation

For more information on CPX and how it’s implemented, check out the documentation on our website as well as the specific documentation on using it from the GAP8.

We worked hard last week to get a fresh new release out before the summer months are on our doorstep. Not only do we want to make sure that important bugs are fixed before some of us go on holiday, we also want to show off our new AI deck features! Here is an overview of what has changed.

AI deck over-the-air flashing

As you can probably see in the release notes of both the Python libraries and the firmware, most of our changes are focused on making it possible to develop for the AI deck without using a programmer all the time. If the STM and nRF firmware of the Crazyflie are fully updated, as well as the ESP firmware on the AI deck, it should now be possible to flash an AI deck example binary with a Crazyradio! For older versions of the AI deck 1.X (Rev A to C) it is unfortunately still necessary to use the JTAG programmer one last time to flash a bootloader on the GAP8, but after that it should not be needed anymore.

Please check out the newly updated AI deck tutorial for setting up the AI deck with this new functionality.

Crazyflie Packet eXchange (CPX)

In the light of the work we have done for the AI deck, we have also started to implement a new inter-MCU protocol called the Crazyflie Packet eXchange. Since the AI deck adds 2 additional microprocessors to the Crazyflie architecture, it was crucial to handle the communication between all platforms and communication channels properly. Currently the functionality is mostly there to handle WiFi streaming and console printouts for the AI deck, but it is meant to be a generic protocol which, in the future, should be able to handle more combinations – like, for instance, command messages through WiFi.

You can read about CPX in the crazyflie-firmware repository doc and we will be writing a more detailed blogpost about this later.

Controller Python bindings

For the last part of the Grand Tour trip, we had a hackathon with the IMRC lab of TU Berlin and our close collaborator Wolfgang Hönig, in which we managed to convert the PID controller, the Mellinger controller and the motor mixing into Python bindings, which can be used in the experimental simulator of the Crazyflie.

There is no PyPI release of these; you will need to pull the latest crazyflie-firmware repo and build the bindings with ‘make bindings_python’.

Additional fixes

We also have some additional fixes to both the Python libraries and the firmware. For the STM we have updated the STD peripheral library and solved several build issues. For the cfclient, we fixed a lot of issues that were caused either by the latest version of Python, which is stricter with type definitions, or by QT. Moreover, the LED-ring headlight functionality has been restored, and the cfbridge.py script, used for the PX4 Crazyflie 2.1 tutorial, has been re-added, since it suddenly disappeared a few releases ago.

Update and Feedback

Make sure to update your cfclient with ‘pip install cfclient --upgrade’ and to reflash the new stable firmware. For AI deck users, try out our new tutorial covering CPX, the over-the-air flashing and the WiFi example. The new AI deck functionality has only been subjected to limited testing, so if anything is wrong or unclear, please let us know on the forum! The feedback will help the AI deck become a more stable product for development, so we would be very grateful if you could help out with that.

I know a lot of you will be too distracted by chocolate to read this post, so I will make it short.

I am, too, a little distracted by sugar

As I mentioned earlier, we’re a little under-staffed right now. Jonas left us for new adventures, and Arnaud is enjoying some time with his baby (here in Sweden, parental leave is thankfully long for dads too). On top of that, Kimberly was away the last two weeks visiting various labs in Europe. She will tell you about it once she’s back, I’m sure. But with just 4 people at the office, time is a valuable resource. So what are we doing with it?

Well, a lot of it has been dedicated to the AI deck, but that’s not the only thing we’ve been working on. Recently, we had the visit of an expert on dangerous goods shipments. During 2 days, we got to learn how to properly send the batteries we have, which regulations are involved and what we have to implement to ship them. It may sound boring… and honestly, it was not the most interesting. But we got a certification out of it that now allows us to ship as many batteries as we want with your order! The two-batteries-only restriction in the shop should be lifted – but please be aware that if you exceed 2 batteries per Crazyflie, the shipping cost will be higher, because of the fee FedEx imposes on dangerous goods shipments.

And speaking of FedEx, there are some problems on their air routes right now. Avoiding Ukraine and dealing with strikes among air traffic operators in Europe has not been easy on their infrastructure, and we have unfortunately experienced some delays in deliveries. Things seem to be gradually going back to normal, so let’s hope their usual speediness resumes soon.

We’re also working on the Mini BAM, which is on the 18th of May and will be about drones for aerial shows. Our special guest speakers are from CollMot and Flapper Drones – make sure to answer this survey if you want to participate! You will get more information soon.

And if you want to play around with the AI deck, you will have an interesting occasion in September. IMAV has launched a competition where the goal is to have a Crazyflie equipped with the AI deck perform vision-based obstacle avoidance at increasing speeds. The deadline for registering is mid-May; you can find more information here.

We are now enjoying a long Easter weekend, recharging our batteries with our families (and chocolate!), hoping that the Swedish spring finally settles in. I hope you’re enjoying it too!

A lot has happened at Bitcraze over the last months, which left us quite short-staffed. Thankfully, Victor has joined us again for a while. He mainly works on finishing his thesis with us, and we all agree that having an extra person at the office feels nice – especially considering the exciting stuff he’s working on! But let’s hear it from him first:

“Hi! I’m Victor, 26 years old, and studying towards a bachelor’s degree in Computer Science and Computer Engineering at LTH. I worked at Bitcraze during the summers of 2019 and 2020 and I’m now doing my bachelor’s thesis here.
During this thesis I will make a prototype deck that combines multiple solid-state ToF lidars (more specifically, the new VL53L5CX). While the Multi-ranger deck exists today, this new sensor outputs a matrix of distances, which opens up possibilities that the Multi-ranger cannot offer. Onboard the deck there will also be an ESP32-S3, which will collect the data from the sensors and then send it to the PC, either through the Crazyflie or through WiFi. This is all super exciting stuff with endless potential, so let’s see how far I will get!”

Meet Victor!

I’m sure you will hear more about his progress in the coming months, so make sure to stay tuned!

Stock issues

We’ve been dealing with the component shortage as well as we can, but production is still unpredictable. Sadly, that means the impact on our stock is too. The AI deck, the Bolt and the battery chargers are unfortunately out of stock right now. We had to slightly change the Swarm bundles to adjust to the lack of chargers. We’re also low on Multi-rangers, which are expected to run out of stock next week.

All those products are expected back by mid-May, if luck is on our side. It depends on our manufacturer in China, where there is sadly a new Corona outbreak, so it’s hard to say for sure whether this estimate is accurate. We hope that production and delivery remain unaffected. Just know that we are working on getting everything back in stock as soon as possible. If you want to stay updated on the status of one of our out-of-stock products, you can choose to be informed by email in our webshop. Just go to the product’s page and enter your email there: you’ll be the first to know when it’s back in stock!

During the last couple of months we’ve been working on getting the AI-deck out of early access. One of the things needed for this to happen is improved infrastructure for the AI-deck, like bootloading and how the deck fits into the rest of the ecosystem. For this there are two new repositories:

CPX (Crazyflie Packet eXchange)

With the addition of two new MCUs (ESP32 and GAP8), as well as the possibility to connect via WiFi, we quickly ran into the issue of how to communicate between the targets. Even more so since there’s no direct access between some of these (like Crazyflie<->WiFi, Crazyflie<->GAP8, GAP8<->WiFi).

What we needed was a protocol that …

  • … could be routed through intermediaries to reach its destination
  • … could handle high transfer rates with large amounts of data as well as small messages
  • … could handle different memory budgets
  • … doesn’t drop data along the way if some parts of the system are loaded

We decided to design and implement a new protocol, which we’ve named CPX (Crazyflie Packet eXchange). The protocol solves the issues above by:

  • Each packet has a source and a destination ID, so it can be routed to (and from) the target of the packet – see the sketch after this list
  • Each link between targets can have its own MTU, which allows each target to optimize memory usage. To handle this, intermediaries are allowed to split packets along the way, so data can be transferred in smaller pieces.
  • Instead of packets being dropped when targets become overloaded, congestion is created in the system, and the sender will not be able to send more data until the receiver has been able to handle it.
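
To make the routing idea concrete, here is a small Python illustration of what such a routed packet could look like – note that the field layout and the target IDs are invented for this example and do not match the actual CPX wire format (see the CPX documentation for that):

import struct

# Invented target IDs, for illustration only
CF_TARGET, ESP32_TARGET, GAP8_TARGET = 1, 2, 4

def make_packet(source: int, destination: int, payload: bytes) -> bytes:
    # One routing byte (source in the high nibble, destination in the low one)
    # followed by a little-endian payload length, then the payload itself.
    route = (source << 4) | destination
    return struct.pack('<BH', route, len(payload)) + payload

# A console message from the GAP8, addressed to the Crazyflie: any intermediary
# (here the ESP32) only needs the routing byte to know where to forward it.
pkt = make_packet(GAP8_TARGET, CF_TARGET, b'Hello World')
print(pkt.hex())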

Currently the new protocol is used for the GAP8 bootloader, for setting up the WiFi on the ESP32 and in the WiFi streamer example. But we’re hoping to expand it in the future to include more functionality, like logging and other plumbing that could be used in user applications.

WiFi configuration changes

With CPX it’s now possible to set up the WiFi from either the Crazyflie, the GAP8 or the ESP32 itself. To do this from the Crazyflie we’ve added the option of configuring it using KConfig, with the following options in the expansion decks menu for the AI-deck:

  • Do not set up the WiFi: should be used if another target is setting up the WiFi, like the GAP8
  • Act as an access point: this will make it possible to connect to the AI-deck as an access point
  • Connect to an access point: this will connect the AI-deck to an access point, using the SSID/PASSWD entered in the menuconfig

GAP8 bootloader

To make things easier for the user, we want to remove the requirement of using a JTAG dongle to program the GAP8. To achieve this we’ve implemented a bootloader for the GAP8 which uses CPX, which means it can be used either from the Crazyflie or over WiFi. We still haven’t had time to implement the Crazyflie part, where this will fit nicely together with the cload and client deck firmware upgrade, but it’s currently working via WiFi. So until the implementation is done in the Crazyflie Python library, this script can be used to bootload and start your custom GAP8 firmware. Note, though, that you will first have to flash the GAP8 bootloader and set up the WiFi.

What’s next?

We’re continuing to work towards getting the AI-deck out of early access. For CPX and the GAP8 bootloader there are still a few bugs to iron out and examples to be updated, as well as improved support for building using our toolbelt.