Category: Crazyflie

Welcome to “The Beginner’s Guide to Drones” for programmers curious about exploring the world of unmanned aerial vehicles (UAVs). If you’re a coder from another field, this guide will walk you through the basics of drones, their components, and how to start programming them. Let’s dive in and see how your coding skills can take flight!

If you come from an engineering field, you might already know the basics of some of these topics; even so, the overview may still be useful, and you can use the resources to get more specific knowledge.

The Robotics part

First and foremost, you’ll need some basic robotics skills.
We start off with the most basic question: “What is a Robot?”. A robot uses sensors to create an internal model of its environment, and actuators to act on/in its environment. The specifics of the internal model depend on the robot’s purpose, but a crucial component is understanding its location and orientation within that environment.

Linear algebra basics

To understand how a quadcopter perceives its environment and its own position, you’ll need some basic skills in linear algebra, particularly in matrices, vectors, and frame rotation. These skills are essential for comprehending the mathematical principles behind quadcopter navigation.

To build an internal model of its own movement and orientation, an Inertial Measurement Unit (IMU) is used. An IMU consists of a gyroscope, an accelerometer, and sometimes a magnetometer. These sensors, when combined using sensor fusion techniques (see “Control theory” below), help determine the quadcopter’s angular velocity and linear acceleration. This data allows the drone to calculate its orientation and movement.
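As a small taste of the linear algebra involved, here is a sketch in Python with NumPy that builds a rotation matrix from roll, pitch, and yaw angles and uses it to express a body-frame vector in the world frame. A ZYX Euler convention is assumed here purely for illustration; the Crazyflie firmware has its own conventions.

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    # World-from-body rotation for ZYX (yaw-pitch-roll) Euler angles, in radians
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# An acceleration measured in the drone's body frame, expressed in the world frame:
accel_body = np.array([0.0, 0.0, 9.81])
accel_world = rotation_matrix(roll=0.1, pitch=0.05, yaw=0.0) @ accel_body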

Picture of a Crazyflie in relation to a reference frame from Bitcraze.io

For a detailed understanding of these processes, please refer to the following resources:
Visual description of frame transformations:
https://www.youtube.com/watch?v=kYB8IZa5AuE
Basic coordinate transformations:
https://motion.cs.illinois.edu/RoboticSystems/CoordinateTransformations.html

Positioning techniques

The quadcopter can now determine its relationship to its starting position and the gravitational field. However, relying solely on an IMU tends to cause drift over time. Imagine trying to stand on one leg with your eyes closed—eventually, you’ll lose balance.

For improved stability, a drone often needs additional sensors, such as a camera, to help stabilize its position. Other sensor systems can also be used to determine relative or absolute position. While an IMU can sense changes in position relative to a starting point, an external positioning system is necessary for stability and obtaining absolute positions. This system acts as a reference frame for the drone.

Drones flying outdoors typically use GPS combined with RTCM corrections, since it is available almost anywhere, easy to use, and has centimeter-level accuracy.

For indoor use, as with Crazyflies, the most commonly used positioning system is a motion-capture system, but there are others as well. This area is at the cutting edge of research, with new technologies emerging constantly. Still, many effective systems are already available, though they may have constraints regarding power efficiency, flight area size, update speed, or precision.

An image of some common positioning techniques: a) AOA, b) TOA/TWR, c) TDOA (taken from ResearchGate).

Control theory

Now that the drone can understand its position and orientation in space, the next step is figuring out how to move within this space. Moving from point A to point B involves setting a “setpoint” and then determining how to use the drone’s actuators to reach this setpoint most efficiently. This is where control theory comes into play.

Drones generally use some sort of feedback control system, which in its most basic form looks something like this:

A very basic overview of a system with feedback control

In this system, the error (the difference between the current position and the setpoint) is fed back into the system to ensure the drone moves in a way that minimizes the error over time.

Various algorithms can calculate the best actuator output based on the error and current state. One of the most fundamental algorithms is the PID controller, which works well for linear systems. Understanding PID controllers requires some basic calculus, but the concept is straightforward; the resources at the end of this section give simple explanations.
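To make the idea concrete, here is a minimal discrete PID loop in Python. It is only an illustration with made-up gains, not the Crazyflie firmware implementation; a real controller would also need integrator anti-windup and derivative filtering.

class PID:
    # u = Kp*e + Ki*integral(e) + Kd*de/dt, evaluated at a fixed time step dt
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: an altitude loop computing a thrust correction from a height error
altitude_pid = PID(kp=2.0, ki=0.5, kd=0.8)
thrust_correction = altitude_pid.update(setpoint=1.0, measurement=0.8, dt=0.01)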

For IMUs, there is a particularly useful filter to know about, given its widespread use. The accelerometer and gyroscope each have their own profiles of noise and drift. The accelerometer is sensitive to short-term noise, while the gyroscope drifts slowly over time. To mitigate these issues, a combination of both measurements is often used. The complementary filter is ideal for this situation, leveraging the strengths of both sensors to correct the measurements effectively.

More information on how to use the complementary filter for IMUs can be found here:
https://www.hackster.io/hibit/complementary-filter-and-relative-orientation-with-mpu9250-d4f79d
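In code, the complementary filter is just a weighted blend of the two estimates. Here is a minimal sketch with made-up sensor readings; a real IMU loop would run this per axis at a few hundred hertz.

import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    # Trust the integrated gyro rate for fast changes, the accelerometer angle for long-term correction
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# One update step for the roll angle
ay, az = 0.5, 9.7        # accelerometer readings (m/s^2)
gyro_roll_rate = 0.02    # gyroscope reading (rad/s)
roll = 0.0               # previous roll estimate (rad)

accel_roll = math.atan2(ay, az)
roll = complementary_filter(roll, gyro_roll_rate, accel_roll, dt=0.002)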

For more complex scenarios, more advanced techniques can be used, such as Kalman filters for state estimation and more sophisticated controllers. It’s also possible to combine several of them to achieve better performance.

Basic overview of feedback control systems:
https://control.com/textbook/closed-loop-control/basic-feedback-control-principles/

The flying part

Now let’s get into the exciting part. FLYING!!!

Actuators

Drone actuators, primarily consisting of motors and propellers, are critical for controlling a drone’s movement and stability. The motors and propellers are typically called the “drive train” or “power train”.
The motors used on drones are usually brushed or brushless DC motors. Propellers are attached to the motors and generate lift by pushing air downwards. The size, shape, and pitch of the propellers affect the drone’s performance, including speed, lift, and maneuverability. Together, the precise control of motors and propellers enables a drone to perform complex maneuvers, maintain stability, and achieve efficient flight.

Pictures of forces acting on a drone in flight


This guide gives you almost everything you need for an overview of drone flight dynamics:
https://dronstechnology.com/the-physics-of-drone-flight-lift-thrust-drag-and-weight/

Stock information

As of now the Crazyflie 2.1 is out of stock, unfortunately. They’re expected back in stock around August 20th – 4 weeks from now. You can sign up in our shop to be notified as soon as they arrive!
https://store.bitcraze.io/collections/kits/products/crazyflie-2-1?variant=19575412719703

Today we welcome Sam Schoedel and Khai Nguyen from Carnegie Mellon University. Enjoy!

We’re excited to share the research we’ve been doing on model-predictive control (MPC) for tiny robots! Our goal was to find a way to compress an MPC solver to a size that would fit on common microcontrollers like the Crazyflie’s STM32F405 while being fast enough to control the higher frequency dynamics of smaller robots. We came up with a few tricks to make that happen and dubbed the resulting solver TinyMPC. When it came time for hardware experiments, using the Crazyflie just made sense. A tiny solver deserves a tiny robot.

Motivation

Model predictive control is a powerful tool for controlling complex systems, but it is computationally expensive and thus often limited to use cases where the robot can either carry enough computational power or when offboard computing is available. The problem becomes challenging to solve for small robots, especially when we want to perform all of the computation onboard. Smaller robots have inherently faster dynamics which require higher frequency controllers to stabilize, and because of their size they don’t have the capacity to haul around as much computational power as their larger robot counterparts. The computers they can carry are often highly memory-constrained as well. Our question was “how can we shrink the computational complexity and memory costs of MPC down to the scale of tiny robots?”

What We Did

We settled on developing a convex model predictive control solver based on the alternating direction method of multipliers. Convex MPC solvers are limited to reasoning about linear dynamics (on top of any other convex constraints), but have structure that TinyMPC exploits to solve problems efficiently. The tricks we used to achieve this efficiency are described in the paper, but it boils down to rewriting the problem as a constrained linear-quadratic regulator to reduce the memory footprint and then precomputing as many matrices as possible offline so that online calculations are less expensive. These tricks allowed us to fit long-time horizon MPC problems on the Crazyflie and solve them fast enough for real-time use.
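As a simplified illustration of that offline/online split (this is not the TinyMPC solver itself, just the underlying LQR idea applied to a toy double-integrator model), the Riccati recursion and feedback gain can be computed once offline so that the online loop is reduced to cheap matrix-vector products:

import numpy as np

def lqr_precompute(A, B, Q, R, iters=500):
    # Iterate the discrete Riccati recursion to a fixed point and cache K and P offline
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K, P

dt = 0.02
A = np.array([[1.0, dt], [0.0, 1.0]])   # toy double integrator standing in for the linearized drone
B = np.array([[0.5 * dt**2], [dt]])
Q = np.diag([10.0, 1.0])
R = np.array([[0.1]])
K, P = lqr_precompute(A, B, Q, R)       # done once, offline

x_error = np.array([0.5, 0.0])          # online: state error (position, velocity)
u = -K @ x_error                        # online control is just a matrix-vector product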

What TinyMPC Can Do

We decided to demonstrate the constraint-handling capabilities of TinyMPC by having the Crazyflie avoid a dynamic obstacle. We achieved this by re-computing hyperplane constraints (green planes in the first video) about a spherical obstacle (transparent white ball) for each knot point in the trajectory at every time step, and then by solving the problem with the new constraints assuming they stayed fixed for the duration of the solve.
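One way to build such a hyperplane for a single knot point is to take the plane tangent to the (inflated) sphere along the direction from the obstacle center to the predicted position. The sketch below shows the general idea only, not the authors’ exact code:

import numpy as np

def separating_hyperplane(x_knot, center, radius):
    # Half-space a.x <= b that keeps x on the far side of the plane tangent to the sphere
    direction = x_knot - center
    a = -direction / np.linalg.norm(direction)   # normal pointing back toward the obstacle
    b = a @ center - radius                      # offset of the tangent plane
    return a, b

x_knot = np.array([0.4, 0.1, 0.5])   # predicted position at one knot point
center = np.array([0.0, 0.0, 0.5])   # obstacle center
a, b = separating_hyperplane(x_knot, center, radius=0.2)
assert a @ x_knot <= b               # the knot point satisfies its own constraint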

In the two videos below, the reference trajectory used by the solver is just a hover position at the origin for every time step. Also, the path the robot takes in the real world will never be exactly the same as the trajectory computed by the solver, which can easily result in collisions. To avoid this, we inflated the end of the stick (and the simulated obstacle) to act as a keep-out region.

TinyMPC is restricted to reasoning about linear dynamics because of its convex formulation. However, a simple linearization can be taken pretty far. We experimented with recovering from different starting conditions to push the limits of our linear Crazyflie model and were able to successfully recover from a 90 degree angle while obeying the thrust commands for each motor.

We recently added support for second-order cone constraints as well. These types of constraints allow TinyMPC to reason about friction and thrust cones, for example, which means it can now intelligently control quadrupeds on slippery surfaces and land rockets. To clearly demonstrate the cone constraint, we took long exposure photos of the Crazyflie tracking a cylindrical landing trajectory without any cone constraints (red) and then with a spatial cone constraint that restricts the landing maneuver to a glide slope (blue).

How To Use TinyMPC

All of the information regarding the solver can be found on our website and GitHub org (which is where you can also find the main GitHub repository). TinyMPC currently has a Python wrapper that allows for validating the solver and generating C++ code to run on a robot, and we have a few examples in C++ if you don’t want to use Python. Our website also explains how to linearize your robot and has some examples for setting up the problem with a linear model, solving it in an MPC loop, and then generating and running C++ code.

Most importantly to the Crazyflie community, our TinyMPC-integrated firmware is available and should work out of the box. Let us know if you use it and run into issues!

Our accompanying research papers:

Khai Nguyen, Sam Schoedel, Anoushka Alavilli, Brian Plancher, and Zachary Manchester. “TinyMPC: Model-Predictive Control on Resource-Constrained Microcontrollers.” arXiv preprint arXiv:2310.16985 (2023). https://arxiv.org/pdf/2310.16985

Sam Schoedel, Khai Nguyen, Elakhya Nedumaran, Brian Plancher, and Zachary Manchester. “Code Generation for Conic Model-Predictive Control on Microcontrollers with TinyMPC.” arXiv preprint arXiv:2403.18149 (2024). https://arxiv.org/pdf/2403.18149

We would love your feedback and suggestions, and let us know if you use TinyMPC for your tiny platforms!

We are happy to announce the latest updates to the Crazyflie client and Python library. Major changes include improved persistent parameter management, enhanced plotting with new x-axis manipulation features, and new default logging configurations (for PID tuning). Minor updates include bug fixes and documentation improvements.

Updated plotter tab. Besides the existing option for a number of samples, users can now set x-axis limits to a number of seconds or a time range.
Updated parameters tab. Users can now mass dump/load persistent parameters to/from a file, or clear all stored persistent parameters.

For detailed release notes, check out crazyflie-lib-python release 0.1.26 and crazyflie-clients-python release 2024.7.

Whenever we show the Crazyflie at our booth at various robotics conferences (like the recent ICRA Yokohama), we sometimes get comments like ‘ahh that’s cute’ or ‘that’s a fun toy!’. Those who have been working with it for their research know differently, but it seems that the general robotics crowd needs a little bit more… convincing! Disregarding its size, the Crazyflie is a great tool that enables users to do many awesome things in various areas of robotics, such as swarm robotics and autonomy, for both research and education.

We will be showing that off by giving a live tutorial and demonstration at the Robotics Developer Day 2024, which is organized by The Construct and will take place this Friday, 5th of July. We have a discount code for you to use if you want to get a ticket; scroll down for details. The code can be used until 12 am midnight (CEST) on the 2nd of July.

The Construct and Robotics Developer Day 2024

So a bit of background information: The Construct is an online platform that offers various courses and curriculums to teach robotics and ROS to their users. Along with that, they also organize all kinds of live training sessions and events like the Robotics Developer Day and the ROS Awards. Unfortunately, the deadline for voting in the latter has passed, but hopefully in the future, the Crazyflie might get an award of its own!

What stands out about the platform is its implementation of web-based virtual machines, called ‘ROSJects,’ where ROS and everything needed for it is already set up from the start. Anyone who has worked with ROS(2) before knows that it can be a pain to switch between different versions of ROS and Gazebo, so this feature allows users to keep those projects separate. For the ROS Developer Day, there will be about five live skill-learning sessions where a ROSject is already preconfigured and set up for the attendees, enabling them to try the tutorial simultaneously as the teacher or speaker explains the framework.

Skill learning session with the Crazyflie

One of the earlier mentioned skill learning sessions is, of course, one with the Crazyflie! The title is “ROS 2 with a Tiny Quadcopter,” and it is currently planned to be the first skill learning session of the event, scheduled at 15:15 (3:15 pm) CEST. The talk will emphasize the use of simulation in the development process with aerial robotics and iterating between the real platform and the simulated one. We will demonstrate this with a Crazyflie 2.1 equipped with a Lighthouse deck and a Multi-ranger deck. Moreover, it will also use a Qi-charging deck on a charging platform while it patiently waits for its turn :D

What we will be showing is a simple implementation of a mapping algorithm made specifically for the Crazyflie’s Multiranger deck, which we have demonstrated before at ROSCon Kyoto and in the Crazyswarm2 tutorials. What is especially different this time is that we are using Gazebo for the simulation parts, which required some skill learning on our side as we have been used to Webots over the last couple of years (see our tutorial for that). You can find the files for the simulation part in this repository, but we do advise you to follow the session first.

You can, if you want, follow along with the tutorial using a Crazyflie yourself. If you have a Crazyflie, Crazyradio, and a positioning deck (preferably Lighthouse positioning, but a Flowdeck would work as well), you can try out the real-platform part of this tutorial. You will need to install Crazyswarm2 on a separate Ubuntu machine and add a robot in your ROSject as preparation. However, this is entirely optional, and it might distract you from the cool demos we are planning to show, so perhaps you can try this as a recap after the actual skill learning session ;).

Here is a teaser of what the final stage of the tutorial will look like:

Win a lighthouse explorer bundle and a Hands-On Pass discount

We are also sponsors of the event and have agreed with The Construct to award one of the participants a Crazyflie if they win any contest. Specifically, we will be awarding a Lighthouse Explorer bundle, with a Qi deck and a custom-made charging pad similar to the ones we show at fairs like ICRA this year. So make sure to participate in the contests during the day for a chance to win this or any of the other prizes they have!

It is possible to follow the event for free, but if you’d like to participate with the ROSjects, you’ll need to get a hands-on pass. If you haven’t yet gotten a hands-on ticket for the Robotics Developer Day, please use our 50% off discount code:

19ACC2C9

This code is valid until the 2nd of July, 12 am (midnight) Central European Time! Buy your ticket on the event’s website: https://www.theconstruct.ai/robotics-developers-day/

RSS 2024 aerial swarm workshop

On a side note, we will be at the Robotics: Science and Systems Conference in Delft from July 15th to 19th, 2024—just about two weeks from now. We won’t have a booth as we usually do, but we will be co-organizing a half-day workshop titled Aerial Swarm Tools and Applications (more details on this website).

We will be organizing this workshop together with our collaborators at Crazyswarm2, as well as the developers of CrazyChoir and Aerostack2. We’re excited to showcase demos of these frameworks with a bunch of actual Crazyflies during the workshop, if the demo gods are on our side :D. We will also have great speakers, including: SiQi Zhou (TU Munich), Martin Saska (Czech Technical University), Sabine Hauert (University of Bristol), and Gábor Vásárhelyi (Collmot/Eötvös University).

Hope to see you there!

This week we have a guest blogpost by Kamil Masalimov (MSc) and Tagir Muslimov (PhD) of the Ufa University of Science and Technology. Enjoy!

As researchers passionate about UAV technology, we are excited to share our recent findings on how structural defects affect the performance of nano-quadcopters. Our study, titled “CrazyPAD: A Dataset for Assessing the Impact of Structural Defects on Nano-Quadcopter Performance,” offers comprehensive insights that could greatly benefit the Crazyflie community and the broader UAV industry.

The Motivation Behind Our Research

Understanding the nuances of how structural defects impact UAV performance is crucial for advancing the design, testing, and maintenance of these devices. Even minor imperfections can lead to significant changes in flight behavior, affecting stability, power consumption, and control responsiveness. Our goal was to create a robust dataset (CrazyPAD) that documents these effects and can be used for further research and development.

Key Findings from Our Study

We conducted a series of experiments by introducing various defects, such as added weights and propeller cuts (Figure 1), to nano-quadcopters. For the experiments, we used the Lighthouse Positioning System with two SteamVR 2.0 virtual reality stations (Figure 2).

Figure 1. Propeller with two side defects
Figure 2. Schematic of the experimental setup with Lighthouse Positioning System

Here are some of the pivotal findings from our research:

  1. Stability Impact: We observed that both added weights and propeller cuts lead to noticeable changes in the stability of the quadcopter. Larger defects caused greater instability, emphasizing the importance of precise manufacturing and regular maintenance.
  2. Increased Power Consumption: Our experiments showed that structural defects result in higher power consumption. This insight is vital for optimizing battery life and enhancing energy efficiency during flights.
  3. Variable Control Responsiveness: We used the standard deviation of thrust commands as a measure of control responsiveness. The results indicated that defects increased the variability of control inputs, which could affect maneuverability and flight precision.
  4. Changes in Roll and Pitch Rates: The study also highlighted variations in roll and pitch rates due to structural defects, providing a deeper understanding of how these imperfections impact flight dynamics.

We show Figure 3 as an example of a graph obtained from our dataset. In this figure, you can see the altitude and thrust command over time for different flight conditions. The blue line represents the normal flight, while the orange line represents the flight with additional weight near the M3 propeller. In Figure 4, you can see the 3D flight trajectory of the Crazyflie 2.1 quadcopter under the cut_propeller_M3_2mm condition with the corrected ideal path. The blue line represents the actual flight trajectory, while the red dashed line with markers represents the ideal trajectory. Figure 5 shows the Motor PWM values over time for the add_weight_W1_near_M3 condition. The plot shows the PWM values of each motor (M1, M2, M3, and M4) as they respond to the added weight near the M3 propeller.
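As an example of how the dataset can be queried, the sketch below computes the thrust-command standard deviation (the responsiveness measure mentioned above) for two flight conditions. The file and column names here are placeholders; check the CrazyPAD repository for the actual log format.

import pandas as pd

# Placeholder file and column names -- see the CrazyPAD repository for the real log format
logs = {
    "normal": "normal_flight.csv",
    "cut_propeller_M3_2mm": "cut_propeller_M3_2mm.csv",
}

for condition, path in logs.items():
    df = pd.read_csv(path)
    print(condition, "thrust command std:", df["thrust_cmd"].std())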

More examples of graphs obtained from the CrazyPAD dataset can be found in our research paper specifically describing this dataset: https://doi.org/10.3390/data9060079

Figure 3. Altitude and thrust command over time for different flight conditions
Figure 4. 3D flight trajectory of the Crazyflie 2.1
Figure 5. Motor PWM values over time

Leveraging Research for Diagnostic and Predictive Models

One of the most exciting aspects of our research is its potential application in developing diagnostic and predictive models. The CrazyPAD dataset can be utilized to train machine learning algorithms that detect and predict structural defects in real-time. By analyzing flight data, these models can identify early signs of wear and tear, allowing for proactive maintenance and reducing the risk of in-flight failures.

Diagnostic models can continuously monitor the performance of a UAV, identifying anomalies and pinpointing potential defects. This real-time monitoring can significantly enhance the reliability and safety of UAV operations.

Predictive models can forecast future defects based on historical flight data. By anticipating when and where defects are likely to occur, these models can inform maintenance schedules, ensuring UAVs are serviced before issues become critical.

Why This Matters for the Crazyflie Community

The CrazyPAD dataset and our findings offer valuable resources for the Crazyflie community. By understanding how different defects affect flight performance, developers and enthusiasts can improve design protocols, enhance testing procedures, and ensure higher safety and performance standards for their UAVs.

We believe that sharing our research with the Crazyflie community can lead to significant advancements in UAV technology. The dataset we created is open under the MIT License for further exploration and can serve as a foundation for new innovations and improvements.

Get Involved and Explore Further

We invite community members to explore our full research article and the CrazyPAD dataset. Together, we can drive forward the standards of UAV technology, ensuring that Crazyflie remains at the forefront of innovation and excellence.

Our research paper with a detailed description of this dataset:

Masalimov, K.; Muslimov, T.; Kozlov, E.; Munasypov, R. CrazyPAD: A Dataset for Assessing the Impact of Structural Defects on Nano-Quadcopter Performance. Data 2024, 9, 79. https://doi.org/10.3390/data9060079

Dataset:  https://github.com/AerialRoboticsUUST/CrazyPAD

We are eager to collaborate with the Crazyflie community and welcome any feedback or questions regarding our research. Let’s work together to push the boundaries of what’s possible in UAV technology.

Two weeks ago, Arnaud, Barbara and Rik were at ICRA 2024 in Yokohama. At our booth we showed our current products as well as the upcoming brushless Crazyflie and the camera deck prototype.

As usual, the conference was very busy, with a lot of visitors and a lot of very interesting discussions. Thanks to everyone who passed by the booth; we have come back to Sweden with a lot of energy and new ideas!

The autonomous lighthouse swarm demo worked pretty well. If you are interested in knowing more about it, you can visit our event page. It is an autonomous demo with 3 brushless Crazyflies and 6 Crazyflie 2.1s flying autonomously. With the extended battery life of the brushless, we could operate the demo pretty much continuously.

If you’ve been at the conference, you may have spotted someone proudly sporting our exclusive ‘Bitcraze took my poster’ button. We’re excited to have received posters covering a wide range of topics; the walls of our office are eagerly awaiting these visual representations of your hard work and dedication. Thank you to everyone who has contributed.

One of the great features of the stock Crazyflie 2.1 is that it is more or less harmless. The Crazyflie 2.1 Brushless weighs roughly the same but has almost twice the thrust, so a little more care is needed. We therefore decided to provide optional propeller guards. While propeller guards add safety, they also add weight and disrupt the airflow from the propellers. On top of that, the weight is located far from the center, which increases the inertia even further, resulting in a less agile drone. For some applications this is not a problem, but for others it is; that is why we are making them optional, meaning they are easy to replace with simple landing legs thanks to a snap-on fitting.

The design is not fully finalized yet, but we are getting close. Voilà!

If the design goes according to plan, they will also withstand some bumping against walls, which will be a very nice feature for many applications.

Furthermore, the landing legs and propeller guards are designed so that they will detach during high-force impacts to prevent the PCB arms from breaking.

“What? You are in Japan? Again!?”. Yup that is right! We loved IROS Kyoto 2022 so much that we just couldn’t wait to come back again. Barbara, Arnaud and Rik are setting up the booth as we speak to show some Bitcraze awesomeness to you! Come and say hi at booth IC085.

The gang before the rush starts!

Crazyflie Brushless and Camera expansion

Of all the prototypes, we are most excited about showing you the Crazyflie Brushless and the ‘forward-facing expansion connector prototype’, a.k.a. the Camera deck. Here you can see them both in action at a tryout of our demo. We have also written blogposts about both, so make sure to read them as well (Brushless blogpost, Camera expansion blogpost).

The Crazyflie Brushless flying with a Camera deck.

We will also explain the contact charging prototype (see the blogpost here) and will be showing all of our decks at the booth as well. And of course our fully autonomous, onboard, decentralized, peer-to-peer, collision-avoiding swarm demo will be on display as always. Make sure to read this blogpost from when we showed this demo at IROS 2022 to understand what is fully going on!

Also take a look at our event page of the ICRA 2024 demo.

Hand in your Crazyflie posters at our booth!

We will be providing a ‘special disposal service’ for your conference poster! We would love to see what you are working on and take your poster home, because our updated office/flight space has plenty of room but also a lot of empty walls.

If you hand in your poster at the booth, you’ll get a special, one-of-a-kind, button badge that you can wear proudly during the conference! So we will see you at booth IC085!

The ‘Bitcraze took my poster’ button!

A great feature of the Crazyflie is its ability to keep evolving, both by using software but also through hardware expansions. Hardware expansions allow us and our users to keep exploring new problems and doing new experiments, without having to change the flying base. Over the years lots of new decks have been released and we’ve seen lots of users building their own custom electronics and attaching it to their Crazyflies. Although very versatile, the current deck system is limited to up/downwards facing expansions. Adding electronics that face forward, like a camera, has been harder and has required additional mechanics. Over the last couple of months we’ve been experimenting with a new expansion connector aiming at solving this issue. The idea is to be able to add a new class of expansions facing forward. This week’s blog post and next week’s developer meeting are about these experiments.

Goals and design

We’re always trying to find ways to make our platform more versatile, making it easier to expand and to be used in new ways. So we’ve been looking for a new way to expand the platform even more, this time with electronics facing forward instead of up/down. The goal is to easily be able to add things such as cameras, ranging sensors, etc. Making a custom deck with custom mechanics for each sensor hasn’t been a good solution: it takes lots of time and it doesn’t enable our users to do their own custom electronics. Finding a generic solution is hard since we’re constrained both in space and in weight. We need a solution that is very small and light, since each added gram cuts into the flight time. The solution also needs a way to handle the data generated by cameras/sensors, and possibly to stream it over a faster connection than the Crazyradio.

Our current prototype is made of two parts: a new deck with WiFi and more computational power, and several smaller expansions which can be added to it. The expansions fit straight into a small right-angle connector, making it easy to change boards. The current connector we’re testing has 30 positions, of which 6 are used for power and 1 for 1-wire, leaving 23 pins for signaling. The 1-wire works the same way as on our current decks: added hardware is auto-detected at startup and can be used without recompiling or reconfiguration.

The current prototype uses an ESP32-S3 and weighs in at 3.7 grams. Added to this there are a number of expansions that we’re evaluating:

  • OV2640 + VL53L5CX: RGB camera and ranging sensor (1.6 grams)
  • Flir Lepton 3.5: Thermal camera (2.1 grams)
  • MLX90640: Thermal camera (2.0 grams)

So the current prototype with the RGB camera (and ranging sensor) weighs in at a total of 5.3 grams (0.9 grams more than the AI deck).

Current status and continuation

We’re currently experimenting with connectors, modules and dimensions. In the coming months we will try to get more flight time to test the solution and we’re hoping to get some feedback from our users. So please post any comments and/or suggestions you might have.

If you’re interested in knowing more and discussing this then join our developer meeting next week on Wednesday. We will also be showing off the prototypes at ICRA, so make sure to swing by the booth if you’re attending.

This week we have a guest blogpost by Christian Llanes, a Robotics PhD from the Formal Methods & Autonomous Control of Transportation Systems Lab of the Georgia Institute of Technology. Enjoy!

Why do we need simulators?

Simulators are one of the most important tools used in robotics research. They are usually designed for different purposes with different levels of complexity. For example, simulators with low computational overhead that are parallelizable are mainly used either for training reinforcement learning algorithms or for Monte Carlo sampling to verify task completion in a nondeterministic environment. Some simulators also use rendering engines for the graphical display of models and the environment, or when cameras are intended to be used on the robotics platform. Simulation is also useful for the development and deployment of new robotics firmware features, where the firmware is compiled on a test machine and run in the loop with a simulated sensor suite. This simulator configuration is known as software-in-the-loop (SITL) because the vehicle firmware is intended to be run in the loop with the simulated vehicle physics and/or rendering engine. This feature is supported by autopilot suites such as PX4, ArduPilot, CogniPilot, and BetaFlight. It is not officially supported yet for Crazyflies because it requires a large overhaul of the firmware to be able to compile on a desktop machine and interact with different simulators such as Gazebo, Webots, PyBullet, CoppeliaSim, Isaac Sim, or Unreal Engine.

CrazySim

Last summer I began working with Crazyflies and noticed this Crazyflie simulator gap. I stumbled on a community-developed project for Crazyflie SITL called sim_cf. This project is exactly what I was looking for. However, the firmware used by the project is from July 2019 and the official firmware has had over 2000 commits made since then. The project also uses ROS 1, Gazebo Classic, and doesn’t support the Crazyflie Python library (CFLib). Using this project as a starting point I set out to develop CrazySim–a Crazyflie SITL project that doesn’t require ROS, uses Gazebo Sim, and supports connectivity through CFLib. Using CFLib we can connect the simulator to external software such as Crazyswarm2 or the Crazyflie ground station client. Users test their control algorithms in the external software using the simulator interface before deploying to real flight hardware.
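Because the simulated firmware is reachable through CFLib, the same script can drive either a simulated or a real Crazyflie just by changing the URI. Here is a minimal sketch; the UDP URI below is an assumption, so check the CrazySim documentation for the actual address and port the simulated firmware listens on.

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = 'udp://0.0.0.0:19850'   # assumed CrazySim address; a real drone would use e.g. 'radio://0/80/2M/E7E7E7E7E7'

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with MotionCommander(scf, default_height=0.5) as mc:
        mc.forward(0.5)      # fly half a meter forward
        mc.turn_left(90)     # then yaw 90 degrees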

An example of offboard model predictive control design and deployment workflow using CrazySim.

Using the Crazyflie Client for PID Tuning

We have also provided a modified Crazyflie client for CrazySim support. The Crazyflie client is a cool tool for testing a single drone in hardware. We can perform command based flight control, look at real time plots, save log data, and tune PID values in real time. The PID values are typically tuned for an out of the box Crazyflie. However, when we modify the Crazyflie and add extra weight through batteries, decks, and upgraded thrust motors then the behavior of the Crazyflie will change. If a user wants to tune a custom Crazyflie setup, then they can add additional models in this folder with their own motor and mass properties. Then they just need to add it to the list of supported models in either of the launch scripts. There is already an example model for the thrust upgrade bundle. Documentation for installing the custom client can be found here.

PID tuning a simulated Crazyflie using CrazySim on the Crazyflie PC client.

Crazyswarm2

We can now connect to the simulated Crazyflie firmware using CFLib. Therefore, we can set up a ROS 2 interface through Crazyswarm2 for swarm command and control through ROS 2 topics and services. To do this, we first start up the drones using any of the launch scripts.

bash tools/crazyflie-simulation/simulator_files/gazebo/launch/sitl_multiagent_square.sh -n 16 -m crazyflie

Then, we bring up Crazyswarm2 after setting up the configuration file for the number of drones chosen.

ros2 launch crazyflie launch.py backend:=cflib

We demonstrate an example of how we can control a swarm of drones using Crazyswarm2 GoTo service commands.
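For reference, a minimal GoTo script with the crazyflie_py helper that Crazyswarm2 provides looks roughly like the sketch below. Method names follow the Crazyswarm2 examples; treat the details as an assumption and check the Crazyswarm2 documentation.

import numpy as np
from crazyflie_py import Crazyswarm

swarm = Crazyswarm()
timeHelper = swarm.timeHelper
allcfs = swarm.allcfs

allcfs.takeoff(targetHeight=1.0, duration=2.0)
timeHelper.sleep(2.5)

# Send each Crazyflie one meter above its initial position
for cf in allcfs.crazyflies:
    goal = np.array(cf.initialPosition) + np.array([0.0, 0.0, 1.0])
    cf.goTo(goal, 0.0, 3.0)   # goal, yaw (rad), duration (s)
timeHelper.sleep(3.5)

allcfs.land(targetHeight=0.04, duration=2.0)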

Crazyswarm2 GoTo service commands using CrazySim.

ICRA 2024

CrazySim is also being presented as a paper at the 2024 IEEE International Conference on Robotics and Automation in Yokohama, Japan. If you are attending this conference and are interested in this work, then I invite you to my presentation; afterwards, let me know that you found it through this blog post. For the paper, I created a multi-agent decentralized model predictive control (MPC) case study on ROS 2 to demonstrate the CrazySim simulation-to-hardware deployment workflow. Simulating larger swarms with MPC may require a high-performance computer. The simulations in this work were performed on an AMD Ryzen 9 5950X desktop processor.

Model predictive control case study for ICRA 2024 paper.

Links

  1. CrazySim
  2. Modified Crazyflie client

Other Crazyflie SITL projects:

  1. sim_cf
  2. sim_cf2 blog post
  3. LambdaFlight blog post