
This week we wanted to reflect on the progress that has been made lately in the Crazyflie ecosystem, which will lead to bigger and better Crazyflie swarms.

Radio communication

As pointed out in the last blog post about Building a Crazyflie Flower Swarm with Rust, the new Rust Crazyflie library, together with the new Crazyradio 2.0, has improved connection time and link efficiency by quite a bit.

It is now possible to connect swarms of several dozen Crazyflies in seconds using a single radio, and then fly them while still receiving position telemetry. So many Crazyflies on one radio does limit the maximum bandwidth per Crazyflie, but it now works in a stable way!

Color LED deck

The recently released Color LED deck is a great addition to the ecosystem for swarms. Its predecessor, the LED-ring deck, has been used a lot by researchers to indicate the state of individual Crazyflies in a swarm. The Color LED deck improves on that by providing a diffuser that makes the color visible from the side. This makes it much easier to clearly mark the states of big groups of Crazyflies.

As a bonus, the Color LED deck is very usable in other fields like art and shows, since it is much more visible and can be used to fly Crazyflies as “Flying Pixels”.

Autonomous landing and charging

Last year, we released the Crazyflie 2.1 Brushless charging dock. This is a productized version of an idea we have been using for years at fairs and conferences with the Crazyflie 2.1 and the Qi deck. It allows Crazyflies to autonomously land and charge. It is not only great for autonomous drone demos and shows, it is also a great waiting spot for swarms when doing research: the charging dock keeps the swarm charged so that, when it is time to take off, all the individuals start with the same battery level.

Future endeavors

On the radio side, there are still areas where improvements would greatly benefit communication stability. For example, we are working on a channel-hopping communication protocol that should make the connection largely immune to typical interference on 2.4 GHz.

We are also working on improving other parts of swarm management. This includes, for example, solving the problem of flashing a full swarm of Crazyflies with the same firmware: we may be able to make more use of broadcast messages to drastically speed up the process, instead of flashing the Crazyflies one by one.

Overall, working on bigger swarms allows us to work on the full stack and to make the Crazyflie a better drone for everybody.

Bitcraze will exhibit at the European Robotics Forum 2026, March 23-27, in booth #90, where we will demonstrate a live, autonomous indoor flight setup based on the Crazyflie™ platform. The demonstration features multiple nano-drones flying autonomously in a controlled environment and reflects how the platform is used in research and applied robotics development.

Why Indoor Aerial Testbeds Matter

The purpose of the demonstration is not the flight itself, but the role such setups play in validating aerial robotics concepts. Indoor, small-scale aerial systems allow researchers and R&D teams to study autonomy, perception, control, and multi-robot coordination under safe and repeatable conditions. This makes it possible to explore system behavior, test assumptions, and iterate rapidly before moving to larger platforms or less controlled environments.

Applicable in Both Academia and Industrial R&D

Bitcraze is used both in academic research and in industrial R&D contexts. In academia, the platform supports experimental work in areas such as swarm robotics, learning-based control, and human–robot interaction, and has been referenced in hundreds of peer-reviewed research papers worldwide. In industry, similar setups are increasingly used as testbeds to de-risk development by validating ideas indoors before scaling to outdoor testing, larger drones, or other robotic systems that require higher investment and operational complexity.

Hands-on Discussions at the Booth

At the booth, the live flight cage will be complemented by hands-on access to additional drones, expansion decks, and software tools. This allows for technical discussions around hardware architecture, sensing and positioning options, software stacks, and how different configurations support different research or development goals.

The Conversations We Are at ERF to Have

At ERF, Bitcraze is there to engage in conversations about platforms, testbeds, and how ambitious aerial robotics ideas can be validated in a financially responsible, safe, and controlled manner. This includes discussions with academic groups, industrial R&D teams, and project partners working across the research-to-application spectrum.

Looking forward to the discussions in Stavanger in booth #90!

Send us a message at contact@bitcraze.io to book a meeting at the show!

The Crazyflie™ Color LED deck is a high-powered, fully programmable RGB(W) lighting expansion for the Crazyflie 2.x platform, and it’s now available in the shop.

It delivers bright, diffused, and uniform light suitable for research, teaching, vision experiments, and indoor drone choreography. The deck mounts on top or bottom of the Crazyflie and integrates seamlessly through our open-source firmware and I²C-based deck interface.

Two Versions, Same Electronics

The Color LED deck is available in two distinct versions, each sold as a separate product.

Top-Mounted Color LED Deck

Designed to be placed on top of the Crazyflie. Ideal for scenarios where the drone is viewed from above or when customers need to use positioning or sensor decks underneath the Crazyflie—such as the Flow Deck or other bottom-mounted modules.

  • Works well with motion capture (MoCap) systems using ceiling-mounted cameras.
  • Not recommended with the Lighthouse positioning deck, due to optical occlusion and interference with the lighthouse sensors.

You can find the top-mounted version in the store here.

Bottom-Mounted Color LED Deck

This version is suitable for almost all use cases, offering maximum visibility from below and minimal interference with other decks.

  • Ideal for Lighthouse positioning (no optical obstruction).
  • Ideal for MoCap positioning, especially when cameras view from multiple angles.

You can find the bottom-mounted version in the store here.

Dual Mounting for Maximum Visibility

Additionally, two Color LED decks can be mounted both above and underneath the Crazyflie simultaneously, creating a strong, uniform light signature visible from all directions. This is best for MoCap environments where multi-angle visibility improves marker/camera performance.

You can see all three variants of the Color LED in action in our latest Christmas video, created in collaboration with Learning Systems and Robotics Lab:

Each color LED deck variant operates independently, allowing the top and bottom decks to be configured with different colors if desired. While all variants share the same electronics, diffuser, and firmware behavior, their physical mounting positions let you choose the setup that best fits your lab, show environment, or positioning needs.

Diffuser now available

While designing the Color LED deck, we also created a light diffuser—now available in the shop as a standalone product. It is designed to be compatible with the LED-ring deck, spreading and softening the light, extending visibility and improving appearance.

You can find the light diffuser in the store here.

The Color LED deck is available now in both versions. Head to the store to order, or contact us for a quote!

Today, we rejoin Maurice Zemp, who presented his work in an earlier blog post.

Road to the Finals

I had officially completed my Matura thesis in October 2024 and submitted it to the Schweizer Jugend forscht competition. When I was selected for the semifinals, I was given the chance to present my work in front of a jury. Their feedback was highly constructive and came with clear requirements: for the finals, I would need to provide more in-depth analyses of the individual system components of my project. At first, this felt like a challenge, but in the process I realized how much these refinements elevated my research. By the time the finals approached, I felt both nervous and proud, knowing that the work I would present had grown far beyond the version I had initially submitted. On April 24, 2025, the big moment finally arrived – the start of the national finals.

Fig. 8: The Location of the semifinals

Day 1

The day began with my journey to ETH Zurich. Traveling by public transport, I carried my Crazyflie drone and the racing gate with me – equipment that had accompanied me throughout countless hours of development and testing, and which I hoped would make my project a bit easier to grasp. Arriving at ETH, I was greeted warmly at the reception, where I first felt a sense of belonging among dozens of passionate and curious young scientists.

Fig. 9: My booth at ETH

The morning was dedicated to setting up our booths. Piece by piece, the exhibition hall transformed into a vibrant space filled with prototypes, posters, and creative ideas. Once my own stand was ready, I finally had a moment to take in the atmosphere and to start the first conversations. In the afternoon, we were treated to a guided city tour through Zurich. Walking through the old streets, hearing stories about the city, and enjoying the fresh air was the perfect opportunity to get to know the other participants better.

Fig. 10: The Limmat (River in Zurich)
Fig. 11: The Grossmünster church in Zurich

Later that day, alumni of Schweizer Jugend forscht visited the exhibition. For the first time, I had the chance to present my project outside of the jury context, and I was surprised by the interest and thoughtful questions I received.

By the time we arrived at our youth hostel late in the evening, the excitement of the day had fully caught up with me. Exhausted but exhilarated, I fell into bed.

Day 2

The second day began with breakfast at the youth hostel, followed by a short tram ride back to ETH Zurich. The morning program was dedicated to the jury sessions, which represented one of the most important parts of the entire competition. Unlike in the semifinals, where I just explained my project and was asked some general questions, this time I was able to discuss my project in detail with several experts – including those specializing in fields beyond my own topic.

Fig. 12: The ETH Main Hall filled with interesting projects

These conversations quickly turned into fascinating discussions. The jurors asked insightful questions, challenged certain assumptions, and encouraged me to think more deeply about the potential applications of my work. At the same time, I received a great deal of praise, which was both reassuring and motivating. It was incredibly rewarding to see that months of effort, refinement, and problem-solving were being recognized by experienced professionals.

In the afternoon, the doors of ETH opened to the public for the exhibition. Friends, family members, and curious visitors from outside the competition came to explore the stands. Presenting my project in this setting felt very different from the formal jury discussions of the morning – it was more relaxed, conversational, and filled with spontaneous questions. I especially enjoyed seeing how people unfamiliar with drone technology reacted to the project, and it gave me the chance to practice explaining complex ideas in a way that was accessible to everyone. After such a full day of interactions, we returned to the youth hostel in the evening. The atmosphere there was much calmer, as everyone tried to recharge a little energy in preparation for the final day.

Fig. 13: Me explaining my project to a friend of mine who came to visit me

Day 3

The final day once again began with our journey to ETH Zurich. In the morning, the exhibition hall opened its doors for a second round of public visits. This time, the experience was especially meaningful for me, as my family came to see my project in person.

After lunch, it was finally time for the highlight of the competition: the award ceremony. A live band set the stage, and soon the opening speeches began. The tension in the room was almost tangible – every participant knew that months, if not years, of work were culminating in this single event. I felt both nervous and excited, my heart beating faster with each passing moment.

Fig. 14: The start of the award ceremony
Fig. 15: My nomination to ISEF 2026!

Then came an unexpected twist: even before the regular prizes and certificates were announced, the jury revealed the winners of the most prestigious special awards. To my immense joy, my name was called. I had been selected to represent Switzerland at the International Science and Engineering Fair (ISEF) 2026 in Phoenix, Arizona. The sense of relief, excitement, and pride I felt in that moment is difficult to describe – it was a dream come true.

The ceremony continued with an inspiring keynote by former NASA Director Thomas Zurbuchen, who shared his journey in science and reminded us of the importance of perseverance and never giving up.

Fig. 16: An inspiring talk by Thomas Zurbuchen
Fig. 17: Me being awarded my distinction

Finally, the time came for the official certificates. One by one, every participant was called to receive their recognition. When my turn came, I was awarded the highest possible distinction: hervorragend (outstanding), honored with CHF 1500. The applause and congratulations that followed made the moment even more unforgettable.

The evening concluded with an apéro, where I had the chance to exchange thoughts with professors, fellow participants, and many guests. I was overwhelmed by the warm words of encouragement and congratulations I received for both my project and the recognition it had achieved.

After three exciting, inspiring, and at times exhausting days, it was finally time to return home – this time together with my parents, carrying not only my luggage but also an experience that I will cherish for a lifetime.

Last week, Bitcraze attended the BETT Show in London to get a better sense of how the education landscape is evolving.

BETT (British Educational Technology Show) brings together educators, edtech companies, curriculum developers, policymakers, and technology providers across the full spectrum of learning: from primary school to higher education and professional training.

For us, it was a valuable opportunity to listen and get an understanding of where the general EDU landscape is and where it is heading.

Meeting Familiar Faces, and New Ones

One of the most rewarding parts of the visit was reconnecting with existing partners already using the Crazyflie™ in educational settings, and meeting new potential collaborators: teachers building robotics programs, universities modernising their lab infrastructure, and organisations developing national STEM (Science, Technology, Engineering, and Mathematics) and STEAM (Science, Technology, Engineering, Art, and Mathematics) initiatives.

A recurring theme in many conversations was the need for platforms that are robust and safe to use in classrooms, scale from simple programming exercises to advanced autonomy and AI, support both structured teaching and open-ended experimentation, and are well documented (both for the teacher and for the student).

These are exactly the problems we have spent more than a decade working on.

What the Education Robotics Market Looks Like Today

Speaking with a wide range of robotics vendors, software providers, and solution integrators gave us a clearer picture of the realities of the K-12 and STEM market:

  • Procurement is often tender-based and highly structured
  • Budgets are tight and price sensitivity is real
  • There are many vendors offering similar-looking robotics kits
  • Hardware must be physically robust and classroom-proof, and safety is critical
  • Programming is dominated by Python, Scratch, Blockly, or proprietary visual tools
  • “AI-enabled” frequently means GPT-style programming blocks layered on top
  • LEGO compatibility is everywhere
  • micro:bit has effectively become a compelling entry-level control board
  • Buyers apply hard scrutiny to educational value and learning outcomes
  • Real adoption requires curricula, lesson plans, and teacher training programs
  • And in practice, U.S.-developed curricula often transfer reasonably well globally

Why the Crazyflie is a Great Fit for Education

Although the Crazyflie originated as a research platform, its characteristics map naturally to education:

STEM / STEAM (Upper Secondary & High School)

Students can work hands-on with control systems, sensors, wireless communication, programming, and basic AI in a physical system they can see, debug, and iterate on. It makes abstract concepts tangible.

Undergraduate Education

Crazyflie is increasingly used in robotics, embedded systems, and mechatronics courses to teach estimation, control, perception, and multi-agent systems without the overhead of large and expensive hardware.

Post-graduate Research

This remains our strongest domain: swarm robotics, learning-based control, human–robot interaction, indoor navigation, and distributed systems.

The continuity matters. Students don’t outgrow the platform. They grow with it. And, more importantly, the same openness that researchers value is increasingly relevant in education as well, particularly in light of recent geopolitical shifts. Institutions want transparency, long-term maintainability, and the freedom to adapt tools to their pedagogy, not just consume closed kits.

Education is a Strategic Part of the Robotics Ecosystem

BETT confirmed that education is a strategic and structured part of the robotics ecosystem. Not just as “learning about robots”, but as a way to train future engineers, researchers, and system designers using realistic platforms from an early age.

Succeeding in this segment requires more than good hardware. It requires thoughtful packaging, clear educational positioning, proper teaching material, partner ecosystems, and long-term commitment.

To those we met at BETT, thank you for the conversations. And if you are working with STEM, STEAM, or robotics education and are curious about the Crazyflie, we are always happy to talk.

Today, we welcome our first blog post by Maurice Zemp. Stay tuned for more of his adventures later!

When I started working on my Matura thesis (a mandatory project in Swiss high school), I wanted to create something that went beyond a purely theoretical project. I was fascinated by the idea of combining cutting-edge technology with a very tangible and exciting challenge: making a small drone fly through a racing course, composed of small gates, completely on its own and as fast as possible. Inspired by the work of Prof. Davide Scaramuzza’s group, I set out to find new alternatives or improvements while developing my approach from scratch.

What may sound simple at first quickly turns into a highly complex task. A drone needs to perceive its environment, process information in real time, and decide on precise actions within fractions of a second, all without human intervention. Professional drone racing pilots train for years to master this level of control. My goal was to see whether artificial intelligence could achieve something similar, using reinforcement learning as the core technique.

But there was another challenge: I wanted to design a system that wasn’t only powerful, but also affordable and reproducible. Many research institutions use equipment worth tens of thousands of francs for projects like these. I asked myself: Could I build something comparable with a fraction of the budget, and still push the boundaries of what’s possible?

That question became the driving force behind my project, which later brought me all the way to the finals of Schweizer Jugend forscht (SJf for short) and, as the main prize, earned me a place representing Switzerland at the world’s biggest youth science competition, ISEF 2026. Over the following sections, I’ll share how I built my system step by step, what it was like to present it at the competition, and how it felt when all the effort finally paid off.

Fig. 1: Me with the Crazyflie 2.1 Brushless in the halls of ETH Zurich, where the main event of SJf took place.

Motivation and Objectives

The project I presented at Schweizer Jugend forscht was the result of my Matura thesis, in which I set out to combine my interests in drones, programming, and artificial intelligence. My goal was to develop a complete system for autonomous drone racing, based on Reinforcement Learning (RL), that would not only work in simulation but could also be transferred to real-world conditions.

To achieve this, I focused on three key aspects:

  1. Building a highly efficient simulation environment for training a reinforcement learning agent.
  2. Developing a cost-effective motion-capture system (MoCap) capable of tracking a drone’s position and orientation in real time with high precision.
  3. Integrating both systems in a way that would allow seamless transfer from simulation to real-world experiments with minimal latency.

This combination made the project unique: instead of relying on expensive commercial hardware, I set out to create a solution that would be precise and affordable while still pushing the state of the art in drone racing.

Simulation Environment

The simulation was implemented in Python, using Stable Baselines, Gymnasium (the successor to OpenAI Gym), and NumPy, accelerated with Numba for performance. At its core, the system employed Proximal Policy Optimization (PPO), a state-of-the-art reinforcement learning algorithm known for its stability and efficiency.

Unlike general-purpose simulators such as Gazebo, my environment was designed specifically for drone racing. It could process tens of thousands of interactions per second, enabling a training run of a few dozen minutes to correspond to nearly a year of simulated flight time.

Key features included:

  • A physics-based flight dynamics model accounting for thrust, drag and gyroscopic effects.
  • A carefully engineered reward function balancing speed, precision, and shortcut avoidance.
  • A flexible design that allowed different gate sequences and drone parameters to be tested.
  • A multi-agent environment computing on multiple threads, greatly reducing the required training time.

Fig. 2: Path of the drone after two hours of training on a given track

With this setup, an RL agent could learn to complete arbitrary racing tracks in near-optimal time[1] after only a few hours of training. In simulation, top speeds of up to 100 km/h were achieved, though these exceeded the physical limits of the real drone and were generated with modified drone parameters.
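
As a flavor of the reward shaping mentioned above, here is a minimal sketch of a racing reward in Python. It is not the thesis implementation: the weights and the simple progress/precision terms are made up for illustration, and the shortcut penalty is omitted for brevity.

```python
import numpy as np

def racing_reward(pos, prev_pos, gate_pos,
                  gate_radius=0.15, w_progress=1.0, w_precision=5.0):
    """Illustrative racing reward (made-up weights, not the thesis code):
    reward progress toward the next gate each step, plus a bonus for
    passing close to the gate center."""
    d_prev = np.linalg.norm(gate_pos - prev_pos)
    d_now = np.linalg.norm(gate_pos - pos)
    reward = w_progress * (d_prev - d_now)      # speed: closing distance
    if d_now < gate_radius:                     # precision: near the center
        reward += w_precision * (gate_radius - d_now)
    return reward
```

A real racing reward would add terms for collisions, shortcut avoidance, and smoothness, and would be tuned against the observed training behavior.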

Motion-Capture System

A second cornerstone of the project was the development of a low-cost motion-capture system. Instead of relying on high-end solutions such as VICON or OptiTrack (which can cost tens of thousands of Swiss francs), I built a custom setup. The drone – a Crazyflie 2.1 Nanocopter (later a Crazyflie 2.1 Brushless) – was fitted with infrared diodes. With four cameras capturing at 120 frames per second, the drone’s position was calculated in real time through triangulation. By arranging three diodes on the drone, I wanted to estimate not only the position but also the orientation. Unfortunately, because the cameras were budget models without high-end resolution, estimating the orientation was not feasible, so it was taken from the onboard IMU instead.
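
The triangulation step itself can be sketched with the standard linear (DLT) formulation. The camera matrices below are synthetic stand-ins for illustration, not the project's actual calibration:

```python
import numpy as np

def triangulate(Ps, pts):
    """Linear (DLT) triangulation: given 3x4 projection matrices Ps and
    the matching 2D detections pts, find the 3D point X minimizing the
    algebraic error via SVD."""
    rows = []
    for P, (u, v) in zip(Ps, pts):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.array(rows))
    X = Vt[-1]
    return X[:3] / X[3]                     # dehomogenize

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic example: two cameras with the same intrinsics, one meter apart.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, 0.1, 3.0])          # a point 3 m in front
X_est = triangulate([P1, P2], [project(P1, X_true), project(P2, X_true)])
```

With real detections, the rows would come from four cameras instead of two, which over-determines the system and averages out pixel noise.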

Fig. 3: Motion-Capture-System Concept

System Integration

For integration, I relied on the ROS2 middleware and the Crazyswarm2 framework, which allowed the simulation and MoCap data to be processed together with minimal latency. This setup ensured that a policy trained in simulation could be executed on the real drone almost seamlessly.

Fig. 4: Sim2Real Integration Concept

Results & Discussion

The results demonstrated the effectiveness of the combined system:

  • In simulation, the RL agent completed tracks in nearly optimal time[1], demonstrating robust generalization across different gate sequences.
  • The motion-capture system delivered millimeter-level accuracy and reliable tracking in real time despite its simplicity. (see Fig. 5)
  • In real-world tests, the Crazyflie drone successfully completed tight gate sequences, even at speeds of up to 25 km/h, with a positional deviation of only 5–12 cm compared to the simulated trajectories. (see Fig. 6)

Given that the gates were roughly A3-sized openings (38 × 29 cm), the precision was sufficient to consistently hit every gate, confirming the feasibility of transferring simulated training into real-world racing. (Video)

Fig. 5: Boxplot of the deviation for collected points on a plane in millimeters
Fig. 6: Tracked Real Life Path on a given track
Fig. 7: Realisation in my school’s physics laboratory

The system demonstrated that RL-based controllers can generalize effectively from simulation to reality, even with imperfect models. This robustness shows that RL is promising for real-world robotics applications where exact physical modeling is difficult or costly. To further investigate this robustness, I conducted ablation studies in simulation focusing on aerodynamic drag, as well as analyses of the MoCap system’s accuracy in measuring dynamic motion.

  • As for the aerodynamic drag, I introduced randomness during evaluation that deviated substantially from what the algorithm had been trained on. To do this, I introduced a parameter k that was multiplied with the correct aerodynamic drag, to see how these differences would impact the flight. In Fig. 8 you can see the deviation from the real path over the course of a flight. Since more drag results in lower velocities, the two paths had to be aligned using the nearest two possible points. The results are impressive: even with an enormous change from training, such as k = 10, the flight still performed reasonably well, even though the gates probably would no longer have been hit.
  • For the MoCap system, I used a mathematically representable motion: a pendulum. I then performed a parameter optimization on this mathematical model and measured the deviation as a function of the current speed. There is a clear correlation between speed and deviation (see Fig. 9), most likely caused by the missing calibration of the cameras.

Fig. 8: Results of ablation studies on aerodynamic drag
Fig. 9: Experiment to confirm correlation between deviation and speed
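
The pendulum validation can be sketched as a one-parameter least-squares fit. Everything here is synthetic (made-up length, noise level, and a simple grid search standing in for whatever optimizer was actually used):

```python
import numpy as np

g = 9.81

def pendulum_x(t, L, theta0=0.2):
    """Small-angle pendulum: horizontal displacement of the bob."""
    return L * theta0 * np.cos(np.sqrt(g / L) * t)

# Synthetic "MoCap" track of a pendulum with true length 0.8 m,
# sampled at 120 fps and corrupted by millimeter-level noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 480)
measured = pendulum_x(t, 0.8) + rng.normal(0.0, 1e-3, t.size)

# Fit the length by grid search on the mean squared error.
lengths = np.linspace(0.5, 1.1, 601)
errors = [np.mean((pendulum_x(t, L) - measured) ** 2) for L in lengths]
L_fit = float(lengths[int(np.argmin(errors))])

# The per-sample residuals can then be binned by the model's speed
# to look for a speed/deviation correlation, as in Fig. 9.
residuals = np.abs(pendulum_x(t, L_fit) - measured)
```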

Conclusion and Reflection

This work demonstrates that RL, combined with accessible hardware, enables precise and robust autonomous flight in dynamic indoor environments. The results underscore the potential of low-cost robotics solutions to democratize drone research. Even though it was a large project and there were some hard times, I enjoyed working on it a lot and believe that the result and the memories made along the way are even more rewarding than any prize money!

Acknowledgments

To end this guest post, I want to sincerely thank Bitcraze for their amazing work and support during the development of this project. They have genuinely built such an incredible testbed for research in autonomous drones, it’s amazing! Without them, this project wouldn’t have worked out the way it did!

  1. Near-optimal time here refers to the theoretical limit of a drone completing this track, given its parameters. To be clear, this is not fitting a polynomial with boundary conditions on its derivatives onto gate segments, as was done in earlier approaches in the 2010s. It was compared to multiple previous approaches by the UZH RPG, such as Optimal Control (OC). Nevertheless, this comparison could be done more extensively in the future, as time ran short in the end. ↩︎

Since the end of 2024, we’ve been putting effort into spreading out our manufacturing. With international trade rules rapidly changing, it felt like the right moment to expand our production footprint. Doing this helps us keep stock more stable, react faster to demand, and makes life easier for you when it comes to potential import-related costs.

To make it happen, we’ve been working closely with our long-time partner in China, Seeed Studio. They’ve been helping us move the production of some of our items to Vietnam, where exciting new opportunities have opened up. This way, we can keep the same quality and reliability you’re used to while spreading out production across more locations, which makes our supply chain stronger.

Right now, four of our products are being made in Vietnam: the Crazyflie 2.1+, the Crazyflie 2.1 Brushless, the Flow Deck, and the Crazyradio 2.0. Meanwhile, the Charging Dock is made here in Sweden, and the Lighthouse Base Station comes from Taiwan. That means our production is now spread over four different locations!

We still produce in China as well—that’s where our newest deck will come from, for example. The plan is to gradually add more products made in Vietnam, spreading production across locations, reducing risk, and keeping things running smoothly for both us and you. Over time, this will make it easier to maintain stock, respond quickly to demand, and give you a smoother experience no matter where you are in the world.

We also want to make it visible where your products come from when you shop with us. In May, we updated the store to clearly display the country of origin for each item. You can now find this information at the bottom of every product page, so you always know where the item in your cart is being made. For many of you, this small detail helps with planning ahead and makes it easier to estimate any extra costs from international shipping.

It’s been a while since our last update on what started as the High powered LED deck prototype. We have finally had time to push this project forward and are aiming to have a release at the beginning of 2026.

A New Name and a Familiar Design

You might notice that the deck has a new name, something simpler and a bit catchier: the Color LED deck (bottom-mounted and top-mounted). The overall design and specs, however, remain very similar to the original concept:

  • Using a highly efficient, high-powered LED for maximum brightness
  • DC/DC driving circuitry for improved efficiency and consistent performance
  • A light diffuser for smooth, even illumination and wide visibility
  • Two versions, top or bottom mounted, depending on your build

The Color LED Deck brings fully programmable lighting to your Crazyflie, allowing you to create and control custom light patterns in real time. It’s useful for flying in darker environments, for visual tracking experiments, or for adding synchronized light effects in drone choreography. The deck is now also compatible with the Crazyflie 2.1 Brushless, bringing dynamic lighting to our most recent platform for the first time.

Software architecture

This deck will also be the first to use the new DeckCtrl architecture. If you’re curious about how that works, you can read more about it in this earlier blog post.

The Color LED deck has some intelligence built in, running on an STM32C0 MCU. The open-source firmware is still under development, and the repository can be found here.

Availability

The final pricing is still being determined, but make sure to sign up for the in-stock notification at the Color LED deck store pages (bottom-mounted and top-mounted) to get an update as soon as it’s available. And as always, keep an eye on the blog for more updates as we get closer to release.

When flying the Crazyflie Brushless, you may have noticed something familiar: as the battery drains, the drone becomes less responsive and cannot generate the same amount of thrust it had at the start of the flight. This is because as the state of charge drops, the battery voltage decreases, and that directly affects the thrust output of the motors.

We wanted to fix this. In this post, we’ll explain why the old compensation method wasn’t ideal, how we used system identification to design a new battery compensation scheme, and how this improves thrust consistency across the entire battery range.

Motivation

The key problem is simple: a dropping battery voltage means dropping thrust for the same command. This leads to flights that start crisp and responsive but degrade as the battery drains.

Our goal was to make sure that, regardless of the state of charge, the actual thrust stays close to the commanded thrust. For manual flight this behavior may not always be preferred, so there will be an option to turn it off.

To illustrate the setup, here is a schematic of how the battery, PWM switching, and motor interact, effectively behaving like a simple buck converter:

This means the motor voltage can be computed by:
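Treating the PWM stage as an ideal buck converter, the relation is simply the duty cycle times the battery voltage (a sketch of the assumed model; the firmware's exact formulation may differ):

$$
V_\text{motor} = d \cdot V_\text{bat}, \qquad d \in [0, 1]
$$

where $d$ is the PWM duty cycle applied to the motor.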

System Identification

To design a proper compensation, we first needed to understand how thrust relates to motor voltage. This meant running a series of experiments on the thrust stand introduced in this earlier blog post.

The first step was calibrating the load cell used to measure thrust:

Once calibrated, we measured the thrust at different applied motor voltages.

As expected, the thrust can be modeled well by a third-order polynomial in motor voltage.

Mathematically, the relationship comes from two simple facts:

  • A DC motor's torque is proportional to the applied voltage and decreases with motor speed (due to back-EMF).
  • A propeller’s thrust scales approximately with the square of the rotational speed.

Combining these effects leads to a nonlinear (third-order) relation between motor voltage and thrust.
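The cubic relation can be recovered directly from thrust-stand data with an ordinary polynomial fit. Below is a minimal sketch using NumPy; the voltage and thrust samples are illustrative placeholders, not the actual Crazyflie measurements:

```python
import numpy as np

# Hypothetical thrust-stand samples: motor voltage (V) vs. measured thrust (N).
# These numbers are made up for illustration only.
voltages = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
thrusts = np.array([0.02, 0.05, 0.10, 0.17, 0.27, 0.40, 0.57])

# Fit the third-order model T(V) = a3*V^3 + a2*V^2 + a1*V + a0.
coeffs = np.polyfit(voltages, thrusts, deg=3)
thrust_model = np.poly1d(coeffs)

# Predicted thrust at an arbitrary motor voltage:
print(f"thrust at 3.2 V: {thrust_model(3.2):.3f} N")
```

With real thrust-stand data, the same fit yields the coefficients used by the compensation scheme.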

Battery Compensation

The main idea is straightforward: instead of assuming the battery voltage is constant, we explicitly account for it. We measure the battery voltage and low-pass filter it to reduce noise. Together with the required motor voltage from the curve above, we can solve the equation above for the PWM duty cycle to apply:
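Under the ideal-switching assumption, inverting the duty-cycle relation gives (a sketch; the firmware additionally clamps and scales this to the PWM resolution):

$$
d = \frac{V_\text{motor,desired}}{V_\text{bat,filtered}}
$$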

This corrected motor voltage is then fed into our third-order model to compensate the thrust command. With this compensation, the commanded-to-actual thrust relation is now approximately linear, which is exactly what we want. We can verify this by applying thrust commands and comparing them to the actual thrust.
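The whole loop can be sketched in a few lines: filter the battery voltage, invert the cubic thrust model for the required motor voltage, and divide by the filtered battery voltage to get the duty cycle. All constants below (polynomial coefficients, filter gain, initial voltage) are hypothetical, not the identified Crazyflie parameters:

```python
import numpy as np

# Hypothetical cubic thrust model T(V) = a3*V^3 + a2*V^2 + a1*V + a0.
a3, a2, a1, a0 = 0.012, 0.005, 0.002, 0.0

def motor_voltage_for_thrust(thrust_cmd):
    """Invert the cubic model for the motor voltage that yields thrust_cmd."""
    roots = np.roots([a3, a2, a1, a0 - thrust_cmd])
    # Keep the real, positive root (the physically meaningful solution).
    real = roots[np.abs(roots.imag) < 1e-9].real
    return float(real[real > 0].min())

class BatteryCompensator:
    def __init__(self, alpha=0.1):
        self.alpha = alpha          # low-pass filter gain
        self.v_bat_filtered = 4.2   # start at full-battery voltage

    def update(self, thrust_cmd, v_bat_raw):
        # Low-pass filter the noisy battery voltage measurement.
        self.v_bat_filtered += self.alpha * (v_bat_raw - self.v_bat_filtered)
        v_motor = motor_voltage_for_thrust(thrust_cmd)
        # Duty cycle: fraction of the battery voltage applied to the motor.
        return min(v_motor / self.v_bat_filtered, 1.0)
```

With this in place, the same thrust command produces a higher duty cycle on a drained battery than on a full one, keeping the actual thrust roughly constant.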

Dynamic Behavior

To obtain a complete parameter set of the motors and propellers, we also performed dynamic tests: commanding rapid increases and decreases in PWM and measuring the thrust response.

These dynamics are not required for the battery compensation itself, but they are very useful for accurate system identification and for simulation purposes.

Discussion and Conclusion

The new compensation method (#PR1526) ensures that thrust is consistent across the full range of battery charge. Compared to the old approach, it is both simpler to understand and more faithful to the actual physics of the motor–propeller system. The result is flights that feel the same at the end of the battery as at the beginning.

Beyond just improving flight performance, the system identification work also provides us with a full parameter set for the Crazyflie. We are already using these parameters in Crazyflow, our new simulation tool that models the Crazyflie dynamics with high fidelity. If you’re interested in simulation or in testing new control strategies virtually, check it out!

We’re excited to hear feedback from the community and to see what you do with this new capability.

We’re happy to announce that release 2025.09 is now available. This update includes Python 3.13 support and improved battery voltage compensation, along with enhanced trajectory control capabilities. The release also features substantial stability and architectural work that lays the groundwork for upcoming features and products. Thanks to our community contributors for their valuable additions to this release.

Major changes

Python 3.13 support
Added support for Python 3.13, with preparations already in place to ensure faster compatibility when Python 3.14 releases.

Improved battery voltage compensation
Enhanced voltage compensation for Crazyflie 2.1, 2.1+, and 2.1 with thrust upgrade kit provides more consistent flight performance across battery discharge cycles. Crazyflie Brushless battery voltage compensation coming soon. Thanks to @Rather1337 for this contribution.

Manual flight setpoint packet
New generic commander packet type for manual flight that allows dynamic switching between Angle and Rate modes for roll and pitch without re-flashing the firmware.

Fully relative trajectories
Trajectories can now be initiated relative to the drone’s current yaw, not just position. This enables fully relative trajectory execution where subsequent trajectories maintain the drone’s current orientation. Thanks to @johnnwallace for this contribution.

Modular deck discovery architecture
Replaced OneWire-only deck discovery with a modular backend system that supports multiple discovery protocols. This enables future deck communication methods while maintaining full backward compatibility with existing decks.

Release overview

crazyflie-firmware release 2025.09 GitHub

cfclient (crazyflie-clients-python) release 2025.9 GitHub, PyPi

cflib (crazyflie-lib-python) release 0.1.29 GitHub, PyPi