Category: Crazyflie

During my first Fun Friday as a Bitcraze intern in 2021, I discovered the musical note definitions in the Crazyflie firmware and thought about creating a musical performance using the Crazyflie’s motors, but never followed through.

A few weeks ago I decided to finally take it on as a Fun Friday Project with the slightly more ambitious goal of playing music across several Crazyflies at once.

crazyflie-jukebox takes a MIDI file, preprocesses it into motor frequency events, then uploads and plays the song by spinning the motors accordingly. Each Crazyflie contributes 4 voices (one per motor), so polyphony scales directly with your drone count.

I implemented this as a firmware app and a Python script using the work-in-progress cflib2 (which runs on a Rust library back-end). You can find the repository here; try it for yourself! Be aware that certain note combinations can cause the Crazyflie to move, flip, or take off unexpectedly.

Fitting music into 4 motors

The pipeline starts by parsing the MIDI file with mido. From there, an interactive track selection step shows you the instrument names, note counts, and ranges for each track so you can pick exactly which ones to include. The selected notes are then converted from MIDI note numbers into Hz frequencies that the motors can work with.

Each Crazyflie can only play 4 simultaneous voices (one per motor), so there’s some work involved in squeezing music into that constraint. I implemented a couple of different voice allocation strategies: melodic priority, which keeps the bass and melody prioritized; voice stealing, which works like an LRU synth; and a simple round robin, which just assigns each new note to the next motor in turn, cutting off whatever was playing there. There’s also a frequency range problem to deal with: motors only reliably produce pitches in roughly the C4–B7 range, so notes outside that window get octave-shifted to fit.
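The note-to-frequency step is standard MIDI math. A minimal sketch (function names here are illustrative, not the actual crazyflie-jukebox code):

```python
def midi_to_hz(note: int) -> float:
    """Standard MIDI tuning: A4 (note 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def clamp_to_motor_range(note: int, low: int = 60, high: int = 107) -> int:
    """Octave-shift a MIDI note into the playable C4 (60)..B7 (107) window."""
    while note < low:
        note += 12
    while note > high:
        note -= 12
    return note
```

For example, middle C (note 60) maps to roughly 262 Hz, and a low C2 (note 36) gets shifted up two octaves to C4 before playback.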

Upload protocol

Events are packed into compact 6-byte structs containing a delta timestamp, motor index, an on/off flag, and the target frequency. These are streamed to the firmware app over the app channel in a simple START; events; END sequence. The firmware buffers up to 5000 events, which effectively caps the length and complexity of what you can play. That limit was an arbitrary choice; you could probably get away with more, but it was enough for most songs I threw at it.
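One plausible layout for such a 6-byte event is shown below; the actual field widths and ordering in crazyflie-jukebox may differ, so treat this as an illustration of the packing idea rather than the real wire format:

```python
import struct

# Hypothetical 6-byte event layout (little-endian):
# uint16 delta time (ms), uint8 motor index, uint8 on/off flag, uint16 frequency (Hz)
EVENT_FMT = "<HBBH"

def pack_event(delta_ms: int, motor: int, on: bool, freq_hz: int) -> bytes:
    return struct.pack(EVENT_FMT, delta_ms, motor, int(on), freq_hz)
```

The fixed-size format makes it trivial for the firmware to slice the incoming app-channel stream into events.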

Synchronization

One of the trickier elements of this project was keeping Crazyflies synchronized. For starting in sync, I didn’t do anything special: no broadcast, no t-minus countdown, just sending start commands to each drone in sequence and relying on cflib2 to do it fast enough that the delay is negligible. That said, I’ve only tested with a small number of Crazyflies. With a larger fleet you’d probably need to implement something for the initial sync.

The real challenge is drift over time. The STM32’s crystal is rated at around 0.1% tolerance. That sounds tiny, but in the worst case (two crystals off by 0.1% in opposite directions), a 1-minute song already accumulates ~120 ms of drift between two drones. In a musical context, humans start noticing timing offsets around 20–30 ms; less for percussive sounds, and less for trained musicians. So left uncorrected, drift would become very audible well before the song ends.

To fix this, all clocks are reset to zero at song start. The host then periodically sends resync packets containing its own timestamp in microseconds, and each Crazyflie applies an offset correction to stay aligned, which as a bonus also irons out any initial start latency.
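The correction itself boils down to tracking a single offset per drone. A minimal sketch of the idea (the real firmware does this in C, with its own names):

```python
class SyncedClock:
    """Tracks an offset so a local clock follows the host clock."""

    def __init__(self):
        self.offset_us = 0

    def on_resync(self, host_time_us: int, local_time_us: int):
        # On each resync packet, recompute the offset so that
        # local_time + offset matches the host's timestamp.
        self.offset_us = host_time_us - local_time_us

    def now(self, local_time_us: int) -> int:
        return local_time_us + self.offset_us
```

Because the offset is recomputed on every resync packet, any start latency or accumulated crystal drift is absorbed into the next correction.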

Rough edges

The biggest design constraint is that a single track can’t be split across Crazyflies, so if a track has more than 4 simultaneous voices, some get dropped. I thought of each Crazyflie as its own instrument, which made sense at the time, but it means a dense MIDI track can’t be spread over multiple drones, which feels limiting in hindsight.

The usable pitch range is about 4 octaves (C4–B7), and propellers need to be attached for accurate pitch since the motors need load to produce the right frequencies, which makes the whole thing a bit unsafe. Certain note combinations can cause a drone to move, flip, or behave unpredictably. Only brushed motors are supported, and there’s a hard 71-minute per-song limit on clock sync. But honestly, if you’re sitting there listening to a 71-minute song on your Crazyflie, the clock drift is the least of your problems.

Check out crazyflie-jukebox on GitHub

This week we wanted to reflect on the progress made lately in the Crazyflie ecosystem, progress that will lead to bigger and better Crazyflie swarms.

Radio communication

As pointed out in the last blog post about Building a Crazyflie Flower Swarm with Rust, the new Rust Crazyflie library together with the new Crazyradio 2.0 has improved connection time and link efficiency by quite a bit.

It is now possible to connect a swarm of several dozen Crazyflies in seconds using a single radio and then fly them while still getting position telemetry. Putting so many Crazyflies on one radio does limit the maximum bandwidth per Crazyflie, but it now works in a stable way!

Color LED deck

The recently released Color LED deck is a great addition to the ecosystem’s move toward swarms. Its predecessor, the LED-ring deck, has been used a lot by researchers to indicate the state of individual Crazyflies in a swarm. The Color LED deck improves on that with a diffuser that makes the color visible from the side, which makes it much easier to mark the states of large groups of Crazyflies.

As a bonus, the Color LED deck is very usable in other fields like art and shows, since it is much more visible and can be used to fly Crazyflies as “Flying Pixels”.

Autonomous landing and charging

Last year we released the Crazyflie 2.1 Brushless charging dock, a productized version of an idea we had been using with the Crazyflie 2.1 and the Qi deck for years at fairs and conferences. It allows Crazyflies to autonomously land and charge. It is not only great for autonomous drone demos and shows but also a great waiting spot for swarms in research: the charging dock keeps the swarm charged, so that when it is time to take off, all the individuals start with the same battery level.

Future endeavors

On the radio side there are still areas where communication stability can be greatly improved. For example, we are working on a channel-hopping communication protocol that should make the connection mostly immune to regular interference on 2.4 GHz.

We are also working on improving other parts of swarm management. This includes, for example, solving the problem of flashing a full swarm of Crazyflies with the same firmware: we may be able to make more use of broadcast messages to drastically speed up the process, instead of flashing the Crazyflies one by one.

Overall, working on bigger swarms allows us to work on the full stack and to make the Crazyflie a better drone for everybody.

With spring just around the corner, we thought it was the perfect excuse to make our Crazyflies bloom. The result is a small swarm demo where each drone flies a 3D flower trajectory, all coordinated from a single Crazyradio 2.0. This blog post walks through how it works and highlights two things that made it possible: the new Color LED deck and the Crazyflie Rust library.

The Color LED deck

There are two Color LED decks for the Crazyflie – one mounted on top and one on the bottom – each with its own individually controllable LED via the standard parameter interface. In this demo we use the one mounted on the bottom to give color to the flowers, along with the Lighthouse deck for accurate positioning in space.

The deck opens up a lot of creative possibilities for swarm demos as well as clear visual feedback about what each drone is doing.

Fast swarm connections with the Crazyflie Rust library

Getting five drones connected quickly on a single Crazyradio used to be a real bottleneck. The Crazyflie Rust library introduces a lazy-loading parameter system. Parameter values are not downloaded at connect time; instead, they are fetched only if the API user explicitly accesses them.

Additionally, caching the TOC (Table of Contents) makes it trivial to persist it locally and reuse it on every subsequent connection. In practice this means that after connecting to each drone once, all future connections are nearly instantaneous. The cache is keyed by the TOC’s CRC32 checksum, so it automatically stays valid as long as the firmware doesn’t change, and it’s identical between drones with the same checksum.
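The caching idea is simple enough to sketch. A hypothetical host-side version (file layout and function names are made up; the Rust library has its own):

```python
import json
import pathlib

def toc_cache_path(cache_dir: pathlib.Path, crc32: int) -> pathlib.Path:
    # One file per distinct TOC, keyed by the firmware-reported CRC32.
    return cache_dir / f"{crc32:08x}.json"

def load_toc(cache_dir: pathlib.Path, crc32: int):
    path = toc_cache_path(cache_dir, crc32)
    if path.exists():
        return json.loads(path.read_text())  # cache hit: skip the radio download
    return None  # cache miss: the TOC must be fetched over the radio

def store_toc(cache_dir: pathlib.Path, crc32: int, toc: dict):
    cache_dir.mkdir(parents=True, exist_ok=True)
    toc_cache_path(cache_dir, crc32).write_text(json.dumps(toc))
```

Since every drone running the same firmware reports the same CRC32, one cached download serves the whole swarm.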

The library also uses Tokio’s async runtime, which means all Crazyflie connections start at the same time without waiting for each other. Combined with the generally higher communication performance of the Rust implementation, these features significantly reduce startup overhead and make the swarm feel reliable and responsive, something that would require much more effort with the current Python library.

Generating the trajectories

The flower shapes are generated in Python using this script. It produces two .json files per drone (one stem{n} and one petals{n}) containing all the waypoints to fly through. The trajectories are then uploaded to the drone as compressed poly4d segments, a compact format that the Crazyflie’s onboard high-level commander can execute autonomously. Both trajectories are expressed relative to each drone’s initial position, so the formation geometry is entirely determined by where you place the drones on the ground before takeoff.
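One simple way to generate petal-like waypoints is a rose curve (r = cos(k·θ)); the actual generation script may use different math, so consider this an illustrative sketch:

```python
import math

def petal_waypoints(k: int = 5, radius: float = 0.5, z: float = 1.0, n: int = 200):
    """Waypoints (x, y, z) tracing a k-petal rose, relative to the takeoff position."""
    pts = []
    for i in range(n):
        theta = 2 * math.pi * i / n
        r = radius * math.cos(k * theta)  # rose curve: radius oscillates k times
        pts.append((r * math.cos(theta), r * math.sin(theta), z))
    return pts
```

Because the points are expressed relative to the drone's start position, shifting a drone on the ground shifts its whole flower without regenerating anything.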

Putting them all together

The flight sequence is pretty straightforward:

1. Build the trajectories as waypoints on the host.

2. Connect to all drones simultaneously.

3. Upload each drone’s compressed trajectories in parallel.

4. Fly the trajectories while switching the LED colors.

Everything after the connection is driven by Tokio’s join_all, so the swarm stays in sync without any explicit synchronization logic – the drones are just given the same commands at the same time.
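In Python terms, the join_all pattern corresponds to asyncio.gather: every per-drone coroutine is started at once and awaited together. A toy sketch (the `send_command` coroutine and URIs are stand-ins, not the real library API):

```python
import asyncio

async def send_command(uri: str, cmd: str) -> str:
    # In reality this would be a radio round-trip; here it just yields control.
    await asyncio.sleep(0)
    return f"{uri}:{cmd}"

async def command_swarm(uris, cmd):
    # All coroutines run concurrently; results come back in input order.
    return await asyncio.gather(*(send_command(u, cmd) for u in uris))

results = asyncio.run(command_swarm(["radio://0/80/1", "radio://0/80/2"], "takeoff"))
```

Since all drones receive the same command at effectively the same time, no extra synchronization protocol is needed for a small swarm.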

The full source code is available at this repository (Python for trajectory generation, Rust for flying).

We’re excited about where the Rust library is heading. It improves communication with the Crazyflie and lets us dramatically increase the number of Crazyflies per Crazyradio, leading to bigger and more reliable swarms. If you build something cool with it, let us know!

The Crazyflie™ Color LED deck is a high-powered, fully programmable RGB(W) lighting expansion for the Crazyflie 2.x platform, and it’s now available in the shop.

It delivers bright, diffused, and uniform light suitable for research, teaching, vision experiments, and indoor drone choreography. The deck mounts on top or bottom of the Crazyflie and integrates seamlessly through our open-source firmware and I²C-based deck interface.

Two Versions, Same Electronics

The Color LED deck is available in two distinct versions, each sold as a separate product.

Top-Mounted Color LED Deck

Designed to be placed on top of the Crazyflie. Ideal for scenarios where the drone is viewed from above, or when you need to use positioning or sensor decks underneath the Crazyflie, such as the Flow deck or other bottom-mounted modules.

  • Works well with motion capture (MoCap) systems using ceiling-mounted cameras.
  • Not recommended with the Lighthouse positioning deck, due to optical occlusion and interference with the lighthouse sensors.

You can find the top-mounted version in the store here.

Bottom-Mounted Color LED Deck

This version is suitable for almost all use cases, offering maximum visibility from below and minimal interference with other decks.

  • Ideal for Lighthouse positioning (no optical obstruction).
  • Ideal for MoCap positioning, especially when cameras view from multiple angles.

You can find the bottom-mounted version in the store here.

Dual Mounting for Maximum Visibility

Additionally, two Color LED decks can be mounted both above and underneath the Crazyflie simultaneously, creating a strong, uniform light signature visible from all directions. This is best for MoCap environments where multi-angle visibility improves marker/camera performance.

You can see all three variants of the Color LED deck in action in our latest Christmas video, created in collaboration with the Learning Systems and Robotics Lab:

Each Color LED deck variant operates independently, allowing the top and bottom decks to be configured with different colors if desired. While all variants share the same electronics, diffuser, and firmware behavior, their physical mounting positions let you choose the setup that best fits your lab, show environment, or positioning needs.

Diffuser now available

While designing the Color LED deck, we also created a light diffuser, now available in the shop as a standalone product. It is designed to be compatible with the LED ring, spreading and softening the light to extend visibility and improve appearance.

You can find the light diffuser in the store here.

The Color LED deck is available now in both versions. Head to the store to order, or contact us for a quote!

Today, we welcome our first blog post by Maurice Zemp. Stay tuned for more of his adventures later!

When I started working on my Matura thesis (a mandatory project in Swiss high school), I wanted to create something that went beyond a purely theoretical project. I was fascinated by the idea of combining cutting-edge technology with a very tangible and exciting challenge: making a small drone fly through a racing course, composed of small gates, completely on its own and as fast as possible. Inspired by the work under Prof. Davide Scaramuzza, I set out to find new alternatives or improvements while developing my approach from scratch.

What may sound simple at first quickly turns into a highly complex task. A drone needs to perceive its environment, process information in real time, and decide on precise actions within fractions of a second, all without human intervention. Professional drone racing pilots train for years to master this level of control. My goal was to see whether artificial intelligence could achieve something similar, using reinforcement learning as the core technique.

But there was another challenge: I wanted to design a system that wasn’t only powerful, but also affordable and reproducible. Many research institutions use equipment worth tens of thousands of francs for projects like these. I asked myself: Could I build something comparable with a fraction of the budget, and still push the boundaries of what’s possible?

That question became the driving force behind my project, which later brought me all the way to the finals of Schweizer Jugend forscht (SJf for short) and, as the main prize, earned me a place representing Switzerland at the world’s biggest youth science competition, ISEF 2026. Over the following sections, I’ll share how I built my system step by step, what it was like to present it at the competition, and how it felt when all the effort finally paid off.

Fig. 1: Me with the Crazyflie 2.1 Brushless in the halls of ETH Zurich, where the main event of SJf took place.

Motivation and Objectives

The project I presented at Schweizer Jugend forscht was the result of my Matura thesis, in which I set out to combine my interests in drones, programming, and artificial intelligence. My goal was to develop a complete system for autonomous drone racing, based on Reinforcement Learning (RL), that would not only work in simulation but could also be transferred to real-world conditions.

To achieve this, I focused on three key aspects:

  1. Building a highly efficient simulation environment for training a reinforcement learning agent.
  2. Developing a cost-effective motion-capture system (MoCap) capable of tracking a drone’s position and orientation in real time with high precision.
  3. Integrating both systems in a way that would allow seamless transfer from simulation to real-world experiments with minimal latency.

This combination made the project unique: instead of relying on expensive commercial hardware, I set out to create a solution that would be precise and affordable while still advancing the state of the art in drone racing.

Simulation Environment

The simulation was implemented in Python, using Stable Baselines, Gymnasium (the successor to OpenAI Gym), and NumPy, accelerated with Numba for performance. At its core, the system employed Proximal Policy Optimization (PPO), a state-of-the-art reinforcement learning algorithm known for stability and efficiency.

Unlike general-purpose simulators such as Gazebo, my environment was designed specifically for drone racing. It could process tens of thousands of interactions per second, enabling a training run of a few dozen minutes to correspond to nearly a year of simulated flight time.

Key features included:

  • A physics-based flight dynamics model accounting for thrust, drag and gyroscopic effects.
  • A carefully engineered reward function balancing speed, precision and avoiding shortcuts.
  • A flexible design that allowed different gate sequences and drone parameters to be tested.
  • A multi-agent environment running on multiple threads, greatly reducing the required training time.
Fig. 2: Path of the drone after two hours of training on a given track

With this setup, an RL agent could learn to complete arbitrary racing tracks in near-optimal time¹ after only a few hours of training. In simulation, top speeds of up to 100 km/h were achieved, though these exceeded the physical limits of the real drone and were generated with modified drone parameters.
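To give a flavor of the kind of dynamics such a purpose-built simulator steps through, here is a toy vertical-axis update with gravity and quadratic drag; the constants and structure are illustrative, not the thesis implementation:

```python
G = 9.81  # gravitational acceleration, m/s^2

def step(vz: float, thrust_acc: float, drag_coeff: float, dt: float) -> float:
    """Advance vertical velocity one timestep under thrust, gravity and drag."""
    drag = drag_coeff * vz * abs(vz)  # quadratic drag, opposing motion in either direction
    return vz + (thrust_acc - G - drag) * dt
```

Because each step is just a handful of arithmetic operations (here, and in the Numba-compiled environment), tens of thousands of interactions per second are within reach.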

Motion-Capture-System

A second cornerstone of the project was the development of a low-cost motion-capture system. Instead of relying on high-end solutions such as VICON or OptiTrack (which can cost tens of thousands of Swiss francs), I built a custom setup. The drone – a Crazyflie 2.1 Nanocopter (later a Crazyflie 2.1 Brushless) – was fitted with infrared diodes. With four cameras capturing at 120 frames per second, the drone’s position was calculated in real time through triangulation. By using three diodes arranged on the drone, I wanted to estimate not only the position but also the orientation. Unfortunately, because the cameras were budget models without high-end resolution, estimating the orientation was not feasible, so it was taken from the onboard IMU instead.
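The core of such a system is linear triangulation. A minimal two-view direct linear transform (DLT) sketch, with illustrative camera matrices rather than the thesis's actual calibration:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from its pixel coordinates in two calibrated cameras.

    P1, P2: 3x4 projection matrices; uv1, uv2: (u, v) pixel coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The solution is the null space of A, taken from the last row of V^T.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

With four cameras, the same construction simply stacks two rows per camera, and the least-squares solution averages out per-camera noise.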

Fig. 3: Motion-Capture-System Concept

System Integration

For integration, I relied on the ROS2 middleware and the Crazyswarm2 framework, which allowed the simulation and MoCap data to be processed together with minimal latency. This setup ensured that a policy trained in simulation could be executed on the real drone almost seamlessly.

Fig. 4: Sim2Real Integration Concept

Results & Discussion

The results demonstrated the effectiveness of the combined system:

  • In simulation, the RL agent completed tracks in nearly optimal time¹, demonstrating robust generalization across different gate sequences.
  • The motion-capture system delivered millimeter-level accuracy and reliable tracking in real time despite its simplicity. (see Fig. 5)
  • In real-world tests, the Crazyflie drone successfully completed tight gate sequences, even at speeds of up to 25 km/h, with a positional deviation of only 5–12 cm compared to the simulated trajectories. (see Fig. 6)

Given that the gates were roughly A3-sized openings (38 × 29 cm), the precision was sufficient to consistently hit every gate, confirming the feasibility of transferring simulated training into real-world racing. (Video)

Fig. 5: Boxplot of the deviation for collected points on a plane in millimeters
Fig. 6: Tracked Real Life Path on a given track
Fig. 7: Realisation in my school’s physics laboratory

The system demonstrated that RL-based controllers can generalize effectively from simulation to reality, even with imperfect models. This robustness shows that RL is promising for real-world robotics applications where exact physical modeling is difficult or costly. To further investigate this robustness, I conducted ablation studies in simulation focusing on aerodynamic drag, as well as analyses of the MoCap system’s accuracy in measuring dynamic motion.

  • For aerodynamic drag, I introduced randomness during evaluation that deviated significantly from what the algorithm had been trained on. To achieve this, I introduced a parameter k that was multiplied with the correct aerodynamic drag to see how these differences would impact the flight. Fig. 8 shows the deviation from the real path over the course of a flight. Since more drag results in lower velocities, the two paths had to be aligned using the nearest pairs of points. The results are impressive: even with a change as large as k = 10, the flight still performed reasonably, although the gates probably wouldn’t have been hit anymore.
  • For the MoCap system, I used a mathematically representable motion: a pendulum. I then performed a parameter optimization on this mathematical model and measured the deviation as a function of speed. There is a clear correlation between speed and deviation (see Fig. 9), most likely caused by the lack of camera calibration.
Fig. 8: Results of ablation studies on aerodynamic drag
Fig. 9: Experiment to confirm correlation between deviation and speed

Conclusion and Reflection

This work demonstrates that RL, combined with accessible hardware, enables precise and robust autonomous flight in dynamic indoor environments. The results underscore the potential of low-cost robotics solutions to democratize drone research. Even though it was a large project and there were some hard times, I enjoyed working on it a lot and believe that the result, and the memories made along the way, are even more rewarding than any prize money!

Acknowledgments

To end this guest post, I want to sincerely thank Bitcraze for their amazing work and support during the development of this project. They have genuinely built such an incredible testbed for research in autonomous drones, it’s amazing! Without them, this project wouldn’t have worked out the way it did!

  1. Near-optimal time here refers to the theoretical limit for a drone completing this track, given its parameters. To be clear, this is not fitting a polynomial with boundary conditions on its derivatives onto gate segments, as was done in earlier approaches in the 2010s. It was compared against multiple previous approaches by the UZH RPG, such as Optimal Control (OC). Nevertheless, this comparison could be done more extensively in the future, as time ran short in the end. ↩︎

Time flies! 2025 is already drawing to a close, and as has become a Bitcraze tradition, it’s time to pause for a moment and look back at what we’ve been up to during the year. As usual, it’s been a mix of new hardware, major software improvements, community adventures, and some changes within the team itself.

Hardware

2025 started with a bang, with the release of the Brushless! It was a big milestone for us, and it’s been incredibly exciting to see how quickly the community picked it up and started pushing it in all sorts of creative directions. From more demanding research setups to experimental control approaches, the Brushless has already proven to be a powerful new addition to the ecosystem. Alongside the Brushless, we also released a charging dock enabling the drone to charge autonomously between flights. This opened the door to longer-running experiments and led to the launch of our Infinite Flight bundle, making it easier than ever to explore continuous operation and autonomous behaviors.

Beyond new products, we spent much of the year on two major hardware-related projects. During the first part of the year, our focus was on expanding the Lighthouse system to support up to 16 base stations. During the second part, much of our effort shifted toward preparing the release of the Color LED deck, which we’re very much looking forward to seeing in the wild.

Fun Fridays also got some extra time in the spotlight this year. Aris produced a series of truly delightful projects that reminded us all why we love experimenting with flying robots in the first place. Have you seen the disco drone, or the claw?

Software

On the software side, 2025 brought some important structural improvements. The biggest change was the introduction of a new control deck architecture, which lays the groundwork for better identification and enumeration of decks going forward. This is one of those changes that may not look flashy on the surface, but it will make life easier for both users and developers in the long run.

We also made steady progress on the Rust lib, which has moved from a Fun Friday experiment to a fully fledged tool. It is now an integrated and supported part of our tooling, opening up new possibilities for users who want strong guarantees around safety and performance.

Another long-awaited update this year was an upgrade to the Crazyradio, improving its ability to handle large swarms of Crazyflie™ drones. This directly benefits anyone working with multi-drone setups and helps make swarm experiments more robust and scalable.

Community

It’s fair to say that 2025 was a busy year on the community and events front. We kicked things off, as usual, at FOSDEM, where we hosted a dev room for the first time! A big step for us and a great opportunity to connect with fellow open-source developers.

Later in the year, we made our first trip back to the US since the pandemic. This included sponsoring the ICUAS competition and hosting a booth at ICRA in Atlanta, both of which were fantastic opportunities to meet researchers and practitioners working with aerial robotics. We also presented a brand-new demo at HRI.

In September, KTH hosted a “Drone gymnasium”, giving students hands-on access to Crazyflies and encouraging them to explore new robotic experiences. Seeing what happens when students are given open tools and creative freedom is always inspiring, and this event was no exception.

2025 was also marked by continued and valuable community contributions. From improvements and bug fixes to features like battery compensation, these contributions play a huge role in shaping the platform and pushing it forward.

Team

Behind the scenes, Bitcraze itself continued to grow. This year brought both change and expansion within the team. Tove moved on to new adventures in Stockholm, and while we’ll miss her, we were also happy to welcome a record four new team members!

Aris transitioned from intern to full-time developer; Fredrik then joined as a growth and partnership guru, followed by Enya as an application engineer. Hampus was the last to join us, as an administrator. With these new faces, the team is larger and stronger than ever.

All in all, 2025 has once again been an exciting, intense, and rewarding year for Bitcraze. Thank you to everyone in the community who flew with us, built on our tools, reported issues, shared ideas, and showed us what’s possible with tiny flying robots. We can’t wait to see what 2026 brings.

Recent work from the Learning Systems and Robotics Lab explores a question many of us have implicitly wrestled with:

How do we design expressive, coordinated swarm behavior without hand-crafting every trajectory?

Their answer, presented in the paper “SwarmGPT”, is to use large language models not as low-level controllers, but as a high-level interface for swarm intent, and then rely on classical robotics methods to make that intent executable and safe.

So what is SwarmGPT?

SwarmGPT is not about letting an LLM “fly drones”. Instead, it introduces a clear separation of responsibilities:

  • Language models operate at the level of structure, timing, and choreography
  • Motion planning and control handle feasibility, collision avoidance, and dynamics
  • Execution remains deterministic and verifiable

In practice, this means a user can specify swarm behavior using natural language (or music-derived cues), and the system generates structured multi-drone trajectories that are then validated and executed using established robotics pipelines.

This distinction matters. The paper does not replace control theory, but it compresses the path from idea to experiment. For researchers, this has several implications:

  • Faster iteration on swarm concepts
  • Lower barrier to expressive multi-agent behavior
  • A clean interface between creative intent and physical constraints

Making Swarm Structure Visible With the Crazyflie Color LED Deck

The authors used Crazyflie drones as their research platform, and we sent them a handful of Crazyflie Color LED decks to experiment with.

The LED deck is not just decorative. It provides per-agent visual feedback, a clear indication of phase, grouping, or timing, and immediate insight into synchronization and coordination.

For research, this supports:

  • Real-time inspection of swarm state
  • Easier debugging of generated behavior
  • More legible demonstrations of complex coordination

For drone show–style applications, it enables:

  • Tight coupling between motion and light
  • Expressive patterns where choreography and illumination reinforce each other
  • Rapid iteration on visual concepts without custom hardware

The same capability serves both domains, which is part of its appeal.

Happy Robotic Holidays!

In our annual holiday video, nine Crazyflies take off from the Bitcraze HQ in Malmö. After briefly hovering to demonstrate the three mounting options of the Color LED Deck, they make their way to the Learning Systems and Robotics Lab in Munich.

There, they perform a SwarmGPT-generated choreography before slowly landing in a sparkling snowflake pattern.

When sandwich-mounted, the top- and bottom-mounted Color LED Decks can display independent colors and light patterns, enabling richer visual expressions and more nuanced feedback.

Happy Robotic Holidays from all of us!

Learn More

📄 SwarmGPT paper (Learning Systems and Robotics Lab)
https://ieeexplore.ieee.org/document/11197931

👉 SwarmGPT project website:
https://utiasdsl.github.io/swarm_GPT/

💡 Crazyflie Color LED Deck product page
https://www.bitcraze.io/products/color-led-deck/

🛒 Bitcraze store (Color LED Deck – available early 2026)
Crazyflie Color LED Deck – Top Mounted
Crazyflie Color LED Deck – Bottom Mounted

Christmas is getting close, and while most people are just starting to hang lights and decorate their tree, we decided to go a little bigger and a whole lot brighter. Instead of adding tinsel and ornaments, we set up a swarm of 8 Crazyflie 2.1 Brushless drones with the upcoming Color LED decks, added some long-exposure photography magic, and decorated our flying arena with a Christmas tree made of Crazyflies.

How it works

The project is split into two parts: the firmware side that controls the Color LED decks and the script that is responsible for the choreography of the swarm.

The Firmware

Instead of lighting the LEDs based on time or commands, each Crazyflie uses its 3D position to decide on the correct color, which simplifies communication with the central computer. Inside the firmware, multiple virtual spheres are defined in the flight arena, like ornaments floating in a tree-shaped structure. Whenever a Crazyflie flies into one of these spheres, its Color LED deck switches instantly from green to red; when it flies back out, it glows green again. Since we’re taking a long-exposure photo, the color pattern starts when the drones are ready to perform the choreography and stops when they begin landing.
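The position rule boils down to a point-in-sphere test. Sketched here in Python for clarity (the actual firmware is C, and the sphere list and colors are illustrative):

```python
GREEN, RED = (0, 255, 0), (255, 0, 0)

def led_color(pos, spheres):
    """pos: (x, y, z) in meters; spheres: list of ((cx, cy, cz), radius)."""
    x, y, z = pos
    for (cx, cy, cz), r in spheres:
        # Squared-distance comparison avoids a square root on every check.
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r:
            return RED  # inside an "ornament" sphere
    return GREEN
```

Since each drone evaluates this locally from its own state estimate, no color commands ever need to cross the radio link during the flight.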

The Script

The Python script is pretty simple. It commands a swarm of Crazyflies to perform a coordinated spiral choreography resembling a Christmas-tree outline in 3D space. Each drone takes off to a different height, flies in spiraling circular layers, and changes radius as it rises and descends, forming the visual outline of a cone when viewed from outside. To pull this off with the current state of the cflib, we used three Crazyradio 2.0s and the high-level commander.
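An illustrative cone-spiral generator (not the exact demo script; parameters are made up): the radius shrinks with height, so the swept path traces a tree-shaped cone.

```python
import math

def cone_spiral(z0=0.3, z1=2.0, r_base=1.2, turns=3, n=120):
    """Waypoints spiraling from (z0, r_base) at the bottom to (z1, 0) at the top."""
    pts = []
    for i in range(n):
        t = i / (n - 1)                    # progress along the spiral, 0..1
        z = z0 + (z1 - z0) * t             # rise linearly
        r = r_base * (1 - t)               # narrower toward the top
        a = 2 * math.pi * turns * t        # angular position
        pts.append((r * math.cos(a), r * math.sin(a), z))
    return pts
```

Feeding each drone a height-offset copy of such a list through the high-level commander yields the layered cone seen in the long-exposure shot.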

A Testbed for Crazyradio 2.0 Improvement

Lately, we have been looking again at improving the radio communication with the Crazyflie. A prototype featuring a new USB communication mode for the Crazyradio was ready just in time for testing with the Christmas tree demo.

This new mode makes the Crazyradio 2.0 much more efficient when communicating with swarms. With it, we were able to fly the same demo using only one Crazyradio 2.0 instead of three, and the connection time to the swarm was greatly reduced.

The new mode is called “inline setting mode” because it inlines all radio settings with the packet data, removing the need for time-costly USB setup requests. It is currently a pull request to the Crazyradio 2.0 firmware and the Rust Crazyradio driver. Support for Crazyswarm/ROS and cflib will follow, and once we know the protocol works out for all libs, we will merge and release the new mode. It will be enabled by default, so you will get the benefits simply by upgrading the Crazyradio 2.0 firmware and lib. We will talk more about it when it is released; in the meantime, do not hesitate to test the PRs and give feedback ;-).
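To give a feel for the idea (the actual wire format lives in the PRs and may well differ), inlining the settings means each USB transfer could carry a small header with channel, data rate, and address ahead of the radio payload, instead of issuing a separate USB control request whenever a setting changes. A purely hypothetical layout:

```python
import struct

def pack_inline(channel, datarate, address, payload):
    """Hypothetical inline-setting packet: 1-byte channel, 1-byte data
    rate index, 5-byte radio address, then the radio payload.
    Illustration only — not the real Crazyradio 2.0 protocol."""
    assert 0 <= channel <= 125 and len(address) == 5
    header = struct.pack("<BB5s", channel, datarate, address)
    return header + payload
```

With a layout like this, a single bulk transfer per packet is enough even when every packet targets a different Crazyflie, which is exactly the swarm case where setup requests used to dominate.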

Demo source code

You can find the project’s repository, as well as the Rust version, on GitHub. The Python version was used for the picture and video, and the Rust one behaves identically.

It’s always a good feeling to wrap up the week with a Fun Friday project – especially when it involves some questionable mechanical additions to a Crazyflie platform. This time, I decided to test the capabilities of the upcoming Color LED deck by turning it into a Disco deck.

Mechanics

The core of the Disco Deck is pretty simple: a 3D-printed disco ball mounted directly on top of the Color LED Deck with a couple of screws. To bring it to life, I added a Sub-Micro Plastic Planetary Gearmotor and used a rubber band as a drive belt to transfer the rotation. It’s a lightweight, low-tech solution that works surprisingly well with the Crazyflie 2.1 Brushless. All the structural parts were designed to be easily 3D printed in PLA, and they fit on a single print plate for a quick build. You can find all the part files here.

Electronics & Firmware

On my first attempt, I connected the motor directly to VCC and GND, which meant it started spinning as soon as the Crazyflie powered up. This turned out to be a problem: the vibrations prevented the Crazyflie from completing its initialization sequence, which requires it to remain completely still for about one second at startup. The fix was to connect the motor to one of the GPIO pins (IO_4) together with GND instead. On the firmware side, I added a new deck driver that drives IO_4 low during initialization and exposes control of it through a parameter.

Next Steps

The biggest limitation of the current Disco Deck design is the landing. The disco ball extends below the Crazyflie 2.1 Brushless legs, which means the drone can’t take off or land horizontally – not even when using the standard Crazyflie 2.1 Brushless charging dock. To fix this, I’m planning to design a custom charging dock that also works as a stable landing platform for the party drone.

If you’re interested in the process, you can check out the project repository for any updates.

About a year ago I started working on a Crazyflie CLI (command line interface), with the goal of having a utility for quickly doing simple interactions with the Crazyflie from the command line. Initially I was mostly interested in the console, but realized it’s neat to also be able to check some logging variables and set some parameters. With the new release of the Crazyflie Rust lib, I’ve also updated cfcli to access the memory subsystem and made some other nifty improvements to make it easier to use.

Below is a full list of the updates in the release:

  • Improved user interface
    • Use interactive selection if command line parameters are missing (like what params to set or which memory to display)
    • Animation while connecting
  • Added access to memory subsystem
    • Raw access to memories
    • Initialize, erase and display decoded information from EEPROM and 1-wire memories
    • Configure the Crazyflie EEPROM (i.e. set channel, address, speed, etc.)
  • Set and get multiple parameters at once
  • Generate and flash DeckCtrl configurations via debug probe
  • Bumped all dependencies

If you would like to give it a try, just run cargo install cfcli. Have a look at the crates page or the GitHub repository for more information, and feel free to leave any suggestions or comments below. Here’s a short video showing the new interactive selection in use: