
UAV Recharging for Remote Applications

Unmanned aerial vehicles (UAVs) are invaluable in challenging remote applications, including coastal monitoring, surveillance for safe passage in icy waterways, and search-and-rescue missions. However, after deployment in a remote setting, the functional life of a multirotor is ultimately limited by its battery. Our ongoing research investigates how to cooperatively and autonomously land a multirotor on an uncrewed surface vessel (USV) for recharging. We address this problem in real time with safe control algorithms that we deploy on a Crazyflie.

Approach

Our approach enables the Crazyflie to cooperatively coordinate a safe landing with a simulated USV in severe wave conditions. Critically for the autonomy of the system, the agents do not know at the outset when or where they are going to land; they cooperate in real time to make these determinations. The novelty of this work lies in three primary contributions:

Learning a Spatial-Temporal Wave Model as a Gaussian Process

We first learn a local tilt model, capturing the spatial and temporal impact of waves on the tilt angle of a USV, using Gaussian process (GP) regression. Prior to executing the landing, the USV collects Nd noisy observations D = {q, t, φ²}, where q is the position of the USV, t is time, and φ is its tilt angle. GP regression then yields the spatial-temporal tilt model fw(q, t) = (φ(q, t))². The predicted tilt and its uncertainty at a query point a = [q, t], conditioned on the observed data D, can be inferred from the posterior distribution.
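
As a rough sketch of what this regression can look like in practice (not the authors’ implementation; the data are synthetic and scikit-learn is just one convenient GP library):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-ins for the Nd noisy observations D = {q, t, phi^2}:
# USV positions q = (x, y), sample times t, and squared tilt angles.
Nd = 200
q = rng.uniform(0.0, 4.0, size=(Nd, 2))            # positions (m)
t = rng.uniform(0.0, 10.0, size=(Nd, 1))           # times (s)
phi = 0.2 * np.exp(-q[:, :1]) * np.sin(2.0 * t)    # toy wave-induced tilt (rad)
phi_sq = (phi + 0.01 * rng.standard_normal((Nd, 1))) ** 2

# Anisotropic RBF kernel over the inputs a = [q, t], plus a noise term.
X = np.hstack([q, t])
kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=1e-4)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, phi_sq.ravel())

# Posterior mean and uncertainty of fw(q, t) at a query point.
mean, std = gp.predict([[2.0, 1.0, 5.0]], return_std=True)
print(mean[0], std[0])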

Distributed Model Predictive Control

Our proposed model predictive control (MPC) architecture combines standard tracking MPCs for the Crazyflie and the USV and augments them with additional artificial goal locations. These artificial goals enable the vehicles to coordinate without prior guidance. Each vehicle solves an individual optimization problem for both its artificial goal and an input that tracks it, but communicates only the former to the other vehicle. The MPC integrates the learned mean and uncertainty of the spatial-temporal wave model from the GP regression into the cost functions of both vehicles. This encourages the agents to converge toward calmer waters, enabling safer landings in variable wave conditions.
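
To make this structure concrete, here is an illustrative per-vehicle stage cost combining tracking of the vehicle’s own artificial goal, agreement with the other vehicle’s communicated goal, and the GP tilt terms. The function name and weights are assumptions for illustration (reusing the gp object from the sketch above), not the paper’s exact cost:

import numpy as np

def stage_cost(x, u, g, g_other, t, gp,
               w_track=1.0, w_coop=1.0, w_tilt=10.0, w_unc=5.0, w_u=0.1):
    track = w_track * np.sum((x[:2] - g) ** 2)   # track own artificial goal
    coop = w_coop * np.sum((g - g_other) ** 2)   # pull the two goals together
    mean, std = gp.predict([[x[0], x[1], t]], return_std=True)
    tilt = w_tilt * mean[0] + w_unc * std[0]     # avoid rough or uncertain water
    return track + coop + tilt + w_u * np.sum(u ** 2)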

Low-Cost USV Simulation Testbed Platform

To validate the proposed MPC scheme for landing on a USV, we simulate the spatial-temporal motion of a USV in waves of variable intensity using a custom tilting platform. The platform has two degrees of freedom (roll, pitch) and is affixed to the deck of a differential-drive unmanned ground vehicle (UGV), the ClearPath Robotics Husky. We selected a differential-drive UGV to replicate the motion of a broad range of USVs, including those with differential drive and conventional rudder steering. Our platform is low-cost, modular, and open source, enabling rapid testing and benchmarking of UAV-USV landing strategies indoors before moving to on-water testing, which is high-risk and expensive.

Block diagram of our proposed distributed model predictive control scheme.
Overview of the USV simulation testbed tilting platform for ground vehicles. Left: the platform is shown mounted on the deck of the ClearPath Robotics Husky with the Crazyflie 2.1 hovering above. Center: a closer view of the landing pad configuration and platform components. Right: Section view showing the ball-and-socket joint and the linkage mechanism.

Experimental Setup

Full system architecture.

We evaluate the proposed MPC scheme for UAV-USV cooperative landing in indoor experiments using the Crazyflie 2.1 and our tilting platform. These experiments represent a scaled-down version of real-world harsh wave conditions, with amplitude and frequency informed by a survey of waves along the coastlines of the Great Lakes of North America. We select a spatially decaying sine wave whose amplitude decreases gradually with increasing x-position. We expect the Crazyflie and the simulated USV to cooperatively select the safest landing location in space and time by learning the GP model and incorporating it into the MPC scheme.
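
For intuition, such a profile can be written as φ(x, t) = A(x) sin(2πft) with a decaying amplitude A(x) = A₀e^(−λx); this form is only illustrative, and the exact amplitudes and frequencies used in the experiments follow the Great Lakes survey.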

The UAV MPC runs off-board on a ThinkPad X1 Carbon with an Intel Core i7-1270P processor. The MPC runs at 50 Hz and transmits control inputs to the Crazyflie via a long-range Crazyradio USB dongle. The Raspberry Pi onboard the Husky runs its MPC at 10 Hz. The two vehicles communicate their goals using ROS topics on a local WiFi network and receive their own pose feedback at 240 Hz from a VICON motion capture system.

Experimental Results

Experiment 1. The tilt model is known a priori. Our proposed distributed MPC scheme locates a low-tilt landing location from all six initial platform positions.
Experiment 2. The wave tilt model is learned as a GP. The MPC scheme locates a low-tilt landing location from all three initial platform positions.

In the first set of experiments, we assume no uncertainty in the tilt model. We compare four MPC weighting strategies, ranging from a purely cooperative strategy in red (where neither vehicle weights wave tilt in the MPC, neglecting the spatial-temporal tilt motion) to our proposed strategy in blue (where both vehicles weight wave tilt highly in the MPC). In this set of experiments, our proposed approach (blue) reduces the tilt angle of the platform at landing by 68-89% and increases the landing success rate by 53% over the purely cooperative strategy (red).

In the second set of experiments, we learn the wave tilt model as a GP. We compare the purely cooperative strategy in red from before to our proposed strategy in blue, where we weight the posterior mean cost from the GP regression. In this set of experiments, our proposed approach (blue) reduces the tilt angle of the platform at landing by 23-32% and increases the landing success rate by 47% over pure cooperation (red).

Future Work

In the work presented above, we assume there is a local region with calm waves that both vehicles can reach to perform a safe landing. In practical scenarios, however, such spatial-temporal assumptions may not hold, for example when an emergency landing is necessary or when time and resources are constrained. Recently, we explored quadratic MPC strategies for landing a Crazyflie on the tilting platform in high-frequency, high-amplitude conditions. These strategies include optimization costs that weight the position, attitude, and altitude errors between the multirotor and the platform.

Though all strategies land successfully in low-frequency, low-amplitude conditions, their success rate is low in the higher-amplitude conditions. Designing a safe controller that is robust to a wide range of wave amplitudes and frequencies therefore remains an ongoing research area.

Poster presented at the workshop ‘25 Years of Aerial Robotics: Challenges and Opportunities’ at ICRA 2025.


Debugging the Crazyflie using the debug-adapter kit gives you direct access to the STM32 with a hardware debugger. It makes it possible to pause execution, step through code, and inspect what’s happening in real time.

Even if you’re working mostly at a higher level, having this kind of visibility can be a huge time-saver when something unexpected happens. It’s a tool I use frequently when tracking down firmware issues or verifying low-level behavior.

We already have documentation for debugging the STM32 on the Crazyflie using ST-LINK and VS Code, but there are still a few missing pieces: how to use J-Link, how to debug other platforms (like the Crazyflie 2.1 Brushless), and how enabling debug builds can help.

Debug Build

If you’re debugging and your breakpoints aren’t landing where you expect, stepping seems unpredictable, or variables aren’t visible, it’s probably because the firmware was built with compiler optimizations enabled. When you select a debug build, the build system disables optimization by setting the compiler flag -O0, which preserves the line-by-line correspondence between source code and machine instructions. This makes stepping and inspecting variables much more reliable.

To enable a debug build, you need to set the CONFIG_DEBUG option in your Kconfig configuration. You can do this, for example, by running make menuconfig in a terminal. Then navigate to Build and debug options and select Enable debug build. After changing Kconfig options, re-run make to rebuild the firmware with the new configuration.

Enabling debug build Kconfig option through menuconfig

VS Code Debug Configuration

With debug builds enabled, you’ll get more predictable stepping and reliable breakpoints. The next step is setting up your debugger. Below is a launch.json configuration for VS Code that supports both ST-Link and J-Link, and works with Crazyflie 2.x and the 2.1 Brushless.

If you’re using a J-Link probe, you’ll need to install the J-Link software from Segger.

{
    // VS Code debug launch configurations for Crazyflie 2.x and 2.1 Brushless using J-Link and ST-Link
    "version": "0.2.0",
    "configurations": [
        {
            // ST-LINK configuration for Crazyflie 2.x
            "name": "STLink CF21 Debug",
            "cwd": "${workspaceRoot}",
            "executable": "${workspaceRoot}/build/cf2.elf",
            "request": "launch",
            "type": "cortex-debug",
            "device": "STM32F405",
            "svdFile": "${workspaceRoot}/tools/debug/STM32F405.svd",
            "servertype": "openocd",
            "configFiles": ["interface/stlink-v2.cfg", "target/stm32f4x.cfg"],
            "runToEntryPoint": "main",
            "preLaunchCommands": [
                "set mem inaccessible-by-default off",
                "enable breakpoint",
                "monitor reset"
            ]
        },
        {
            // ST-LINK configuration for Crazyflie 2.1 Brushless
            "name": "STLink CF21BL Debug",
            "cwd": "${workspaceRoot}",
            "executable": "${workspaceRoot}/build/cf21bl.elf",
            "request": "launch",
            "type": "cortex-debug",
            "device": "STM32F405",
            "svdFile": "${workspaceRoot}/tools/debug/STM32F405.svd",
            "servertype": "openocd",
            "configFiles": ["interface/stlink-v2.cfg", "target/stm32f4x.cfg"],
            "runToEntryPoint": "main",
            "preLaunchCommands": [
                "set mem inaccessible-by-default off",
                "enable breakpoint",
                "monitor reset"
            ]
        },
        {
            // J-Link configuration for Crazyflie 2.x
            "name": "JLink CF2 Debug",
            "cwd": "${workspaceRoot}",
            "executable": "${workspaceRoot}/build/cf2.elf",
            "request": "launch",
            "type": "cortex-debug",
            "device": "STM32F405RG",
            "svdFile": "${workspaceRoot}/tools/debug/STM32F405.svd",
            "servertype": "jlink",
            "runToEntryPoint": "main",
            "preLaunchCommands": [
                "set mem inaccessible-by-default off",
                "enable breakpoint",
                "monitor reset"
            ]
        },
        {
            // J-Link configuration for Crazyflie 2.1 Brushless
            "name": "JLink CF21BL Debug",
            "cwd": "${workspaceRoot}",
            "executable": "${workspaceRoot}/build/cf21bl.elf",
            "request": "launch",
            "type": "cortex-debug",
            "device": "STM32F405RG",
            "svdFile": "${workspaceRoot}/tools/debug/STM32F405.svd",
            "servertype": "jlink",
            "runToEntryPoint": "main",
            "preLaunchCommands": [
                "set mem inaccessible-by-default off",
                "enable breakpoint",
                "monitor reset"
            ]
        }
    ]
}

To use this setup with a different platform, like the Flapper, just change the executable field to point to the correct .elf file. By default, starting a debug session in VS Code will erase and reflash the firmware, so make sure the firmware is built beforehand. If you need to attach to a running target without flashing, you’ll need to modify the launch.json to skip loading the binary.

A couple of weeks ago, we were at ICRA 2025 in Atlanta. This year’s ICRA drew over 7,000 attendees, making it the biggest edition yet. We had a booth at the exhibition where we showed our decentralized swarm demo. The setup included a mix of Crazyflie 2.1+ units with Qi charging decks and Crazyflie 2.1 Brushless platforms with our new charging dock. The entire swarm operated onboard, with two Lighthouse base stations for positioning. More details about the setup can be found in the recent swarm demo blog post.

8 Crazyflies flying simultaneously in our decentralized swarm at ICRA 2025

Some of the brushless drones carried our high-powered LED deck prototype to make the swarm more visible and engaging. One of the drones also had a prototype camera streaming deck, which held up well despite the busy wireless environment.

A Different Perspective

This year we were also invited to participate in a workshop: 25 Years of Aerial Robotics: Challenges and Opportunities, where I (Rik) gave a short presentation about the evolution of positioning in the Crazyflie, from webcam-based ArUco marker tracking to the systems we use today.

Usually, we spend most of our time on the exhibition floor, so being part of a workshop like this was a different experience. It was interesting to hear researchers mention the Crazyflie in their work without needing to explain what it is. That kind of familiarity isn’t something we take for granted, and it was nice to see.

The workshop also gave us a chance to talk with both established researchers and newer faces in the field. What stood out most was hearing how people are using the Crazyflie in their work today. It’s very rewarding to see how what we do at the office connects with and supports real research.

Catching Up and Looking Around

One of the most rewarding parts of the conference was the chance to connect directly with people using the platform. We talked to many users, both current and past, and saw new research based on the platform. It was also great to reconnect with Flapper Drones, who build flapping-wing vehicles powered by the Crazyflie Bolt. And it was nice to see HopTo on the exhibition floor for the first time. The company is a spin-off from the Robotics and Intelligent Systems Lab at CityU Hong Kong, which published a Science Robotics paper on the hopcopter concept that used a Crazyflie. We also had the chance to catch up with a maintainer of CrazySim, an open-source simulator in the Crazyflie ecosystem. It’s always valuable to connect with people building on top of the platform, whether through research, hardware, or open-source tools.

Wrapping Up

ICRA 2025 was packed with activity. From demoing the swarm, to the workshop, to hallway conversations, it gave us a lot of valuable feedback and insight. Thanks to everyone who stopped by, joined a talk, or came to say hello.

As we mentioned in a previous blog post, the last couple of weeks have been full of exciting events in the US. We first began our adventure in Charlotte, North Carolina, where we attended the International Conference on Unmanned Aircraft Systems (ICUAS), as platinum sponsors.

We were especially thrilled to be involved because the final stage of the conference’s competition featured Crazyflies, which played a central role in the challenge.

The ICUAS UAV Competition

This year’s competition simulated a search mission in an urban environment. The goal was for teams to identify ArUco markers placed on multiple obstacles, while maintaining line-of-sight and communication among a swarm of three Crazyflies.

Each team’s UAVs launched from a designated base, navigated a known environment, and attempted to locate several targets. The drones relied on an OptiTrack system for positioning and used the AI deck as a camera for image recognition. Constant communication between the base and all UAVs was required throughout the mission.

The event, organized by the LARICS team, combined both simulation and real-world implementation. Their hard work ensured that competitors could smoothly transition their systems from digital models to actual flying drones. What followed was an intense and fun two-day hackathon.

Although the ICUAS UAV Competition drew interest from 26 teams globally, only five finalist teams made it to Charlotte to run their scenarios with real drones. In the end, it was Team Aerial Robotics from the Indian Institute of Technology Kanpur (IITK) who took home first place—congratulations to them!

While the event went smoothly overall, some communication challenges cropped up—solved creatively by placing a radio in the center of the arena. Battery management was also key, with fully charged packs being a hot commodity to maximize flight time.

Research and Presentations

Alongside the competition, the conference featured a wide range of research presentations. We were proud to see Rik present on the AI deck during a workshop focused on embodied AI.

One of the highlights was the Best Paper Award which, although we missed the talk, was awarded to a team from Queen’s University using the Crazyflie to simulate drone landings on ocean waves. You can read their fascinating paper here:
https://arxiv.org/abs/2410.21674

Final Thoughts

Overall, ICUAS 2025 was a great experience—full of innovation, collaboration, and of course, plenty of flight time. We’re grateful to the organizers, competitors, and everyone who stopped by our booth. Until next time!

We’re happy to announce that release 2025.02 is now available. This update includes fixes and improvements for the Crazyflie 2.1 Brushless, along with stability enhancements for the AI-deck.

Release overview

  • crazyflie-firmware release 2025.02 GitHub
  • crazyflie2-nrf-firmware release 2025.02 GitHub
  • aideck-esp-firmware release 2025.02 GitHub
  • aideck-gap8-examples release 2025.02 GitHub
  • cfclient (crazyflie-clients-python) release 2025.2 GitHub, PyPi
  • cflib (crazyflie-lib-python) release 0.1.28 GitHub, PyPi

Major changes

  • Automatically disarm Crazyflie 30 seconds after landing if manual arming is required
  • Latency, uplink RSSI, bandwidth congestion, and packet rate reporting in the Python library
  • Improved AI-deck GAP8 initialization stability by delaying boot by 5 seconds if the deck is attached
  • Improved AI-deck WiFi streamer stability
  • Improved Crazyflie 2.1 Brushless platform default settings and parameters
  • Crazyflie 2.1 Brushless ESC flashing fixes
  • Modernize Python packaging

It’s hard to believe it’s already been almost a month since the Crazyflie 2.1 Brushless was released. We know some of you have already had the chance to take it for a spin, and we’re really excited to hear what you think.

Here at the office, we have started using them a lot – to discover gaps in the documentation, to test our new features, or simply to make nice trajectories during a Fun Friday as shown here:

We’re constantly amazed by it and the new capabilities it brings… But, interestingly, we haven’t received many support questions so far… which has us wondering—did we accidentally make it too good? Jokes aside, we’d love to get your thoughts! Whether you have feedback, questions, or just want to share your experience, we’re all ears.

We have a quick form for you here to fill out – it takes a couple of minutes and would help us a lot:

https://docs.google.com/forms/d/e/1FAIpQLSfjnTdUQtCEv9isk4UbASLbe1AxESGtT8Z2q9OfSba-fxYg7g/viewform?usp=sharing

Of course, we’re always available by email if there is more you wish to say: contact@bitcraze.io.

Drones can perform a wide range of interesting tasks, from crop inspection to search-and-rescue. However, to make drones practically attractive they should be safe and cheap. Drones can be made safer by reducing their size and weight. This causes less damage in a collision with people or the environment. Additionally, being cheap means that the drones can take more risk – as it is less expensive to lose one – or that they can be deployed in larger numbers.

To function autonomously, such a drone should at least have some basic navigation capabilities. External position references such as GPS or UWB beacons can provide these, but such a reference is not always available. GPS is not accurate enough in indoor settings, and beacons require prior access to the area of operation and also add an additional cost.

Without these references, navigation becomes tricky. The typical solution is to have the drone construct a map of its local environment, which it can then use to determine its position and trajectories towards important places. But on tiny drones, the on-board computational resources are often too limited to construct such a map. How, then, can these tiny drones navigate? A subquestion of this – how to follow previously traversed routes – was the topic of my MSc thesis under the supervision of Kimberly McGuire and Guido de Croon at TU Delft, and of my PhD studies. The solution has recently been published in Science Robotics – “Visual route following for tiny autonomous robots” (TU Delft mirror here).

Route following

In an ideal world, route following can be performed entirely by odometry: the measurement and recording of one’s own movements. If a drone measured the distance and direction it traveled, it could simply perform the same movements in reverse and end up at its starting place. In reality, however, this does not quite work. While current-day movement sensors such as the Flow deck are certainly accurate, they are not perfect. Every measurement includes a small error, and to traverse longer distances, multiple measurements are summed, which causes the error to grow impractically large. It is this integration of errors that stops drones from using odometry over longer distances.
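
As a toy numerical illustration of this error buildup (the step length and noise level are made-up numbers, not Flow deck specifications):

import numpy as np

# Summing many slightly noisy step measurements makes the accumulated
# position error grow without bound.
rng = np.random.default_rng(1)
true_step, noise_std, n_steps = 0.10, 0.005, 10_000   # m, m, steps

measured = true_step + noise_std * rng.standard_normal(n_steps)
error = np.cumsum(measured) - true_step * np.arange(1, n_steps + 1)
print(f"error after 100 steps: {error[99]:+.3f} m")
print(f"error after {n_steps} steps: {error[-1]:+.3f} m")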

The trick to traveling longer distances is to prevent this buildup of errors. To do so, we propose to let the drone perform ‘visual homing’ maneuvers. Visual homing is a control strategy that lets an agent return to a location where it has previously taken a picture, called a ‘snapshot’. In order to find its way back, the agent compares its current view of the environment to the snapshot it took earlier. The trick here is that the difference between these two images grows smoothly with distance. Conversely, if the agent can find the direction in which this difference decreases, it can follow this direction to converge back to the snapshot’s original location.

The difference between images smoothly increases with their distance.
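
A toy Python sketch of that idea (an illustration of the principle, not the article’s actual algorithm): compare candidate views against the snapshot and step in the direction that reduces the difference.

import numpy as np

def image_difference(a, b):
    # Pixel-wise sum of squared differences between two grayscale views.
    return float(np.sum((a.astype(float) - b.astype(float)) ** 2))

def homing_step(snapshot, views_after_test_steps):
    # views_after_test_steps: {(dx, dy): view seen after a small step that way}.
    # Move in the direction whose view best matches the snapshot.
    return min(views_after_test_steps,
               key=lambda d: image_difference(views_after_test_steps[d], snapshot))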

So, to perform long-distance route following, we now command the drone to take snapshots along the way, in addition to odometry measurements. Then, when retracing the route, the drone will routinely perform visual homing maneuvers to align itself with these snapshots. Because the error after a homing maneuver is bounded, there is now no longer a growing deviation from the intended path! This means that long-range route following is now possible without excessive drift.

Implementation

The article mentioned above describes the strategy in more detail. Rather than repeat what is already written, I would like to give a bit more detail on how the strategy was implemented, as this is probably more relevant for other Crazyflie users.

The main difference between our drone and an out-of-the-box one, is that our drone needs to carry a camera for navigation. Not just any camera, but the method under investigation requires a panoramic camera so that the drone can see in all directions. For this, we bought a Kogeto Dot 360. This is a cheap aftermarket lens for an older iPhone that provides exactly the field-of-view that we need. After a bit of dremeling and taping, it is also suitable for drones.

ARDrone 2 with panoramic camera lens.

The very first visual homing experiments were performed on an ARDrone 2. The drone already had a bottom camera, to which we fitted the lens. Using this setup, the drone could successfully navigate back to the snapshot’s location. However, the ARDrone 2 hardly qualifies as small: it is approximately 50 cm wide, weighs 400 grams, and carries a Linux computer.

To prove that the navigation method would indeed work on tiny drones, the setup was downsized to a Crazyflie 2.0. While this drone could take off with the camera assembly, it would become unstable very quickly as the battery level decreased; the camera was just a bit too heavy. Another attempt was made on an Eachine Trashcan, heavily modified to support the camera, a Flow deck, and custom autopilot firmware. While this drone had more than enough lift, the overall reliability of the platform never became good enough to perform full flight experiments.

After discussing the above issues, I was very kindly offered a prototype of the Crazyflie Brushless to see if it would help with my experiments. And it did! The Crazyflie Brushless has more lift than the regular platform and could maintain a stable attitude and height while carrying the camera assembly, all with a reasonable flight time. Software-wise it works pretty much the same as the regular Crazyflie, so it was a pleasure to work with. This drone became the one we used for our final experiments, and it was even featured on the cover of the Science Robotics issue.

With the hardware finished, the next step was to implement the software. Unlike the ARDrone 2, which had a full Linux system with reasonable memory and computing power, the Crazyflie only has an STM32 microcontroller that is also tasked with flying the drone (plus an nRF SoC, but that is out of scope here). The camera board developed for this drone features an additional STM32, which performed most of the image processing and visual homing tasks at a framerate of a few hertz. However, the resulting guidance also has to be followed, and this part is more relevant for other Crazyflie users.

To provide custom behavior on the Crazyflie, I used the app layer of the autopilot. The app layer allows users to create custom code for the autopilot, while keeping it mostly decoupled from the underlying firmware. The out-of-tree setup makes it easier to use a version control system for only the custom code, and also means that it is not as tied to a specific firmware version as an in-tree process.

The custom app performs a small number of crucial tasks. Firstly, it is responsible for communication with the camera. This communication runs over UART, as this was already implemented in the camera software and the bus was not used for other purposes on the Crazyflie. Over this bus, the autopilot receives visual guidance from the camera and sends basic commands, such as starting and stopping image captures. Pprzlink was used as the UART protocol, a leftover from the earlier ARDrone 2 and Trashcan prototypes.

The second major task of the app is to make the drone follow the visual guidance. This consisted of two parts. Firstly, the drone should be able to follow visual homing vectors. This was achieved using the Commander Framework, part of the Stabilizer Module. Once the custom app was started, it would enter an infinite loop which ran at a rate of 10 Hertz. After takeoff, the app repeatedly calls commanderSetSetpoint to set absolute position targets, which are found by adding the latest homing vector to the current position estimate. The regular autopilot then takes care of the low-level control that steers the drone to these coordinates.
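
The app itself is C firmware, but roughly the same setpoint stream can be reproduced off-board with cflib. This sketch uses a placeholder URI and a fixed placeholder target, and is not the experiment code:

import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

cflib.crtp.init_drivers()
with SyncCrazyflie('radio://0/80/2M/E7E7E7E7E7', cf=Crazyflie()) as scf:
    cf = scf.cf
    for _ in range(100):               # 10 s at the app's 10 Hz loop rate
        # In the real app the target is the current position estimate
        # plus the latest homing vector; here it is a fixed placeholder.
        cf.commander.send_position_setpoint(0.0, 0.0, 0.5, 0.0)
        time.sleep(0.1)
    cf.commander.send_stop_setpoint()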

The core idea of our navigation strategy is that the drone can correct its position estimate after arriving at a snapshot. So secondly, the drone should be able to overwrite its position estimate with the one provided by the route-following algorithm. To simplify the integration with the existing state estimator, this update was implemented as an additional position sensor – similar to an external positioning system. Once the drone had converged to a snapshot, it would enqueue the snapshot’s remembered coordinates as a position measurement with a very small standard deviation, thereby essentially overwriting the position estimate but without needing to modify the estimator. The same trick was also used to correct heading drift.
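
cflib exposes the same external-position mechanism from the client side. Continuing the sketch above (the onboard app enqueues the measurement directly in firmware instead):

# After converging to a snapshot, feed its remembered coordinates to the
# estimator as an external position measurement; the placeholder values
# stand in for the snapshot's stored coordinates.
cf.extpos.send_extpos(0.0, 0.0, 0.5)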

The final task of the app was to make the drone controllable from a ground station. After some initial experiments, it was determined that fully autonomous flight during the experiments would be the easiest to implement and use. To this end, the drone needed to be able to follow more complex procedures and to communicate with a ground station.

Because the cfclient provides most of the necessary functions, it was used as the basis for the ground station. However, the experiments required extra controls that were of course not part of a generic client. While it was possible to modify the cfclient, an easier solution was offered by the integrated ZMQ server. This server allows external programs to communicate with the stock cfclient over a TCP connection and, among other possibilities, to send control values and parameters to the drone. Since the drone would be flying autonomously, low update rates would suffice, so we chose to let the ground station set parameters provided by the custom app. To simplify usability, a simple GUI was made in Python using the CFZmq library and Tkinter. The GUI requests foreground priority so that it is shown on top of the regular client, making it easy to use both at the same time.
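
For reference, talking to the cfclient ZMQ parameter service from an external Python program looks roughly like this. The port and message layout follow the cfclient ZMQ documentation as I recall it (verify against your client version), and the parameter name is hypothetical:

import zmq

ctx = zmq.Context()
sock = ctx.socket(zmq.REQ)
sock.connect("tcp://127.0.0.1:1213")   # cfclient ZMQ parameter port

# Hypothetical app-layer parameter that starts an experiment.
sock.send_json({"version": 1, "cmd": "set",
                "name": "app.startExperiment", "value": 1})
print(sock.recv_json())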

Cfclient with experimental overlay (bottom right).

To perform more complex experiments, each experiment was implemented as a state machine in the custom app. Using the (High-level) Commander Framework and the navigation routines described above, the drone was able to perform entire experiments from take-off to landing.
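
A minimal sketch of such a state machine (the phases and transitions are illustrative, not the real app’s):

from enum import Enum, auto

class Phase(Enum):
    TAKEOFF = auto()
    RECORD = auto()
    RETRACE = auto()
    LAND = auto()

def next_phase(phase, route_recorded, route_retraced):
    # Advance through the experiment: take off, record the route with
    # odometry and snapshots, retrace it with visual homing, then land.
    if phase is Phase.TAKEOFF:
        return Phase.RECORD
    if phase is Phase.RECORD and route_recorded:
        return Phase.RETRACE
    if phase is Phase.RETRACE and route_retraced:
        return Phase.LAND
    return phase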

While the code is very far from production quality, it is open source and can be viewed here to see how everything was implemented: https://github.com/tomvand/2020-visualhoming-crazyflie. The PCB used to fit Crazyflie decks to the Eachine Trashcan can be found here: https://github.com/tomvand/cf-deck-carrier.

Outcome

Using the hardware and software described above, we were able to perform the route-following experiments. The drone was commanded to fly a preprogrammed trajectory using the Flow deck, while recording odometry and snapshot images. Then, the drone was commanded to follow the same route in reverse, by traveling short sections using dead reckoning and then using visual homing to correct the incurred drift.

As shown in the article, the error with respect to the recorded route remained bounded. Therefore, we can now travel long routes without having to worry about drift, even under strict hardware limitations. This is a great improvement to the autonomy of tiny robots.

I hope that this post has given a bit more insight into the implementation behind this study, a part that is not often highlighted but very interesting for everyone working in this field.

As 2024 comes to an end, it’s the perfect time to reflect on what we’ve accomplished over the past year. A major highlight has been our work on the Crazyflie 2.1 Brushless. We’re thrilled that it will be available early in the new year! While much of our efforts focused on refining and preparing the platform as a whole, we also introduced some standout features like support for contact charging on a charging pad, perfecting the specially optimized motors, and propeller guards to enhance safety for both users and the drone.

Finalizing the integration of the Crazyflie 2.1 Brushless into our software ecosystem and expanding its documentation were key steps in preparing for its launch. These efforts ensure compatibility, improve the user experience, and make the platform more accessible to the community. We’re looking forward to a smooth launch and to seeing how the community will utilize the new platform!

This year, we introduced updates to the Crazyflie 2.1 kit, making the 47-17 propellers the new default and including an improved battery. These upgrades enhance flight performance and endurance, culminating in the release of the Crazyflie 2.1+—an optimized iteration of our established platform.

The Crazyflie 2.1 Brushless featured on the cover of Science Robotics vol. 9, no. 92

Community

In 2024, Bitcraze had an action-packed year, engaging with the robotics community through numerous conferences, workshops, and live events.

In May, we attended ICRA 2024 in Yokohama. We collected several research posters that now hang proudly at the office. Kimberly presented at the Robotics Developer Day, where she won the Best Speaker Award for her impressive live hardware demos with ROS 2. We co-organized the ‘Aerial Swarm Tools and Applications’ workshop at RSS 2024 in Delft. Arnaud and Kimberly shared insights on demo-driven development in an episode of OpenCV Live!. Additionally, we had a booth at ROSCon ’24 in Odense, connecting with the vibrant ROS community and showcasing our latest developments.

And don’t forget the developer meetings, where we shared some more behind the scenes information and collected invaluable feedback from the community.

We also released a new edition of our research compilation video, showcasing some of the coolest projects from 2023 and 2024 that highlight the versatility and impact of the Crazyflie platform in research.

Team

In the past year, Bitcraze saw significant changes within the team. In February, Rik rejoined the team. Tove started at Bitcraze in April. Mandy, with whom we’ve already worked extensively over the years, joined as our production representative in Shenzhen. At the end of the year, we said goodbye to Kimberly, whose contributions will be deeply missed. Additionally, we had Björn with us for a few months, working on his master’s thesis on fault detection, and Joe continued his industrial postdoc at Bitcraze that began in December 2023. Looking ahead, Bitcraze is hiring for two new roles: a Technical Sales Lead and a Technical Success Engineer, to support our ongoing projects and customer collaborations.


As we close the chapter on 2024, we’re proud of the progress we’ve made, the connections we’ve strengthened, and the milestones we’ve reached. With exciting launches, new faces on the team, and continued collaboration with our community, we’re ready to soar to even greater heights in 2025. Thank you for being part of our journey!

Hi everyone! I have a bit of news to share… I’ve decided to leave Bitcraze at the end of 2024. But not before I share with you my latest Fun Friday project, which I’ve tried my best to finish up before leaving for my Christmas holiday in December.

Frankensteining the Pololu Robot with the Crazyflie Bolt

During the ROSCon talk about the lighthouse system (see the recording here), I already showed a small example of how the lighthouse system could be used on other robots as well. Here you see a Pololu 3pi+ 2040 (the Hyper edition, of course) with a slimmed-down Crazyflie Bolt and a Lighthouse deck. The UART2 port on the Bolt (the pinout is the same as the Crazyflie’s) interfaces with the UART0 connection on the Pololu (pinout). The Pololu’s 3v3 pin is connected to the Bolt’s vUSB and GND to GND (obviously), so 4 wires in total. On paper, the 3v3 port does not supply enough power for the Crazyflie, but it seemed to be enough in practice: as long as the Crazyflie Bolt doesn’t have motors connected, it should be fine. But if anyone would like to build a driving-flying hybrid with this combo, you might need to check the specifications a bit more closely. For now, just ignore the red low-battery LED on the Bolt, but if you see it restarting, then perhaps give the Pololu a fresh set of batteries.

Since the Pololu 3pi+ 2040 doesn’t have any wireless communication, this can be done through the Crazyflie Bolt and the Crazyradio. I’ve made an app-layer variant for the Bolt to forward state estimates and velocity commands; however, it did require an extra logging variable in the firmware itself. This allows me to control the Pololu through the cfclient! Since it uses velocity commands, the mobile app is out, though, but perhaps if anyone is interested in getting this rolling, let me know. Also, the screen shows the current X, Y, Z, and yaw estimate of the Bolt transferred to the Pololu, along with the commands I’ve given it.
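
For anyone curious, sending such velocity commands from the cflib side looks roughly like this (a sketch assuming an already connected Crazyflie instance cf; the forwarding to the Pololu happens in the Bolt app):

# World-frame velocity setpoint (vx, vy, vz in m/s, yaw rate in deg/s);
# the Bolt app forwards these over UART to the Pololu in this setup.
cf.commander.send_velocity_world_setpoint(0.2, 0.0, 0.0, 0.0)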

I would have liked to connect this to a differential-drive controller to make use of the position setpoints, but unfortunately the AA batteries ran out at the office and I was unable to complete this by the last day. It would have been great to use the Lighthouse positioning for this. Perhaps in the coming months I can continue with it and have my cats chase an autonomous robot around the house, who knows! If anyone is interested in playing around with this, these are the repositories/branches for both the Bolt and the Pololu:

What is next?

First of all, I’ll take a long holiday in the US, first visiting New York (first time) before I hop over to Tulsa and Santa Barbara to visit family. Early 2025 I’ll be taking a long break, or a mini sabbatical of sorts, where I plan to work on some personal projects but mostly have a breather. I haven’t had a break like this in over 15 years, and given a tough 2023, I can definitely say that I’ve deserved some time off. What will happen after, I will hopefully figure out then, but for sure I will be continuing to co-lead the Aerial Robotics Interest Group at ROS and helping out in support of the Crazyswarm2 project.

I’d like to thank my colleagues at Bitcraze for an amazing 5 years here in Malmö, Sweden, and everyone that I was able to meet through them. I’ve learned a lot in terms of joint software development, code maintenance, community interaction, and, most importantly, having fun during work. I also will never forget the support I received while I was going through cancer treatment, and for that I’m very grateful. I wish you all the best and I hope the Crazyflie continues to thrive, saving more PhD projects as it did mine. Thank you.

It’s been a while since I last talked about hiring! We successfully onboarded our most recent recruit, and now it’s time to start planning for the future.

One of our challenges as a team is that we’re very heavy on engineers and developers. While that’s fantastic for building products, it means we lack expertise in other important areas. That’s why we’re now shifting our focus to bringing in talent to help fill those gaps. We’ve partnered with a recruitment agency once again to help us find the right people for the job.
We’re currently hiring for two distinct roles—here’s what we’re looking for!

Technical sales lead

You will be responsible for developing and implementing sales strategies while exploring both new and existing markets. You’ll take the lead in driving sales and acquiring new customers, becoming the company’s go-to expert on marketing and sales tactics. Your day-to-day tasks will include supporting business development, optimizing sales processes, and proposing effective marketing strategies. This role is perfect for someone with a background in technical sales with a strong strategic mindset and a sense of responsibility.

You can read more about it here.

Technical success engineer

We’re looking for a Technical Success Engineer to provide our customers with technical guidance and product expertise. This role involves offering first-line support, creating documentation and tutorials, and assisting with tech-focused sales efforts. The goal is to ensure a smooth and seamless customer experience while building strong client relationships. It’s an ideal position for a “social developer”—someone with a solid technical background who also excels in communication and enjoys engaging with others.

You can read more about it here.

Both positions are full-time and based at our office in Malmö, Sweden. If you’re curious about why you should join our team, I’ve already shared some of the many reasons why I love being part of Bitcraze.

If you’re interested or have any questions, please send an email to fredric.vernqvist@techtalents.se or contact us at contact@bitcraze.se.