Blog

The ability to attach expansion decks to the Crazyflie platforms without modifying their electronics makes it easy to experiment with different hardware components. Most existing decks contain sensors used for positioning and collecting data. In this Fun Friday project, which has been running for the past couple of months, I explored adding mechanical elements to the Crazyflie, with the long-term goal of creating a working claw to grab and transfer objects.

The claw

The claw mechanism is built on a DC motor. The motor shaft is connected to a worm gear, which drives the claw to open or close depending on the direction of rotation. All the parts are designed from scratch and 3D printed.

The deck

Making the DC motor rotate in both directions requires reversing its polarity, which can be done using an H-bridge. The deck controlling the claw is therefore essentially an H-bridge that uses 3V VCC, GND and 4 GPIO pins on the Crazyflie, which keeps it compatible with the Lighthouse positioning deck. The circuit consists of 4 MOSFETs (2 P-type and 2 N-type) and 2 pull-down resistors.
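As a rough illustration of the principle (not the actual deck schematic), the switching logic boils down to a small truth table. The pin names and the open/close mapping below are assumptions, and "True" simply means the transistor conducts:

```python
# Sketch of the H-bridge switching states (pin names and mapping assumed).
# Turning on one diagonal pair of MOSFETs drives the motor one way; the
# opposite diagonal reverses the polarity. The two transistors of the same
# half-bridge must never conduct at the same time, since that would short
# VCC to GND ("shoot-through"). The pull-down resistors keep the gates in
# a safe default state while the GPIOs are not driven.
STATES = {
    'open':  {'P1': True,  'N2': True,  'P2': False, 'N1': False},
    'close': {'P2': True,  'N1': True,  'P1': False, 'N2': False},
    'idle':  {'P1': False, 'P2': False, 'N1': False, 'N2': False},
}
```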

How it works

When designing a custom deck for the Crazyflie, you need to initialize it with its own driver. The driver for the H-bridge deck contains two basic functions: one that opens the claw and one that closes it. They are triggered through two float parameters (clawOpen and clawClose) and remain active for the number of milliseconds specified by the parameter’s value.
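For reference, this is roughly how such parameters could be triggered from a cflib script. It is a minimal sketch: the URI is an example and the parameter group name (claw) is an assumption.

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # example URI

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Run the close function for 800 ms (the parameter value is the duration).
    scf.cf.param.set_value('claw.clawClose', '800')
    time.sleep(1.0)
    # Then open the claw again for the same duration.
    scf.cf.param.set_value('claw.clawOpen', '800')
```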

Experiments

Since the entire claw setup weighs 29 g, I used two Crazyflie 2.1 Brushless drones to share the weight equally, with one of them controlling the claw. Together, they can lift up to 74 g. A fishing line is attached underneath each drone, and the claw can slide along it, so it always stays centered between them. For the load, I used a Crazyflie 2.1+ with a Lighthouse deck attached and its motors removed to reduce weight. When the script starts, the initial positions are collected and a flight sequence for the swarm is created based on them. Then the swarm takes off and approaches, grabs, lifts and transfers the load.
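The script itself is not shown here, but a minimal sketch of its general shape, using cflib’s Swarm API, could look like the following; the URIs, heights and durations are placeholders and the actual approach-grab-lift-transfer sequence is omitted:

```python
import time

import cflib.crtp
from cflib.crazyflie.swarm import CachedCfFactory, Swarm

URIS = {
    'radio://0/80/2M/E7E7E7E7E1',  # this drone also controls the claw
    'radio://0/80/2M/E7E7E7E7E2',
}

def take_off(scf, height, duration):
    # The high-level commander generates smooth setpoints on board.
    hlc = scf.cf.high_level_commander
    hlc.takeoff(height, duration)
    time.sleep(duration)

cflib.crtp.init_drivers()
factory = CachedCfFactory(rw_cache='./cache')
with Swarm(URIS, factory=factory) as swarm:
    swarm.reset_estimators()
    # Collect the initial positions; the flight sequence would be built
    # from these before taking off.
    positions = swarm.get_estimated_positions()
    swarm.parallel_safe(take_off, args_dict={uri: [1.0, 3.0] for uri in URIS})
```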

Next steps

The initial goal of grasping and transferring objects with a flying claw has been achieved. However, in the future I plan to make the system more robust and easier to use. Some points that I might focus on:

  • Make the whole setup lighter – replace the current motor with a lighter one, print with lighter materials.
  • Improve the controller tuning to damp the oscillations and make the flight more stable.
  • Implement a control system to keep track of the claw’s state – add limit switches.

Imagine a drone that can fly indefinitely, autonomously recharging and navigating its environment with minimal human intervention. For corporate innovators designing proof of concept solutions or researchers seeking to push the boundaries of autonomous systems, Bitcraze’s Infinite Flight project represents a novel opportunity.

Since Bitcraze first introduced the “Infinite Flight” concept in 2023, the idea of a Crazyflie drone that can operate for days, autonomously recharging and executing missions, has steadily moved from experiment to practical tool. For those working in robotics, automation, or research, this is a quick update on what’s changed and why it matters.

What’s Changed Since the Last Infinite Flight?

Hardware and Firmware Improvements

  • Crazyflie 2.1 Brushless: improved power efficiency and longer flight times, which is essential for multi-day operation.
  • Charging Dock Upgrades: The move from Qi wireless to contact-based charging has made energy transfer more reliable and reduced cycle downtime.
  • Firmware Stability: The latest firmware (2025.02) brings fixes for brushless ESC flashing, improved default parameters, and more robust long-duration performance.
  • Host Software: The cfclient now uses PyQt6 for better graphical performance, and cflib’s new full-state setpoints offer more precise control.

Navigation and Autonomy

  • Recent work on visual route following enables the Crazyflie to retrace long paths using snapshot-based visual homing, reducing drift even on resource-constrained hardware.
  • The firmware’s app layer now makes it easier to implement custom, persistent behaviors without deep firmware changes.

Real-World Applications of Infinite Flight

Research and Industry Applications

  • Environmental Monitoring: Continuous data collection for air quality or wildlife studies, where drones need to operate for days at a time.
  • Industrial Inspections: Persistent monitoring of infrastructure like wind farms or power grids, reducing the need for human intervention.
  • Swarm and Formation Flight Research: Some labs are using Crazyflie to simulate spacecraft formation flying or to test swarm coordination algorithms over long durations.
  • Route Following: The new visual homing approach allows for reliable, repeatable long-range missions, which is especially valuable for mapping or inspection tasks.

Why Infinite Flight Matters

Long-duration, autonomous operation is a key enabler for real-world robotics. The recent hardware and software updates make the Crazyflie a more practical platform for those kinds of experiments, whether you’re working on persistent autonomy, adaptive navigation, or multi-agent systems.

If you’re experimenting with similar ideas or have a use case that could benefit from multi-day drone operation, it might be worth taking a look at the latest Infinite Flight developments. As always, feedback and collaboration from the community are welcome.

Start Your Infinite Flight Now

Ready to experience the power of uninterrupted autonomous flight? The Infinite Flight Bundle equips you with all the essential tools to keep your Crazyflie 2.1 Brushless airborne around the clock.

The package leverages the Lighthouse positioning system, providing precise on-board tracking across a 5x5x2 meter area. With accuracy reaching below 10 cm and minimal jitter, your drone can safely navigate its flight path while autonomously docking on a charging pad. Once recharged, it’s ready to lift off again—enabling continuous flight operations without manual intervention.

Fredrik Ehrenstråle joins Bitcraze as Strategic Growth Director

Have you ever wondered what could happen if open robotics were truly accessible to everyone — researchers, educators, and innovators alike? That’s the vision that drew me to Bitcraze, and why I’m thrilled to share that I’ve joined the team as Strategic Growth Director.

Bitcraze isn’t just a technology company — it’s a community of curious, collaborative people who believe in making robotics both powerful and playful. From my very first conversation with the team, I felt the energy and integrity that set this place apart.

Over the past decade, I’ve had the privilege of working with organizations big and small, translating complex tech into real-world impact. What excites me most is helping people like you turn bold ideas into reality — whether you’re pushing the boundaries of research, inspiring students in the classroom, or building new industrial solutions.

At Bitcraze, I’ll focus on finding new ways for us to grow, building partnerships that matter, and making sure our story resonates with the people who can benefit most. But more than anything, I want to listen and learn from this amazing community.

If you’re curious about what we’re building, have thoughts on the future of robotics, or just want to swap ideas, I’d love to connect. Let’s shape what’s next together!

In just two weeks, we’re packing our Crazyflies and heading off for a busy and exciting couple of events: ICUAS (International Conference on Unmanned Aircraft Systems) in Charlotte and ICRA (IEEE International Conference on Robotics and Automation) in Atlanta.
This year is a bit different for us: not only will we be showing some new prototypes, but we’re also giving three presentations across the two conferences.
In this post, we’ll share a quick overview of what we’ll be presenting and a first glimpse at the prototypes we’re bringing along.

ICUAS 2025

Charlotte, N.C.
May 14-17

First stop: ICUAS, where we’re proud to sponsor the competition. Teams will be using Crazyflies, both in simulation and in real life, to deploy UAV teams in an urban environment and identify threats – and we’ll be there to support them!

But that’s not all:
As part of the workshop “Embodied-AI for Aerial Robots: What do we need for full autonomy?”, Rik will present a keynote titled “Crazyflie and the Realities of Edge AI.” The talk shares practical lessons from bringing AI onto the Crazyflie, covering challenges with complex toolchains, fragmented ecosystems, and the gap between expectations and real-world constraints, along with a look at how new hardware developments could reshape what is possible for small aerial robots.

It’s our first time attending ICUAS, and we’re thrilled to not only be part of the competition but to actively contribute to the technical discussions.

ICRA 2025

Atlanta, GA
May 19-23

Just two days after ICUAS ends, we’re heading straight to ICRA in Atlanta.
You’ll find us at booth 131, right in front of the Tech Talk stage – come by and say hello!

At ICRA, Rik will speak in the workshop “25 Years of Research in Aerial Robotics”, giving a talk titled “Crazyflie and the Art of Getting Where You Meant To Go”. He’ll reflect on Bitcraze’s journey through the classic aerial robotics challenge of positioning, from the early days of building a tiny, modular flying PCB to supporting researchers around the world.

We’ll also take part in the Undergraduate Robotics Education Forum on May 22nd, where Barbara will be presenting a poster about Crazyflie as an educational platform.

New Prototype Sneak Peek

As always, we’re bringing prototypes for upcoming products:
We’ll be showing a prototype of a straightforward camera deck with WiFi streaming, aimed at adding basic visual capabilities to the Crazyflie.


We’ll also bring an updated demo setup with even more Brushless Crazyflies and charging docks — combining the best parts of last year’s ICRA presentation and our current “fish tank” office demo.
The demo will also feature prototype High Power LED decks, a new product we’re currently working on.

If you’re curious about what’s next for Crazyflie, this is a perfect chance to get an early look and chat with us about it!

Bring Your Posters!

Last year, we decorated the Bitcraze office with posters from researchers working with Crazyflies, and it turned out amazing. We’d love to do it again!
If you have a poster you’re proud of featuring Crazyflie, bring it by the booth – we’ll swap it for a little Bitcraze surprise.


All in all, it’s shaping up to be two incredible weeks.
Whether you’re attending ICUAS, ICRA, or both, stop by to see the Crazyflies in action, hear what we’re working on, and chat with us.
We can’t wait to reconnect with old friends and meet new ones — see you soon!

For quite some time now we have had mobile apps that can be used to control the Crazyflie 2.x quadcopter: one for iOS and one for Android. There used to be a prototype of a Windows Phone app as well, but it did not survive the demise of Windows on phones (fun fact: the Windows Phone app can be compiled to run on Xbox, however there is no USB access there so it is quite useless). In this blog post I want to talk about the state of the apps and a possible future for them. As usual with me, the future should include a bit of Rust :-).

Android app

The Android app is the oldest of the mobile apps; it was originally created to be used with a Crazyradio connected to an Android phone over USB. Then, when we released the Crazyflie 2.0 with Bluetooth Low Energy, BLE support was added to the app to make it possible to connect to a Crazyflie without a radio attached.

Over the years, the Android app has mainly been maintained by FredG, one of the very first Crazyflie contributors. The app supports controlling the Crazyflie using touch control as well as an Android-supported gamepad. It also has support for showing the Crazyflie console, controlling some decks, and assisted flight using the Flow deck.

It also supports updating the Crazyflie firmware using a Crazyradio connected over USB. This functionality has unfortunately been broken since we altered the update process when changing the Crazyflie Bluetooth stack last year.

The Android app also works on Chromebooks. This means that it can be used to fly the Crazyflie from a Chromebook using either a Crazyradio or BLE, which makes it one of the only ways to control the Crazyflie over Bluetooth from a laptop.

iOS app

The iOS app is newer and much simpler. It has had a couple of really good contributions over the years, but overall it has seen much less development than the Android app. I have tried to keep it up and working, but nothing more so far.

The iOS app was released when we made the Crazyflie 2.0. Since iOS does not let us communicate with USB devices, it can only work over Bluetooth Low Energy. It can control the Crazyflie using touch control as well as motion control based on the iPhone’s gyroscope.

The iOS app also had support for updating the Crazyflie over Bluetooth; however, like for the Android app, this is now broken and has been removed in a recent release. I hope to be able to add it back soon.

With the advent of Apple Silicon Macs, the iOS app is now also a Mac app. Like the Android app on Chromebook, this gives the unique ability to communicate with the Crazyflie over Bluetooth from a computer. However, the app still has no USB support for the Crazyradio, and until we implement gamepad support there is no way to control the Crazyflie from a Mac using it.

The future

Some of the biggest issues for the development of the mobile apps so far have been the lack of a specification and the difficulty of re-implementing the Crazyflie protocol for each app.

For the former: the apps were created at a time when flying the Crazyflie manually was one of the major use-cases. Nowadays, it is much more common to fly autonomously, which means the apps should be able to do more to be really useful. Manual flight might still be needed to test the Crazyflie or just to play around, but the apps could also be of much greater use for things like assisting in setting up positioning systems or swarms. We are still not sure what would be needed or useful yet, so if you have any ideas please tell us here as a comment or on GitHub Discussions.

For the latter, the difficulty of re-implementing the Crazyflie lib: this is something we have had problems with on multiple fronts. For example, it is also a problem for ROS support and for Crazyswarm. The main problem is that the official Crazyflie lib is implemented in Python, and Python happens not to be a good choice in many cases due to limited portability and performance. One solution we have been imagining and working towards is to implement the Crazyflie lib in Rust and then implement bindings for Python, C/C++, Java and Swift. This would cover our current Python client, ROS and Crazyswarm, as well as all the mobile apps. It should allow us to get much more done much more easily on mobile, since we would not have to start by reinventing the wheel each time and could focus on actual functionality.

One idea would be to start now by implementing the Crazyflie update algorithm in Rust and using it from Python and the mobile apps. This is a good first target, since it is a non-trivial, really annoying piece of code in all languages, and it is also one that must be as bug-free as possible. Having a single implementation that is well tested and can be used everywhere would be very beneficial to the Crazyflie ecosystem.
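To make the idea a bit more concrete, here is a purely hypothetical sketch of what calling such a shared Rust core from Python might look like. Nothing in it exists today: the module name, class and methods are all made up for illustration.

```python
# Hypothetical sketch: a Python binding generated from a shared Rust core.
# None of these names exist yet; they are placeholders for illustration.
from crazyflie_rs import Crazyflie  # imagined PyO3-style binding

cf = Crazyflie.connect('radio://0/80/2M/E7E7E7E7E7')
# The same Rust update code would also be called from the mobile apps.
cf.update_firmware('cf2-firmware.zip')
cf.disconnect()
```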

I hope I managed to convey where we are and where we want to go with the mobile app. If you have any feedback please tell us about it.

As some of you may have noticed, the current LED-ring deck doesn’t play nice with the Crazyflie 2.1 Brushless. The culprit? A resource clash between the DSHOT motor signals and the WS2812 LED driver used for the LED-ring.

But good news! We’re prototyping a new LED deck that solves the conflict by switching to I2C communication. Not only does this fix the compatibility issue, it also gives us a chance to improve the deck’s features. Here’s what we’ve changed so far:

  • A highly efficient, high-powered LED
  • DC/DC driving circuitry to improve LED driving efficiency
  • 1 W on each channel (red, green, blue, white)
  • LEDs on both sides, so it can be mounted on either the top or the bottom of the Crazyflie
LED-deck mounted underneath a Crazyflie 2.1 Brushless
LED-deck with a 3D-printed diffuser mounted underneath the Crazyflie 2.1 Brushless

The LED we’re using is very powerful and the light is emitted from a small area, so a light diffuser is needed to get a more pleasant light. Designing something that can be manufactured is the next step of the project. Make sure to follow our blog to get more updates on this project.

We’ve got an exciting month ahead – in just a few weeks, we’re heading off to not one, but two amazing conferences! It’s going to be a whirlwind, but we couldn’t be more thrilled to be part of these events, meet fellow robotics enthusiasts, and show off some cool demos. Here’s where you’ll find us:

First stop: ICUAS

We’re kicking things off with ICUAS (International Conference on Unmanned Aircraft Systems), where we’re proud to be official sponsors of the competition. We’ll be there to help and support the contestants, who are going to use the Crazyflies both in simulation and in real life. The teams will need to deploy a team of UAVs in an urban environment to locate and identify threats.

It’s our first time attending ICUAS, so this is a brand new adventure for us – and we can’t wait to dive in and see what it’s all about!

Next up: ICRA

Just two days after ICUAS wraps up, we’re heading straight to ICRA – this year taking place in Atlanta. You’ll be able to find us at booth 131, right in front of the Tech Talk stage. If you’re attending, definitely come say hi!

We had the honour of being invited to be part of the workshop “25 years of aerial robotics: challenges and opportunities”. Rik will talk there on the 23rd of May at 16:10, covering Bitcraze’s history and the challenges we’ve faced in positioning a nanocopter – all in just 10 minutes. We’ll also take part in the forum on Undergraduate Robotics Education Programs on the 22nd of May, where we’ll have a poster presenting the Crazyflie as an educational platform.

These are all fantastic opportunities to highlight what makes our platform special and to exchange ideas with you! If you’ve got a paper or publication featured at ICRA, we’d love to hear about it – email us at contact@bitcraze.io, leave a comment below this post, or drop by our booth.

Demo

We’re bringing back our trusted demo setup – but this time, with more Brushless units and charging docks! It will be somewhere between what we presented at the last ICRA and what we call “the fish tank demo” that we have at the office now.

We’ll also be bringing along some prototypes and new decks we’re currently working on – so if you’re curious about what’s coming next for Crazyflie, this is your chance to get a sneak peek and chat with us about it!

Give us your posters!

Last year, we collected posters from proud participants to decorate the office, and it turned out amazing – so we’re doing it again! If you’ve got a cool poster featuring our products and aren’t sure what to do with it after your presentation, come by our booth. We’d love to swap it for something a little extra special.

All in all, it’s shaping up to be a busy and exciting couple of weeks. Whether you’re at ICUAS or ICRA, stop by, chat with us, and see the Crazyflies in action. We’re looking forward to reconnecting with old friends and meeting new ones – see you there!

There has been some extended work lately related to the Lighthouse positioning system. The goal of this work is to raise the maximum number of base stations to 16, enabling the system to cover larger areas and support more complex use cases.

First Lighthouse 16 RP2350 prototype mounted on a development board.

Previous work

One previous attempt to enable more base stations using the current Lighthouse deck left us with a highly untested, “hacky” solution. After flashing the Crazyflie with the proper firmware, this solution requires strategically positioning the base stations so that no more than 4 are visible at any given time. The geometry estimation that is normally carried out by the cfclient then has to be done through the multi_bs_geometry_estimation.py script in the cflib.

Last year we developed a prototype deck, used in last year’s holiday video, that had a bigger FPGA to receive the Lighthouse signals and an ESP32 to decode and filter most of the Lighthouse pulses on board the deck. This approach ended up not working for us: it still included the moderately-hard-to-develop FPGA, and the algorithm we implemented on the ESP32 to identify Lighthouse V2 pulses turned out not to be fast enough to handle enough base stations.

Current limitations

A key factor that currently limits the maximum number of usable base stations is the Lighthouse deck, which can’t handle more than 4 visible base stations at a time. Additionally, the Crazyflie’s STM32 does all the filtering, and 16 base stations generate so much data that it would exceed the compute and memory budget we have in the Crazyflie. This was one of the main reasons to add an MCU to the deck in last year’s prototype.

Ongoing progress

Over the last couple of months we have designed a new LH-16 deck containing an RP2350 microcontroller, so that part of the computation and filtering can take place on the deck rather than on the Crazyflie. With a deck like this, it should be possible to receive large amounts of data from the base stations, filter some of it out on the deck, and finally estimate the Crazyflie’s position on the Crazyflie’s STM32.

This deck has been designed to run a firmware developed by Said Alvarado-Marin from the AIO team at Inria in Paris. The firmware is able to acquire, decode and identify the FM1-encoded LFSR data stream we get from the base stations without the help of an FPGA or a big look-up table, which greatly simplifies the hardware and software: a single microcontroller on the deck is enough.
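To give a flavour of what identifying an LFSR stream involves, here is a small, illustrative Python sketch. The tap masks are made-up placeholders, not the real Lighthouse V2 set, and the actual firmware is certainly more sophisticated:

```python
# Illustrative only: identify which LFSR produced a window of bits.
# Each Lighthouse V2 channel uses a known 17-bit LFSR, so recognizing
# the polynomial identifies the channel. These tap masks are placeholders.
POLYS = [0x12345, 0x1A011]
MASK = 0x1FFFF  # 17 bits

def lfsr_next(state, taps):
    # Fibonacci LFSR step: the new bit is the parity of the tapped bits.
    feedback = bin(state & taps).count('1') & 1
    return ((state << 1) | feedback) & MASK

def identify(bits):
    """Return the tap mask whose LFSR reproduces the bit window, or None."""
    # After 17 steps the register holds exactly the last 17 output bits,
    # so the first 17 observed bits give the register state directly.
    seed = int(''.join(str(b) for b in bits[:17]), 2)
    for taps in POLYS:
        state, match = seed, True
        for b in bits[17:]:
            state = lfsr_next(state, taps)
            if state & 1 != b:
                match = False
                break
        if match:
            return taps
    return None

# Self-check: 40 bits generated with POLYS[0] should be identified as such.
state, stream = 0x1ACE5, []
for _ in range(40):
    state = lfsr_next(state, POLYS[0])
    stream.append(state & 1)
print(identify(stream) == POLYS[0])  # expected: True
```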

We are currently bringing up the prototype and hope to soon be able to fly in our lab with 16 base stations. We will also be looking at making a standalone Lighthouse receiver for other robots and applications. For the curious: the board under the deck in the picture is a debug board that contains everything we might need for making a standalone receiver, plus everything needed to bring up and debug the deck until we have it ready to fly.

This year at the Embedded Linux Conference Europe 2025, held on the 25th-27th of August in Amsterdam, there will be a Robotics and Simulation sub-track!

This is a follow-up to the robotics and simulation devroom that Kimberly and I helped organize at FOSDEM 2025. The Embedded Linux Conference takes place this year as part of the Open Source Summit Europe, and we helped by providing some insights into what a robotics track could look like!

This is a very interesting open-source conference, on the opposite side of the spectrum from FOSDEM. We are very excited that there is interest in organizing robotics talks there as well, since there are a lot of very important and great open-source projects in the robotics space. I will definitely be joining this year!

This track will explore how open-source technologies are shaping the robotics industry, from software frameworks to simulation projects. There will be talks about tools and their impact, offering insights into both their development process and real-world implementation. If you are interested in giving a talk about your project, you can find the Call for Proposals on the OSS website. The deadline for proposals is the 14th of April, and speakers get to attend OSS for free, which is a nice perk :-).

A last technical note: the robotics and simulation sub-track is intended to contain only full 40-minute talks. So to submit your talk, you need to choose the “Embedded Linux Conference” track, the “Robotics” topic and “Session Presentation (40min)”.

Human Robot Interaction (HRI) is a conference that brings together academics and industry partners to explore how humans interact with the latest developments in robotics. The conference is held yearly and brings together the many relevant disciplines concerned with the “H” part (cognitive science, neuroscience), the “R” part (computer science, engineering) and the “I” part (social psychology, education, anthropology and, most recently, design).

This year it was held in Melbourne (my home city), and I was so grateful to be given the chance to demonstrate a system from my PhD studies called “How To Train Your Drone” in what was its final hurrah, a retirement party! Running the demo was a pleasure, especially with the supportive and curious HRI crowd at such a well-organised event.

The take home message from this demonstration was this:

If you let the end user shape, with their hands, the sensory field of the drone, they then end up with an in-depth understanding of it. This allows the user to creatively explore how the drone relates to themselves and their surrounding environment.

What do we mean by sensory field? It’s the area around the drone where it can “feel” the presence of the user’s hand-mounted sensors, represented by the grey and red spheres in the figure below. Initially, the drone has no spheres and therefore cannot respond at all to the user’s movement. But by holding their hands still for a few seconds, the user can create a spherical space, relative to the drone, in which it can sense their hands and follow them.

These spheres are “part of the drone’s body”, and so they move with the drone. So in a way, you are deciding where the drone can “feel” whilst also piloting it. Should it be sensitive in the space immediately in front of it? Or on either side of it?
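As a rough illustration of this idea (not the actual implementation), the spheres can be stored in the drone’s body frame so that they automatically travel and rotate with it; all names and numbers below are made up:

```python
import numpy as np

# Illustrative sketch, not the actual system: sensory spheres live in the
# drone's body frame, so they move and rotate with the drone for free.
# Each sphere is a (center [m], radius [m]) pair, e.g. 30 cm ahead.
spheres_body = [(np.array([0.3, 0.0, 0.0]), 0.15)]

def hand_in_field(hand_world, drone_pos, drone_R):
    """True if the hand lies inside any sphere of the sensory field."""
    # Transform the hand position from the world frame into the body frame.
    hand_body = drone_R.T @ (hand_world - drone_pos)
    return any(np.linalg.norm(hand_body - c) <= r for c, r in spheres_body)
```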

But shouldn’t it just be everywhere?

Good question! We think the answer is no, and for two reasons:

  1. What we can and cannot sense as humans is what makes us human. It also allows us to understand other humans. E.g. We don’t deliver verbal information directly into other people’s ears at max volume because we have ears and we know that sucks. Nor do we demonstrate how to perform a task to someone with their back turned to us. So by the same token, knowing how a machine senses the world also teaches us how to communicate with it. Furthermore, shaping how a machine can sense the world allows us to creatively explore what we can do with it.
  2. To quote science writer Ed Yong, “Nothing can sense everything and nothing needs to”. Meaning we can get the job done without having to ingest insane amounts of data and even more insane amounts of compute. By cumulatively building an agent’s capacity, in context, with end users, we could actually end up with agents that are hyper specialised and resource efficient. A big plus for resource constrained systems like the Crazyflie and our planet at large.

If you are interested in reading more about this research then please check out this paper (if you like to read) or this pictorial (if you like to look at pictures). Or just reach out in the comments below!