Author: Kimberly McGuire

If you have been following the ROS Discourse on a regular basis, you might have seen a bit more activity than usual on the Aerial Vehicles category. We very recently started an Aerial Robotics Working Group in collaboration with the Dronecode Foundation! It will be a community-driven working group initially, but we will hold biweekly meetings on Wednesdays at 2:00 PM UTC, build up the community of members, and gather information on the ROS Aerial community's Github organization. This blogpost aims to explain how this working group came to be, what our current plans are and how you can participate.

How did it all begin?

There are actually quite a few aerial enthusiasts dwelling in the ROS crowd, which became evident when 20-30 people showed up at the impromptu ROScon 2022 aerial roboticists meetup. This was also our first experience with ROScon as Bitcraze, and I (Kimberly) absolutely loved it. It sparked the idea of becoming more active in the amazing ROS community, which we started doing by helping out more with the Crazyswarm2 project (see this blogpost) and giving a presentation about it as well. However, we did notice that there wasn't much online chatter about aerial vehicles on the ROS communication channels. Yes, the Embedded ROS working group led by eProsima (responsible for micro-ROS) has done some really cool demos with Crazyflies! And the same goes for other aerial projects, which have probably contributed to some of the staple projects like NAV2. But there is no working group specific to aerial robotics.

PX4, led by the Dronecode Foundation, had similar ambitions to become more embedded in the ROS family, and since we met in person at that very same ROScon last year, we started talking about possibly starting up a working group. This began with reaching out to the ROS community to gauge interest in this ROS Discourse post, and after more than 25 replies, the obvious next step was to set up a first exploratory meeting. About 30 people showed up to this, so the message was clear: yes, there is a demand for guidance, structure, and information in the ROS community regarding aerial robotics. Thus, the Aerial Robotics Working Group was born!

Current state and plans

One of the issues we have noticed is that there are numerous projects and a lot of information about aerial robotics out there, perhaps even too much. That is because aerial robotics covers a huge variety of robotic systems in different forms, like multicopters or even monocopters (like in the blogpost here), but also hybrid VTOL vehicles, mini blimps (for example this hack we did) and many more. And as you probably know, aerial vehicles come with their own set of challenges that distinguish them from ground robots, like instability, aerodynamics, and limited lift capability. This makes them an interesting platform for control theory, autonomy, and swarming, and as a result several ROS-related projects have emerged, such as Crazyswarm2, Aerostack2, the Kumar Robotics Autonomy Stack and Agilicious. Moreover, even though a standard ROS interface for aerial robotics was created some years ago, it has not been enforced or updated since. And although courses and tutorials for getting started with aerial robotics in ROS can be found scattered across multiple projects and autopilot websites, many have found the learning curve to be quite steep and usually don't know where to start.

Due to the vast amount of systems, software, projects and information out there, we decided to gather all of it in one centralized location, an Aerial Robotics landscape, instead of leaving it scattered across various aerial robotics resources. For this we have created a simple repository with markdown files. The idea is to fill it in little by little with information from the working group discussions, input from other users, and research done by ourselves. To that end, we will facilitate biweekly meetings where users present their project (like our last meeting about Aerostack2) or where we discuss various aerial robotics topics (like aerial autonomy stacks in the start-up meeting).

Future ambitions

Currently, we don't have a specific end goal or main project in mind, as we are right at the start of the first discussions and information gathering. That is also why, after some emails back and forth with the Open Robotics Foundation, it will be considered a 'community-driven' working group until we reach a stage where the landscape is developed enough to establish specific development goals and set up various subprojects for communication, autonomy, platforms and/or education. Additionally, direct communication protocols within swarms could be of interest, as these are a common use case in aerial robotics. Once we have established more specific development goals, we can apply to become an official ROS working group and collaborate with other working groups on overlapping projects. From our perspective, it would be more beneficial for the ROS ecosystem not to create a standalone aerial stack, but to enhance the integration of the existing stacks with aerial vehicles.

Join us!

Currently, I (Kimberly), representing Bitcraze, and Ramon Roche from the Dronecode Foundation are in the 'lead' of the aerial working group, although we prefer to act as facilitators rather than imposing our own direction. We will try our best not to geek out too much on PX4 and/or Crazyflies alone, so anybody's input will be crucial! If you'd like to levitate ROS to new heights, come and join our meetings! Our next meeting is scheduled for Wednesday the 24th of May (2 PM UTC), and you can find the information in this ROS Discourse thread. We hope to see you there!

It is easy to forget that the reason it is so nice to develop for the Crazyflie is that it weighs only about 30 grams. In case something goes wrong with your script or there is a fly-away, you can simply pick it up from the air without worrying about the propellers hitting you. Moreover, when the Crazyflie crashes, it usually only requires a brush-off and perhaps a replacement of a motor mount or propeller. The risk of damage to yourself, other people, indoor furniture, or the vehicle itself is extremely low. However, things become very different if you've built a larger platform with the Bolt or BQ deck and large brushless motors (like in this blogpost), where the risk of injury to people or damage to the vehicle itself increases significantly. That is one of the major reasons why the BQ deck and the Bolt are still in early access and have been for a while. In our efforts to get them out of early access, it's time to start thinking about safety features.

In this blog post, we’ll be discussing how other open-source autopilot programs are implementing safety features, followed by a discussion on current efforts for Crazyflie, along with an announcement of the developer meeting scheduled for May 3rd (see below for more info).

Catching the Crazyflie with a net

Safety in other Autopilots

We are a bit late to the game in terms of safety compared to other autopilot programs such as PX4, ArduPilot, Betaflight and Paparazzi UAV, which have been thinking about safety for quite some time. That makes a lot of sense when you consider the types of platforms that run these autopilots, such as large VTOL or fixed-wing vehicles and 10-kilo quadcopters with cinematic cameras, or the degree of outdoor flight regulation. Flying a UAV autonomously or by yourself has become much more challenging as the US, EU, and many other countries have made their regulations more restrictive. In most cases, you are not even allowed to fly if fail-safes are not implemented, such as defining what to do if your vehicle loses its GPS signal. These types of measures can be separated into pre-flight checks and during-flight checks.

Pre-flight checks

Before a vehicle is allowed to fly, or even before the motors are allowed to spin, which is called 'arming', several conditions must be met. First, all internal sensors, such as the IMU, barometer, and magnetometer, must be calibrated and functional, so that they don't give values outside of their normal operating range. Then, the vehicle must receive a GPS signal, and the internal state estimator (usually an extended Kalman filter) should converge to a position based on that information. It should also be determined whether an external remote control is connected to the vehicle and whether there is a datalink to a ground station for telemetry. Feasibility checks can also be implemented, such as ensuring that the mission loaded to the UAV is not outside its mission parameters or that the start location is not too far away from its take-off position (assuming the EKF is functional). Additionally, the battery should not be low, and the vehicle should not still be in an error state from a previous flight or crash.
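To make this concrete, here is a minimal, generic sketch of such an arming gate in Python. It is not taken from any particular autopilot; the check names and thresholds are purely illustrative.

```python
def preflight_ok(status: dict) -> bool:
    """Return True only if every arming condition is met (illustrative only)."""
    required_flags = [
        "imu_calibrated",    # sensors calibrated and within operating range
        "gps_fix",           # position source available
        "ekf_converged",     # state estimator has settled on a position
        "rc_connected",      # remote control link is up
        "telemetry_ok",      # datalink to the ground station is up
        "mission_feasible",  # mission within geofence, start close to home
    ]
    if not all(status.get(flag, False) for flag in required_flags):
        return False
    # Numeric checks: battery level and no latched error from a previous flight
    return status.get("battery_fraction", 0.0) > 0.3 and not status.get("error_latched", True)


# Example usage
status = {"imu_calibrated": True, "gps_fix": True, "ekf_converged": True,
          "rc_connected": True, "telemetry_ok": True, "mission_feasible": True,
          "battery_fraction": 0.8, "error_latched": False}
print("Arming allowed:", preflight_ok(status))
```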

All of these features have the potential to be turned off or made less restrictive, depending on your situation. However, keep in mind that changing any of these may require recertification of the drone or make it fall outside what is required for outdoor flight regulation. Therefore, these should only be changed if you know what you are doing.

Preflight checks documentation

Fail-safe triggers during flight

Now that the pre-flight checks have passed, the UAV is armed and you have given it the takeoff command. However, there is so much more that can go wrong during a UAV flight, and takeoff is one of the most dangerous moments, when everything could go wrong. Therefore, there are many more safety features, aka fail-safes, for the flight itself than for the pre-flight phase. These can be separated into 'triggers' and 'behaviors', so that the developer can choose what the UAV should do in case of a failure, for example mapping 'GPS loss' to 'land safely' and so on.

Thus, there are triggers that can activate the autopilot's failsafe mechanisms:

  • No connection with the remote control
  • No connection with the Ground station or Datalink
  • Low Battery
  • Position estimate diverges or full GPS loss
  • Waypoint going beyond geofence or Mission is not feasible
  • Other vehicles are nearby.

Also, sometimes the support of an external Automatic Trigger system is required: a box that monitors the conditions under which the UAV should take action, for instance when there is no GPS, other aerial vehicles are nearby, or the UAV is crossing a geofence determined by outdoor flight restrictions. Note that all of these triggers usually have a couple of conditions attached, such as the level considered 'low battery' or the number of seconds of GPS loss deemed acceptable.

Fail-safe behavior

If any of the conditions mentioned above are triggered, most autopilot suites have some failsafe behaviors linked to those set by default. These behaviors can include the following:

  • No action at all
  • Warning on the console or remote control display
  • Continue the mission autonomously
  • Stay still at the same position or go to a home position
  • Fly to a lower altitude
  • Land based on position or safely land by reducing thrust
  • No input to motors or completely disarming the motors

Usually, these default actions are set according to regulation, but per trigger it is possible to configure a different behavior than the default. One could decide to completely disarm the vehicle, but then the chances of the UAV crashing are pretty high, which can result in damage to the vehicle or harm to people or objects. By the way: disarming is the opposite of arming, meaning the motors are not allowed to spin, no matter what input they receive. If you instead decide to never take any action and force the drone to finish its mission autonomously, then in case of GPS or position loss you risk losing your vehicle, or having it end up in areas where it is absolutely not allowed, such as airports. Again, changing these default behaviors should only be done by someone who knows what they are doing, and with careful consideration.
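To illustrate this trigger-to-behavior mapping, here is a small sketch in Python. The trigger and behavior names are made up for the example; real autopilots expose these choices as configuration parameters with their own naming.

```python
# Default reactions per trigger (illustrative names, not from any specific autopilot)
DEFAULT_FAILSAFE = {
    "rc_loss":          "return_to_home",
    "datalink_loss":    "continue_mission",
    "battery_low":      "return_to_home",
    "battery_critical": "land",
    "position_loss":    "blind_descent",   # reduce thrust while keeping a level attitude
    "geofence_breach":  "hold_position",
}


def failsafe_action(trigger, overrides=None):
    """Look up the reaction for a trigger, allowing per-vehicle overrides."""
    table = {**DEFAULT_FAILSAFE, **(overrides or {})}
    return table.get(trigger, "land")  # fall back to a conservative default


# Example: override only the datalink-loss behavior for this particular vehicle
print(failsafe_action("datalink_loss", overrides={"datalink_loss": "return_to_home"}))
```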

Failsafe documentation for other autopilot suites:

Emergencies

Fail-safes are measures that ensure safe flight. However, there will always be a chance that an emergency occurs, which requires immediate action as well. If the vehicle has crashed during any flight phase or has flipped, or if the hardware breaks, such as the motors, arms, or perhaps even the autopilot board itself, what should be done then?

The standard default behavior for this is to completely disarm the vehicle so that it won't react to any input to the motors. Of course, that is difficult to guarantee if the autopilot program itself is malfunctioning, but at least the vehicle won't try to take off and finish its mission while lying on its side. It might also be that a backup system is connected to the ESCs that can take over in case the autopilot is not responding anymore, perhaps using a different communication channel.

Also, the most important safety feature of all is the pilot themselves. Each remote control should have a special button or switch that can put the drone in a different mode, make it land, or disarm it, so that the pilot can act upon what they see. In case the motors are still spinning, have a net or towel available to throw over them, disconnect the battery as soon as possible, and make sure to have sand or a special fire retardant at hand in case the LiPo batteries are pierced.

All of the autopilots have some tips for dealing with such situations, but make sure to do some good research yourself on how to handle spinning parts or potential LiPo battery fires. I'm just compiling tips from the documentation linked above, so please make sure to read up on the details!

Safety in the Crazyflie Firmware

So how about the Crazyflie firmware? We do have some safety features built in here and there, but they are scattered all over the code base. Since the Crazyflie is so safe, there was no immediate need for more, and we felt it was up to the developer to integrate them themselves. But with the Bolt and BQ deck coming out of early access, we want to at least do something. We have already started looking into how other autopilot software handles this, which gives us some ideas, but we did notice that many of those measures are mostly meant for outdoor flight. The Crazyflie and the Crazyflie Bolt have been designed for indoor use and perhaps have to deal with different issues as well.

Current safety features

This is a collection of the safety features in the firmware at the time of writing this blogpost. Most safety checks in the Crazyflie are up to the developer to perform before and during flight, but these are some automatic ones that are scattered around the firmware:

However, if for instance your Crazyflie or Bolt platform loses its positioning in the air, or doesn't have a Flow deck attached before takeoff, there are no default safety systems in place. You either need to catch it, make it land, or use a self-made emergency stop button based on one of the emergency stop services above.
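As an example of such a self-made emergency stop, here is a small sketch using the Crazyflie python library (cflib). It assumes a Crazyflie reachable at the given URI and simply kills the motors when you press Enter, so only use it when a drop is preferable to a fly-away.

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.utils import uri_helper

# Adapt the URI to your own setup
URI = uri_helper.uri_from_env(default='radio://0/80/2M/E7E7E7E7E7')

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    input('Press <Enter> to send an emergency stop...')
    # Stops the motors immediately; the Crazyflie will drop out of the air
    scf.cf.loc.send_emergency_stop()
```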

Safety features in works

As mentioned earlier, we have safety features spread throughout the code base of the Crazyflie firmware. Our current effort is to collect all of these emergency stops and triggers in the supervisor module to have them all in one place.

In addition, since indoor positioning is critical, we want to be notified when it fails. For instance, if the lighthouse geometry is incorrect, we need to see if the position diverges. This check was done outside of the Crazyflie firmware in a cflib script, but it has not been implemented inside the firmware. We also want to provide some options in terms of behavior for these triggers. Currently, we are working on two options: ‘turn the motors off’ or ‘safe land,’ with ‘safe land’ decreasing the thrust while keeping the drone level in attitude.

Furthermore, we want to integrate these features into the cfclient as well. For example, we want to add more emergency safety features to our remote control through the cfclient, and show users how to arm and disarm the vehicle.

These are the elements we are currently working on, but there might be more to come!

Developer meeting May 3rd

You probably already guessed it… the topic of the next developer meeting will be the safety features of the Crazyflie and the Bolt! We will present the current safety features in the Crazyflie and what we are currently working on to make them better. In particular, we really want your feedback on what you think is important for brushless versions of the Crazyflie for indoor flight!

The Dev meeting will be on Wednesday May the 3rd at 3 PM CEST. Please keep an eye on the discussion forum in the developer meeting thread.

We are excited to announce that we will be having developer meetings on the first Wednesday of every month! Additionally, we are thrilled to be present in person at ICRA 2023 in London. During the same conference, there will be a half-day workshop called 'The Role of Robotics Simulators for Unmanned Aerial Vehicles', so make sure to sign up! Please check out our newly updated event page!

Monthly Developer meetings

We have had some online developer meetings in the past, covering various topics. While these meetings may not have been the most popular, we believe it is crucial to maintain communication with the community, have interesting discussions, and exchange ideas. However, we used to plan them ad hoc with no regularity, which sometimes caused some of us **cough** especially me **cough** to get confused about the timing and location. To remove these factors of templexia (dyslexia for time), we will simply have them on the first Wednesday of every month.

So our first one will be on Wednesday the 5th of April at 15:00 CEST, and the information about that particular developer meeting will be, as usual, on discussions. From 15:00 to 15:30 there will be a general discussion, probably with a short presentation, about a topic to be determined. From 15:30 to 16:00 we will address regular support questions from anybody who needs help with their work on the Crazyflie.

ICRA 2023 London

ICRA will be held in London this year, from May 29th to June 2nd, at the ExCeL venue. We will be located at booth H11 in the exhibitor hall, and as the date approaches, we will share more about what awesome prototypes we will showcase and what we will demonstrate on-site. Rest assured, plenty of Crazyflies will be flown in the cage! To get an idea of what we demoed last year at IROS Kyoto, please check out the IROS 2022 event page. Matej from Flapper Drones will join us at our booth to showcase the Flapper drone.

We are thinking of organizing a meetup for participants on the Wednesday after the Conference Dinner, so we will share the details of that in the near future as well. Also keep an eye on our ICRA 2023 event page for updated information.

ICRA Aerial Robotic Simulation Workshop

I (Kimberly) will also be present at the 'The Role of Robotics Simulators for Unmanned Aerial Vehicles' workshop on Friday June 2nd. Together with Giuseppe Silano, Chiara Gabellieri and Wolfgang Hönig, we will be organizing this half-day workshop focused on UAV-specific simulation in robotics. We have invited some great speakers, namely Tomáš Báča, Davide Scaramuzza, Angela Schoellig and Jaeyoung Lim. The topics will range from multi-UAV simulation to realistic vision-based rendering and software-in-the-loop handling for PX4.

Additionally, participants can submit an extended abstract to be invited for a poster presentation during the same workshop. The submission deadline has been extended to April 3rd, so for more information about submission, schedule and speaker info, go to the workshop's website.

Announcement: We will have a townhall meeting this Wednesday (7th of December) about Crazyradio 2.0 and the ideas about the new com-stack at 15:00 (3 pm) CET. Please follow the discussion here for more info.

As you are probably very much aware already if you have been reading the blog occasionally, we went to Japan with the entire company to attend the International Conference on Intelligent Robots and Systems (IROS) in Kyoto. Besides eating great food, singing karaoke, and herding our fully onboard autonomous swarm at our stand, we also had some time to check out what kind of work was done with the Crazyflie in the proceedings papers and talks!

So just some generic statistics first:

  • IROS had 1716 papers accepted
  • We found 14 Crazyflie papers/posters and 2 workshop papers
  • The three biggest topics we found among the papers were: SLAM, multi-robot systems, and navigation & motion planning

At ICRA this year, we noticed that the Crazyflie/Bolt were used to make unconventional platforms, like a monocopter or a Crazyflie transformed into a pogo stick. It was interesting to see that now at IROS, the focus seemed to be more on navigation, localization and even SLAM… also with unconventional sensors!

Navigation and SLAM with the Crazyflie

In the summer I (Kim) worked on a project using ROS2 to try SLAM with the standard packages together with the Flow deck and Multi-ranger. This was also to show at ROScon, together with the Crazyswarm2 project, that the Crazyflie can be used as an actual robotic platform too! I'm glad that some researchers have already figured this out, as there were quite a few papers on SLAM! [6] and [12] made use of the Flow deck & Multi-ranger but developed their own custom SLAM and mapping algorithms, more tailored to the task than the standard SLAM packages out there that are meant for 360-degree lidars.

Very interestingly, there were several papers that used unconventional sensors for this as well. [5] used a gas sensor to do both gas source localization and gas distribution mapping, and [10] made their own echolocation deck with a buzzer + microphones. Let's see what other sensors will be explored in the future!

Safe Robot Learning Competition

A special mention goes to the Safe Robot Learning competition, organized jointly by TU Munich and the University of Toronto's Learning Systems & Robotics Lab (formerly known as the Dynamic Systems Lab). In this competition, teams first participated in an online phase where they had to finish an obstacle course in simulation. For those that were successful, the finals were held with a real Crazyflie at a remote testbed at the University of Toronto, where the algorithms were put to the ultimate test! The simulation was done in the safe-control-gym framework [14], and the communication with the real Crazyflie was done with the ROS1-based Crazyswarm. We sponsored the first three places with a couple of Crazyflie bundles, so congrats to the winners!

List of IROS 2022 Papers featuring the Crazyflie

  1. Using Simulation Optimization to Improve Zero-shot Policy Transfer of Quadrotors Sven Gronauer, Matthias Kissel, Luca Sacchetto, Mathias Korte and Klaus Diepold
  2. Downwash-aware Control Allocation for Over-actuated UAV Platforms Yao Su, Chi Chu, Meng Wang, Jiarui Li, Liu Yang, Yixin Zhu, Hangxin Liu
    • Beijing Institute for General Artificial Intelligence (BIGAI)
    • ArXiv
    • IEEE Xplore
  3. Towards Specialized Hardware for Learning-based Visual Odometry on the Edge Siyuan Chen and Ken Mai
    • IEEE Xplore
  4. Polynomial Time Near-Time-Optimal Multi-Robot Path Planning in Three Dimensions with Applications to Large-Scale UAV Coordination Teng Guo, Siwei Feng and Jingjin Yu
  5. GaSLAM: An Algorithm for Simultaneous Gas Source Localization and Gas Distribution Mapping in 3D Chiara Ercolani, Lixuan Tang and Alcherio Martinoli
    • Ecole Polytechnique Federale de Lausanne (EPFL)
    • IEEE Xplore
  6. Efficient 2D Graph SLAM for Sparse Sensing Hanzhi Zhou, Zichao Hu, Sihang Liu and Samira Khan
  7. Avoiding Dynamic Obstacles with Real-time Motion Planning using Quadratic Programming for Varied Locomotion Modes Jason White, David Jay, Tianze Wang, and Christian Hubicki
  8. Dynamic Compressed Sensing of Unsteady Flows with a Mobile Robot Sachin Shriwastav, Gregory Snyder and Zhuoyuan Song
  9. A Framework for Optimized Topology Design and Leader Selection in Affine Formation Control Fan Xiao, Qingkai Yang, Xinyue Zhao and Hao Fang
  10. Blind as a bat: audible echolocation on small robots Frederike Dumbgen, Adrien Hoffet, Mihailo Kolundzija, Adam Scholefield and Martin Vetterli
    • Ecole Polytechnique Federale de Lausanne (EPFL)
    • IEEE xplore
  11. Safe Reinforcement Learning for Robot Control using Control Lyapunov Barrier Functions Desong Du, Shaohang Han, Naiming Qi and Wei Pan
    • Harbin Institute of Technology + TU Delft + University of Manchester
    • Late breaking result poster
  12. Parsing Indoor Manhattan Scenes Using Four-Point LiDAR on a Micro UAV Eunju Jeong, Suyoung Kang, Daekyeong Lee, and Pyojin Kim
    • Sookmyung Women's University
    • Late breaking result poster
  13. Interactive Multi-Robot Aerial Cinematography Through Hemispherical Manifold Coverage Xiaotian Xu , Guangyao Shi , Pratap Tokekar , and Yancy Diaz-Mercado
    • University of Maryland
    • Note: Only mention of Crazyflie experiments in presentation
  14. Safe-control-gym: a Unified Benchmark Suite for Safe Learning-based Control and Reinforcement Learning in Robotics Zhaocong Yuan, Adam W. Hall, Siqi Zhou, Lukas Brunke, Melissa Greeff, Jacopo Panerati, Angela P. Schoellig
  15. Distributed Geometric and Optimization-based Control of Multiple Quadrotors for Cable-Suspended Payload Transport Khaled Wahba and Wolfgang Hoenig
  16. Customizable-ModQuad: a Versatile Hardware-Software Platform to Develop Lightweight and Low-cost Aerial Vehicles Diego S. D’Antonio, Jiawei Xu, Gustavo A. Cardona, and David Saldaña

Let us know if we are missing any papers or any information about them! Once the IEEE Xplore IROS 2022 proceedings have been published, we will update these entries too and put them on our research page.

As you probably noticed already, this summer I experimented with ROS2 and connecting the Crazyflie with multi-ranger to several mapping and navigation nodes (see this and this blogpost). First I started with an experimental repo on my personal Github account called crazyflie_ros2_experimental, where I managed to do some mapping and navigation already. In August we started porting most of this functionality to the crazyswarm2 project, so that is what this blogpost is mostly about.

Crazyswarm goes ROS2

Most of you are already familiar with Crazyswarm for ROS1, a project that Wolfgang Hönig and James Preiss have maintained since its creation in 2017 at the University of Southern California. Since then, many have used and referred to this work, as the paper has been cited more than 260 times. Of all the Crazyflie papers at the latest ICRA and IROS conferences, 50% used Crazyswarm as their communication middleware. If you haven't heard about Crazyswarm yet, please check out the nice BAMdays talk Wolfgang gave last year.

Unfortunately, ROS1 will not be around forever: it will be phased out in 2025 and will not be supported for Ubuntu 22.04 and up. Therefore, Wolfgang, now at the Intelligent Multi-robot Coordination Lab at TU Berlin, has already started on the ROS2 port of Crazyswarm, namely Crazyswarm2. Here the same principle of the C++-based Crazyflie server and the Python wrapper has been implemented, along with the simple position-based simulation and teleop nodes. Mind that Crazyswarm2 is just the project name, kept for historic reasons; the package can also be used for individual Crazyflies. That is why the packages themselves are called crazyflie_*.

Porting the Summer Hack project to Crazyswarm2

The crazyflie_ros2_experimental repo was fun to hack around in, as it was (as the name suggests) experimental and I didn't need to worry about releases, bug fixes, etc. However, the problem with developing only there is that the further you go, the more work it becomes to make it official. That is when Wolfgang and I sat down and started talking about porting what I did over the summer into Crazyswarm2. This is also a good opportunity to get more involved with the project, especially with so many Crazyfliers using ROS as well.

The first step was to write a second crazyflie_server node that relies on the Python CFlib. This means that many of the variables I used to hardcode in the experimental node needed to be defined within the parameter structure of ROS2. The crazyflies.yaml file is where anything relevant for the server (like the URIs and parameters) needs to be defined. Both the C++ backend server and the CFlib backend server use the same parameters. The functionality of both servers is also pretty similar, except that logging is only possible in the CFlib version and uploading/following trajectories is only possible in the C++ version. An overview will be provided soon on the Crazyswarm2 documentation website.
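To give a rough idea of what such a configuration looks like, here is a minimal crazyflies.yaml sketch. The exact keys may differ between Crazyswarm2 versions, so check the example file shipped with the repository.

```yaml
# Minimal sketch of crazyflies.yaml -- key names may differ between versions,
# so use the example in the Crazyswarm2 repository as the reference.
robots:
  cf1:
    enabled: true
    uri: radio://0/80/2M/E7E7E7E7E7
    initial_position: [0.0, 0.0, 0.0]
    type: cf21        # refers to a robot type defined elsewhere in the same file
```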

The second step was to make the crazyflie_server (cflib) node suitable to be connected to the external packages that I worked with during the hack project. Therefore, there are some special logging modes that enable the server to output not only topics based on logging, but also Pose/Odometry/LaserScan messages along with transforms. This allowed the SLAM toolbox to use the data from the Crazyflie itself to create a map, which you can see an example of in this tutorial.

Moreover, for the navigation it was important that incoming Twist messages, either from the keyboard or from a navigation toolkit, were handled properly. Most of these packages assume a 2D non-holonomic robot, but a quadcopter like the Crazyflie needs to first take off, stay in the air and land. Therefore, in the examples, a separate node (vel_mux.py) was written that receives incoming Twist messages, first has the Crazyflie take off with the high-level commander, and then keeps sending hover commands to keep it in the air until a land service is called.

What’s next?

As you probably noticed, the project is still under development, but at least it is now in a good enough state that we feel comfortable presenting it at the upcoming ROScon :) We also want to include a more official simulation package, especially now that the Crazyflie has recently become part of the official release of Webots 2022b, but we are currently waiting for webots_ros2 to be released in the Ubuntu packages. Moreover, the idea is to provide multiple simulation backends so that, based on the requirements of the topic (swarms, vision-based, etc.), the user can select the simulation most useful for their situation. Also, we would like to even out the missing items (trajectory handling, logging) in the cflib and cpp backends of the crazyflie_server so that they can be used interchangeably. And I saw that the experimental simple mapper node has been featured on social media, so perhaps we should convert that to Crazyswarm2 as well :)

So once we have most of the above-mentioned issues out of the way, we can start discussing the official release of a ROS2 Crazyflie package with its source code residing in the Crazyswarm2 repository. In the meantime, it would be awesome if anybody who is interested in ROS2, or wants to upgrade their Crazyswarm(1) setup to ROS2 soon, gives the package a whirl. The more people try it out and report bugs or propose fixes, the more stable it becomes and the closer it will come to an official release! Please join us and start any discussions on the Crazyswarm2 project Github repository.

Last week we went on a nice trip to Delft, The Netherlands to attend the International Micro Air Vehicle Conference and Competition (IMAV), this time organized by the MAVlab of TU Delft. Barbara, Kristoffer and I (Kim) went there by train in line with our CO2 policy, although the Dutch train strikes did make it a bit difficult for us! Luckily we all made it in one piece and had a great time, so we will tell you about our experiences… with a lot of videos!

First Conference day

For the conference days we were placed in the main aula building, so that everybody could drop by during the coffee breaks, right next to one of our collaborators, Matěj Karásek from Flapper Drones (also see this blog post)! In the big lecture hall, paper talks were going on, along with interesting keynote speeches by Yiannis Aloimonos from the University of Maryland and Antonio Franchi from the University of Twente.

In between the talks and coffee breaks, we took the opportunity to hack around with tiny demos, for which the IMAV competition is quite a good occasion. Here you see a video of 4 Crazyflies flying around a Flapper drone, with all platforms using the lighthouse positioning system.

The Nanoquadcopter challenge

The evening of the first day, the first competition was planned, namely the nanoquadcopter challenge! In this challenge the goal was to autonomously fly a Crazyflie with an AI-deck and Flow deck as far as possible through an obstacle field. 8 teams participated, and although most did offboard processing of the AI-deck's camera stream, the PULP team (first place) and Equipe Skyrats (3rd place) did all the processing onboard. The most exciting run was by the brave CVAR-UPM team, which managed to pass through 4 gates while avoiding obstacles, for which they won a Special Achievement Award.

During the challenge, Barbara also gave a presentation about the Crazyflie while Kristoffer set up the lighthouse positioning system in the background in a record-breaking 5 minutes to show a little demo. After the challenge, there were bites and drinks where we could talk with all the participating teams.

Here is an overview video of the competition. There was also an excellent stream during the event: if you would like to see all the runs in detail plus the presentations by the teams, you have a full 3 hours of content, complemented by exciting commentary from Christophe de Wagter and Guido de Croon of the MAVlab. Thanks to all the teams for participating and giving such a nice show :)

The overview video
The full stream of the nanoquadcopter challenge

The Green House Challenge

On Wednesday, we were brought to Tomato World, a special greenhouse for technology development in horticulture. This is where the Greenhouse challenge, the 2nd indoor drone competition, took place. The teams participated with their drone of choice and had to navigate through rows of tomato plants and find the sick one. Unfortunately we could not be as up close and personal as with the nanoquadcopter challenge, but yet again there was a great stream available, so we were able to follow every step of the way, along with some great presentations by Flapper Drones and PATS drones, among others. By the latter we were actually challenged to an autonomous drone fight! Their PATS-X system is made to detect pest insects that are harmful to greenhouse crops, so they wanted to see if it could catch a Crazyflie. You can see in the video here that they managed to do just that, and although the Crazyflie survived, we are pretty sure that a real fly or moth wouldn't have. Luckily it was a friendly match, so we all had fun!

PATS versus Crazyflie battle

Here is an overview of the Greenhouse challenge. At the end you can also see a special demo by the PULP team, successfully trying out their obstacle-avoiding Crazyflie in between the tomato plants. Very impressive!

Last days and final notes

Due to the planned (but later cancelled) train strikes in the Netherlands, the full pack was unfortunately not able to attend the whole event. In the end, Barbara and I were able to experience the outdoor challenge, where much bigger drones had to carry packages into a large field outdoors. I myself was able to catch the first part of the last conference day, which included a keynote by Richard Bomphrey (Royal Veterinary College), whose lab contributed to the mosquito-inspired Crazyflie flight paper published in Science two years ago.

We were happy to be at IMAV this year, which marks our first conference attendance as Bitcraze since the pandemic. It was quite amazing to see the teams trying to overcome the challenges of these competitions, especially the nanoquadcopter challenge. We would like to thank Guido de Croon and Christophe de Wagter of the MAVlab again for inviting us!

IMAV website: https://2022.imavs.org/

Crazyflie IMAV papers:

  • ‘Handling Pitch Variations for Visual Perception in MAVs: Synthetic Augmentation and State Fusion’ Cereda et al. (2022) [pdf]
  • ‘Seeing with sound; surface detection and avoidance by sensing self-generated noise’, Wilshin et al. (2022)

There are some nice and exciting improvements in the CF-client that we worked on during the summer months! First of all, we worked on a toolbox structure, where every tab can also be reconfigured as a toolbox, allowing it to be docked to the sides of the window. Secondly, we have added a new geometry estimation wizard for Lighthouse systems to support multi-base-station estimation. Finally, we have added a new tab for PID controller tuning, mainly intended for the Bolt.

New tabs, toolboxes and wizards for the CFclient

Toolboxes in the CFclient

Everyone who has used the CFclient has experienced the tabs before. Anytime you want to configure the lighthouse system, set up plotting or look at the parameter states, you switch to the appropriate tab to perform your desired action. This is all fine, but sometimes it can be useful to see the contents of two tabs at the same time, for example to watch the graph of a log variable while you change a parameter. This is what the combined tab/toolbox feature adds! Any tab can now be converted into a toolbox that can be docked to the side of the window.

Plotter tab with parameter toolbox

In the example above the plotter is displaying the estimated position of a Crazyflie with a Flow deck, while the parameter window is opened as a toolbox. The "motion.disable" parameter was just set to true, and we can see that the Kalman estimator gets into trouble when it no longer gets data from the Flow deck.

To switch from tab to toolbox mode, go to the View/Toolboxes menu and select the window that you want to show as a toolbox. In a similar way, use the View/Tabs menu to turn it back to a tab.

Even though all tabs can be turned into toolboxes, some of them might still look better as tabs due to their design. We hope to be able to improve the design over time and make them more toolbox friendly, contributions are welcome!

Lighthouse Geometry Estimation Wizard

In a blogpost from almost half a year ago, we presented a new multi-base-station geometry estimation method that enables the user to include more than 2 base stations when flying a Crazyflie. This heavily increases the flight area covered by the base station V2s, as technically it should be able to handle up to 16!

New geometry estimation dialog

However, up until this summer it has been in experimental mode, as we weren't so sure how stable this new estimation method is, so the only way to use it was via a script in the Crazyflie python library directly, and not from the CFclient. Since we haven't heard of anybody having problems with this new experimental feature, we decided to go ahead and make a nice multi-base-station geometry estimation wizard in the CFclient's Lighthouse tab.

This wizard can be accessed if you go to the lighthouse tab -> 'manage geometry' and press 'Estimate Geometry'. We had to make it a wizard, as this new method requires some extra intermediate steps compared to the previous one, to ensure proper scaling, ground plane setting and sweep angle recording. If you are only using 2 base stations this might seem like extra effort compared to before, when you only had to put the Crazyflie on the ground and push a button, but if you compare the flight performance of the two methods, you will see an immediate difference in positioning quality, especially around the edges of your flight area. So it is definitely worth it!

First page of the wizard

We will still provide the "simple" option for those who want to use it, or who want to estimate the geometry of only a single base station, as the new estimator does not support that (see this issue). In that case, you will have to install the headless version of OpenCV separately with 'pip3 install opencv-python-headless'. We will remove this requirement from the cflib itself for the next release, as there are conflicts for users who have installed the non-headless OpenCV on their system, for instance for the OpenCV viewer of the AI-deck's wifi streamer.

PID tuning tab

The PID controller tuning tab

And last but not least, we introduced a PID tuning tab in a PR on the CFClient! And of course… it is also available as a toolbox :) This is maybe not super necessary for the Crazyflie itself, but for anyone working on a custom frame with the Bolt or BigQuad deck it is quite useful. Tuning is much handier with a slider than adjusting each parameter numerically in the parameter tab. Also, if you are just curious about what would happen if you increase the proportional gain of the z-position controller of the Crazyflie, this is fun to try as well… but of course at your own risk!

If you are happy with your tuned PID values, there is the "Persist Values" button, which will store the parameters in the EEPROM memory of the Crazyflie/Bolt, meaning that these values will persist even after restarting the platform. They can be cleared with the 'Clear persisted values' button, and you can retrieve the original firmware-hardcoded default values with the 'Default Values' button. Please check out this blogpost to learn more about persistent parameters.
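For those who prefer scripting this instead of clicking buttons, below is a sketch of what the same thing could look like with cflib. It assumes the persistent-parameter API described in the blogpost linked above and a parameter that actually supports persistence; the parameter name and value are just examples.

```python
from threading import Event

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.utils import uri_helper

URI = uri_helper.uri_from_env(default='radio://0/80/2M/E7E7E7E7E7')
stored = Event()


def on_stored(complete_name, success):
    # Called when the Crazyflie confirms the EEPROM write
    print(f'{complete_name} persisted: {success}')
    stored.set()


cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    scf.cf.param.set_value('pid_rate.roll_kp', 260.0)             # try out a new gain
    scf.cf.param.persistent_store('pid_rate.roll_kp', on_stored)  # keep it after reboot
    stored.wait(timeout=5)
```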

Try it out for yourself!

This client has not been released yet but you can already go ahead and try these new features out for yourself. Make sure to first install the client from source, and then install the CFlib from source, as an update of both is necessary. Also update the crazyflie-firmware to the latest development branch via these instructions, especially if you want to try out the new LH geometry wizard.

And of course, don’t forget to give us feedback on discussions.bitcraze.io or to make an issue on the cfclient, cflib or crazyflie-firmware github repositories if you are hitting a bug on your machine and you know pretty precisely where it comes from.

Now it is time to give a little update about the ongoing ROS2-related projects. About a month ago we gave you a heads-up about the summer ROS2 project I was working on, and even though the end goal hasn't been reached yet, enough has happened in the meantime to write a blogpost about it!

Crazyflie Navigation

Last time I showed mostly mapping of a single room, so currently I'm trying to map a bigger portion of the office. This was more difficult than initially anticipated: it worked quite well in simulation, but in real life the Multi-ranger deck saw obstacles that weren't there. Later we found out that this was due to a year-old issue with the Multi-ranger driver's inability to handle out-of-range measurements properly (see this ongoing PR). With that fixed, larger-scale mapping starts to become possible, which you can see here with the simple mapper node:

If you watch the video until the end, you can notice that the map starts to diverge a bit, since the position + orientation is solely based on the Flow deck and gyroscopes, which is a big reason to get the SLAM toolbox to work with the Multi-ranger. However, it is difficult to combine it with such a sparse 'lidar', so while that still requires some tuning, I've taken this opportunity to see how far I get with the non-SLAM mapping and the NAV2 package!

As you can see in the video, the Crazyflie made it to the second hallway. Afterwards it was commanded to fly back based on a NAV2 waypoint in RViz2. In the beginning it seemed to do quite well, but around the door of the last room, the Crazyflie got into a bit of trouble. The doorway entrance is already as small as it is, and that is also about when the mapping started to diverge: the new map covered the old map, blocking the original pathway back into the room. But still, it came pretty close!

The diverging of the map is currently the blocker for larger office navigation, so it would be nice to get some better localization to work so that the map is not constantly changed due to the divergence of position estimates, but I’m pretty hopeful I’ll be able to figure that out in the next few weeks.

Crazyflie ROS2 node with CrazySwarm2

Based on the poll we set out in the last blogpost, it seemed that many of you were mostly positive about work towards a ROS2 node for the Crazyflie! As some of you know, the Crazyswarm project, which many of you already use for your research, is currently being ported to ROS2 through the efforts of Wolfgang Hönig's IMRCLab in the Crazyswarm2 project. Instead of creating separate ROS2 nodes in parallel and just adding to the confusion in the community, we have decided together with Wolfgang to place all of the ROS2-related development in Crazyswarm2. The name of the project will stay the same for historical reasons, but since this is meant to become the standard Crazyflie ROS2 package, the names of the individual nodes will become more generic upon official release in the future.

To this end, we've pushed a cflib Python version of the Crazyflie ROS2 node called crazyflie_server_py, based a bit on my hackish efforts in the crazyflie_ros2_experimental version, so that users will have a choice of which communication backend to use for the Crazyflie. For now the node simply creates services for each individual Crazyflie and for the entire swarm for take_off, land and go_to commands. Next up are logging and parameter handling, positioning support and a broadcasting implementation for the CFlib, so please keep an eye on this ticket to follow the progress.
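As a quick illustration of what using one of those services could look like from another ROS2 node, here is a sketch. The namespace ('/cf1'), the service name and the crazyflie_interfaces definitions are assumptions on my side, so check the Crazyswarm2 repository for the actual interface.

```python
import rclpy
from rclpy.node import Node
from builtin_interfaces.msg import Duration

# Assumed interface package/type -- verify against the Crazyswarm2 repository
from crazyflie_interfaces.srv import Takeoff

rclpy.init()
node = Node('takeoff_client_example')
client = node.create_client(Takeoff, '/cf1/takeoff')  # '/cf1' is a placeholder namespace
client.wait_for_service()

request = Takeoff.Request()
request.height = 0.5                # target height in meters
request.duration = Duration(sec=2)  # time to reach it
future = client.call_async(request)
rclpy.spin_until_future_complete(node, future)
node.get_logger().info(f'Takeoff result: {future.result()}')
rclpy.shutdown()
```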

So hopefully, once the summer project has been completed, I can start porting the navigation capabilities into the Crazyswarm2 repository, with a nice tutorial :)

ROScon talk

As mentioned in a previous blogpost, we’ll actually be talking about the Crazyflie ROS2 efforts at ROScon 2022 in Kyoto in collaboration with Wolfgang. You can find the talk here in the ROScon program, so hopefully I’ll see you at the talk or the week after at IROS!

In my first years at Bitcraze I focused mostly on embedded development and algorithmic design, like the app layer, controllers and estimators and such, but recently I have become quite interested in the robotic integration between the Crazyflie and other (open-source) projects and users. This means that I'll be dwelling more often in the space between Bitcraze and the community, which is something I noticed I really enjoy during the Grand Tour. It also initiated my work with simulators, which I think will be very useful for the community too. The fun summer project I've been working on now is to integrate the Crazyflie with ROS2 and hook it up to standard navigation packages, which is the topic of this blogpost!

ROS2 Crazyflie Node

So first I worked on the ROS2 node that actually communicates with the Crazyflie directly. I think many of you are familiar with USC's CrazySwarm project, of which the ROS2 variant, CrazySwarm2, is already available for most functionalities. Even though the name says CrazySwarm, it can very easily be used for only one Crazyflie too. CrazySwarm2 is currently under further development by the IMRClab of TU Berlin, but please take a look if you want to give it a go!

For now, while Crazyswarm2 is still under development, I used the Bitcraze Crazyflie python library to make a more hackish node that just publishes exactly the information I want. I am focusing on the scenario with the STEM ranging bundle, aka the Crazyflie + Flow deck (optical flow + distance sensor) + Multi-ranger (5 x distance sensors) combo, where the node logs the Multi-ranger data and the odometry from the Flow deck over the Crazyradio and outputs them on the necessary /scan and /odom topics. Moreover, it also outputs several tf2 transforms that make it possible to visualize everything in RViz and/or connect it to any other package, and it reacts to incoming Twist messages as well.
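To give an idea of what the /scan part of that node boils down to, here is a simplified sketch of packing the four horizontal Multi-ranger readings into a sensor_msgs/LaserScan message. The frame name, beam ordering and maximum range are my own assumptions for this example, not necessarily what the node ended up using.

```python
import math

from sensor_msgs.msg import LaserScan


def ranges_to_scan(front, left, back, right, stamp, max_range=3.5):
    """Build a 4-beam LaserScan from Multi-ranger readings given in meters.
    Out-of-range readings (None) become +inf so downstream nodes ignore them."""
    scan = LaserScan()
    scan.header.stamp = stamp
    scan.header.frame_id = 'base_link'  # assumed frame name
    # Four beams, 90 degrees apart: back, right, front, left (x forward, angles CCW)
    scan.angle_min = -math.pi
    scan.angle_max = math.pi / 2.0
    scan.angle_increment = math.pi / 2.0
    scan.range_min = 0.01
    scan.range_max = max_range
    scan.ranges = [r if r is not None else float('inf')
                   for r in (back, right, front, left)]
    return scan
```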

Development with a Simulator

And of course… I went in head first and connected it directly to the SLAM toolbox. I have worked with ROS1 in the past, but I had my first experience with that package in the course Build Mobile Robots with ROS2 (by the Weekly Robotics Newsletter's Mat Sadowski), so I couldn't wait to try it on a real platform like the Crazyflie. However, tuning this was of course more work than I thought, as the first map I got out of it was mostly a sparse collection of dots. Of course, the SLAM toolbox is meant for lidars and not for something that provides sparse range distances like the Multi-ranger. So I decided to take one or two steps back and first connect a simulator to make tuning a bit easier.

Luckily, I had already started looking at simulators and was quite far along with the Webots integration of the Crazyflie. Actually… Webots' next release (2022b) will contain a Crazyflie as standard! Once it is out, I'll write a separate blogpost about that :). As luck would have it, Webots has good ROS2 integration as well, and even won the 'Best ROS Software' award in The Construct's ROS Awards! Another reason is that I wanted to try out a different simulator for ROS2 this time, to complement what I learned in the ROS2 course I mentioned earlier.

So I used the Webots driver node to write a simulated Crazyflie that outputs the same information as the real Crazyflie node, so that I can easily hack around and try out different things without constantly disturbing my cats from their slumber :). Anyway, I won't go into the simulator too much and will save that for another blogpost!

Simple Mapping

I decided to take one additional step before going full SLAM, which is to make a simple mapper node first! This takes the Crazyflie's state estimate and the Multi-ranger's range values and creates an occupancy-grid-type map from them. I do have to give kudos to Marcus' cflib point-cloud script and Webots' simple mapper example, as I did look at them for reference. But even with those examples, integration and connecting the dots is quite some work. Luckily I had the simulator to try things out with!
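The core of such a simple mapper is not much more than tracing each range measurement from the estimated pose into a grid. Below is a stripped-down sketch of that idea; the grid size, resolution and the lack of any probabilistic update are deliberate simplifications on my side and not how the actual node is implemented.

```python
import math

import numpy as np

RESOLUTION = 0.1                               # meters per cell
GRID = np.full((200, 200), -1, dtype=np.int8)  # -1 unknown, 0 free, 100 occupied
ORIGIN = (-10.0, -10.0)                        # world coordinates of cell (0, 0)


def to_cell(wx, wy):
    """Convert world coordinates to grid indices."""
    return int((wx - ORIGIN[0]) / RESOLUTION), int((wy - ORIGIN[1]) / RESOLUTION)


def update_map(x, y, yaw, sensor_angle, range_m, max_range=3.5):
    """Mark cells along one range beam as free and the end cell as occupied."""
    angle = yaw + sensor_angle
    hit = range_m is not None and range_m < max_range
    length = range_m if hit else max_range
    # Free space along the beam
    for i in range(int(length / RESOLUTION)):
        cx, cy = to_cell(x + math.cos(angle) * i * RESOLUTION,
                         y + math.sin(angle) * i * RESOLUTION)
        if 0 <= cx < GRID.shape[0] and 0 <= cy < GRID.shape[1]:
            GRID[cx, cy] = 0
    # Obstacle at the end of the beam, only if the sensor actually saw something
    if hit:
        cx, cy = to_cell(x + math.cos(angle) * range_m,
                         y + math.sin(angle) * range_m)
        if 0 <= cx < GRID.shape[0] and 0 <= cy < GRID.shape[1]:
            GRID[cx, cy] = 100
```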

So first I put the Crazyflie in an apartment world in the simulator, flew around, and checked whether any decent map came out of it, and it seemed it did! Of course, the simulated Crazyflie's 'odometry' comes from a near-perfect position estimate, so I didn't expect any problems there (and in such a situation you would not really need the localization part of SLAM). Some improvements are still needed, for example range measurements that don't see anything are currently excluded from drawing, but it was still pretty cool to map the virtual environment.

Then it was off to try it out on a real Crazyflie. In one of our meeting rooms, I had one Crazyflie take off, let it turn around with a Twist message on the /cmd_vel topic, and made a map of the room I was in. The effect of the 4 range sensors rotating around and creating a map in one go makes me think of those retro video transitions. And the odometry drift does not seem bad enough to make this impossible, but I haven't mapped our entire office yet, so that might turn out differently!

What’s next?

I'm definitely not stopping here: I want to extend this functionality further and get it to work properly with the SLAM toolbox! If the simple mapper can already produce this quality, I'm pretty sure that can be done one way or another. What I could also do is first generate a simple map and already have a go at the NAV2 package with that one… there are many roads to Rome here!

Currently I'm doing my work on my personal Github account in the crazyflie_ros2_experimental repository. Everything is still very much in development, hackish and quite specific to one use case, but that is expected to change once things are working better, so please check the planning in the project's readme. In the meantime, you can indicate in this vote whether this is an interesting direction for us to go in. Not that it will stop me from continuing this project, since it is too much fun, but it is always good to know if certain efforts are appreciated!

Earlier this month, ICRA 2022 was held in Philadelphia, and in person this time! Unfortunately we were unable to attend ourselves, but we were quite happy that there were still virtual attendance options available. So I followed quite a few presentations and read through papers, trying to find out the latest in aerial and swarm robotics and whether anybody had been able to put the Crazyflie to good use for their research. I even had the opportunity to visit the exhibition floor with a telepresence robot, which was a lot of fun!

We covered IROS 2021 at the end of last year, and we have even started to post Crazyflie-related publications on social media to keep ourselves and the community up to date with Crazyflie research. So here we will list the ICRA 2022 papers we have found and write down some observations.

Crazy Platforms

What I really noticed this year is that the Crazyflie has been used in more unconventional configurations and new platforms! IROS 2021 already amazed us with a solar-powered Crazyflie and a quadcopter combining 4 Crazyflies (which UCLA continued at this conference in (2)). But we hadn't yet seen that a Crazyflie can jump! The PogoDrone by the Swarmslab of Lehigh University turned the Crazyflie into an autonomous jumping pogo stick (5)! Moreover, wheels were added by the Institute for Systems and Robotics (TU Lisbon) to increase flight endurance/autonomy (7).

We also noticed 3 ICRA 2022 papers with Bolt-powered platforms, which is a huge increase compared to IROS 2021, which had only 1 Bolt entry. The MAVlab of TU Delft compared the Crazyflie against a Bolt-powered Flapper drone for flying in wind (see the presentation by Flapper Drones in our last MiniBam). Moreover, remember the Science Robotics paper using a Crazyflie board for a dual-wing rotating platform? The Engineering Product Development pillar of SUTD took a similar design to the next level, building a single controllable rotating wing with a Bolt platform (3). Two of these can even work together cooperatively and fly stably, so it is no wonder that they won the ICRA 2022 Outstanding Dynamics and Control Paper Award.

List of ICRA 2022 Papers featuring the Crazyflie and Bolt

Here is a list of all the Crazyflie/Bolt papers featured at ICRA 2022, but let us know if we are missing any (⚡: Bolt, 🐝: Crazyflie). Mind that only the Robotics and Automation Letters entries have been officially published on IEEE Xplore so far, so for the proceedings papers I tried to share the ArXiv version if available.

  1. ⚡ ‘Passive Wall Tracking for a Rotorcraft with Tilted and Ducted Propellers using Proximity Effects’ Ding et al. from City University of Hong Kong & Massachusetts Institute of Technology
  2. 🐝 ‘A Fast and Efficient Attitude Control Algorithm of a Tilt-Rotor Aerial Platform Using Inputs Redundancies’ Su et al. from UCLA
  3. ⚡x2 ‘Cooperative Modular Single Actuator Monocopters Capable of Controlled Passive Separation’, Cai et al. from Singapore University of Technology & Design
  4. 🐝’Optimal Inverted Landing in a Small Aerial Robot with Varied Approach Velocities and Landing Gear Designs’ Habas et al. from Penn State
  5. 🐝 ‘PogoDrone: Design, Model, and Control of a Jumping Quadrotor’, Zhu et al from Lehigh U.
  6. 🐝 ‘Clustering and Informative Path Planning for 3D Gas Distribution Mapping: Algorithms and Performance Evaluation’, Ercolani et al from EPFL
  7. 🐝 ‘A Bimodal Rolling-Flying Robot for Micro Level Inspection of Flat and Inclined Surfaces’ , Pimentel et al from Instituto Superior Tecnico
  8. 🐝x 2 ‘Collision Avoidance for Multiple Quadrotors Using Elastic Safety Clearance Based Model Predictive Control’, Jin et al. from USTC & Sina
  9. 🐝 + ⚡🦋 ‘An Experimental Study of Wind Resistance and Power Consumption in MAVs with a Low-Speed Multi-Fan Wind System’, Olejnik et al. from TU Delft
  10. 🐝x 6 ‘Formation-containment tracking and scaling for multiple quadcopters with an application to choke-point navigation’, Su et al. from The University of Manchester.

Updated

11. 🐝x 6 ‘Nearest-Neighbor-Based Collision Avoidance for Quadrotors Via Reinforcement Learning’, Ourari et al. from TU Darmstadt
ArXiv

12. 🐝x 6 ‘Safe multi-agent motion planning via filtered reinforcement learning’ Vinod et al. from Mitsubishi Electric Research Laboratories
IEEEXplore page

13. 🐝 'Event-Triggered Tracking Control Scheme for Quadrotors with External Disturbances: Theory and Validations', Gao et al. from University of Shanghai for Science and Technology
Outstanding Coordination / Mechanisms & Design / Locomotion / Navigation Award Finalists
IEEEXPlore page

14. 🐝 ‘Watch and Learn: Learning to control feedback linearizable systems from expert demonstrations’, Sultangazin et al. from University of California
IEEEXplore page
15. ‘KoopNet: Joint Learning of Koopman Bilinear Models and Function Dictionaries with Application to Quadrotor Trajectory Tracking’, Folkestad et al. from Caltech
IEEEXplore page

Other Announcements: Bolt 1.1 and Dev meeting

Bolt 1.1

The Bolt is now back in stock, with two small updates that make it the Bolt 1.1. Here are the changes:

  1. The board thickness has been reduced from 1.6mm to 1.0mm to save some weight, roughly 2 grams. This is handy for the slimmest and most lightweight designs.
  2. Motor signal output M4 has been moved from PB9 to PB10 to be able to support the DSHOT motor signal protocol in the future.

Other than that, it is fully backwards compatible, but make sure to use a recent enough firmware (2022.03) that has the Bolt 1.1 device support added.

Time and Date for Dev Meeting

In this blogpost we noted that we wanted to organize our first developer meeting before the summer break. From this poll we saw that most of you who want to attend are currently located in Asia and Australia, so this time we want to organize the meeting at:

13:00 CEST (Sweden time) on Wednesday the 22nd of June.

The topic will be our new support platform and support handling in general, so I'm hoping for some fruitful discussions about that. Keep an eye on this discussion thread for details on joining.