ROS2

Since around March/April this year we have been running both Bitcraze developer meetings and Aerial-ROS meetings (the latter in collaboration with the Dronecode Foundation). Now that summer is here and our office is a bit empty, we took a short summer break, but we will start the meetings back up again soon! The next ROS-aerial meeting will be on the 16th of August, and a Bitcraze developer meeting is planned for Wednesday the 6th of September (keep an eye on our announcements in discussions). In this blogpost we'd like to take the opportunity to give an overview of the meetings we have had so far.

Aerial ROS meetings

In March we started a ROS community working group for aerial vehicles together with our friends at the Dronecode Foundation, aka Aerial-ROS! We have biweekly meetings, alternating between discussion meetings (each with a set topic) and invited guest presentations.

Here are the discussion topic meetings we had:

And we had several guest speakers as well, like Miguel Fernandez-Cortizas from CAR-UPM talking about Aerostack2:

ROS-aerial Meeting guest presentation about Aerostack2

Then we had a guest presentation from Gerald Peklar from NXP talking about the Drones4Bats project:

ROS-aerial Meeting guest presentation about Drones4Bats

And the last one before the summer was from Alejandro Hernández Cordero (Open Robotics consultant) about the ROS2 project Vehicle Gateway.

ROS-aerial Meeting guest presentation about Vehicle Gateway

The next meeting for ROS-aerial is planned for the 16th of August. Keep an eye on the ROS Discourse forum.

Bitcraze Developer Meetings

We had already held a couple of developer meetings before, but we started recording them in April. The first recorded one was about the Loco positioning system: we first gave a presentation about the system itself and the latest developments cooking in our pot, with time for questions afterwards.

Dev meeting about Loco positioning.

Then we had a meeting about the development of safety features in the Crazyflie in light of the Bolt developments:

Bitcraze Dev meeting about Safety features.

Then we had a meeting where Kristoffer highlighted the autonomous swarm demo we showed at ICRA 2023.

Bitcraze Dev meeting about the autonomous swarm demo

And for the last one before the summer holiday, we had a meeting where Kimberly explained the Crazyflie simulation model integrated into Webots.

Bitcraze Dev meeting about Simulation

We are still planning to have a developer meeting every first Wednesday of the month, starting with September 6th (keep an eye on our announcements in discussions).

EPFL Crazyflie 101 presentation

Oh yeah, by the way, we were also invited by the EPFL-LIS lab to give another Crazyflie 101 presentation in Lausanne last April! We made a prerecording of it so you can check it out right here:

EPFL-LIS Crazyflie 101 presentation.

See you all after summer!

This week's guest blogpost is from Frederike Dümbgen, presenting her latest work from her PhD project at the Laboratory of Audiovisual Communications (LCAV), EPFL. She is currently a Postdoc at the University of Toronto. Enjoy!

Bats navigate using sound. As a matter of fact, the ears of a bat are so much better developed than their eyes that bats cope better with being blindfolded than with having their ears covered. It was precisely this experiment that led to the discovery of echolocation, which is the principle bats use to navigate [1]. Broadly speaking, in echolocation, bats emit ultrasonic chirps and listen for their echoes to perceive their surroundings. Since its discovery in the 18th century, astonishing facts about this navigation system have been revealed. For instance, bats vary their chirps depending on the task at hand: a chirp that's good for locating prey might not be good for detecting obstacles, and vice versa [2]. Depending on the characteristics of the reflected echoes, bats can even classify certain objects; this ability helps them find, for instance, water sources [3]. Wouldn't it be amazing to harness these findings in building novel navigation systems for autonomous agents such as drones or cars?

Figure 1: Meet “Crazybat”: the Crazyflie equipped with our custom audio deck including 4 microphones, a buzzer, and a microcontroller. Together, they can be used for bat-like echolocation. The design files and firmware of the audio extension deck are openly available, as is a ROS2-based software stack for audio-based navigation. We hope that fellow researchers can use this as a starting point for further pushing the limits of audio-based navigation in robotics. More details can be found in [4].

The quest for the answer to this question led us — a group of researchers from the École Polytechnique Fédérale de Lausanne (EPFL) — to design the first audio extension deck for the Crazyflie drone, effectively turning it into a “Crazybat” (Figure 1). The Crazybat has four microphones, a simple piezo buzzer, and an additional microprocessor used to extract relevant information from audio data, to be sent to the main processor. All of these additional capabilities are provided by the audio extension deck, for which both the firmware and hardware design files are openly available.1

Video 1: Proof of concept of distance/angle estimation in a semi-static setup. The drone is moved using a stepper motor. More details can be found in [4].

In our paper on the system [4], we show how to use chirps to detect nearby obstacles such as glass walls. Difficult to detect using lasers or cameras, glass walls are excellent sound reflectors and thus a good candidate for audio-based navigation. We show in a first semi-static feasibility study that we can locate the glass wall with centimeter accuracy, even in the presence of loud propeller noise (Video 1). When moving to a flying drone and different kinds of reflectors, the problem becomes significantly more challenging: motion jitter, varying propeller noise and tight real-time constraints make the problem much harder to solve. Nevertheless, first experiments suggest that sound-based wall detection and avoidance is possible (Figure 2 and Video 2).

Video 2: The “Crazybat” drone actively avoiding obstacles based on sound.
Figure 2: Qualitative results of sound-based wall localization on the flying “Crazybat” drone. More details can be found in [4].

The principle we use to make this work is sound-based interference. The sound will “bounce off” the wall, and the reflected and direct sound will interfere either constructively or destructively, depending on the frequency and distance to the wall. Using this same principle for the four microphones, both the angle and the distance of the closest wall can be estimated. This is however not the only way to navigate using sound; in fact, our software stack, available as an open-source package for ROS2, also allows the Crazybat to extract the phase differences of incoming sound at the four microphones, which can be used to determine the location of an external sound source. We believe that a truly intelligent Crazybat would be able to switch between different operating modes depending on the conditions, just like bats that change their chirps depending on the task at hand.
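To make the interference principle more concrete, here is a minimal numerical sketch (an idealized model, not the actual code from [4]): the reflected chirp travels roughly two wall-distances further than the direct sound, and sweeping the chirp frequency makes the combined magnitude oscillate with a period that directly encodes the distance.

```python
import numpy as np

C = 343.0  # speed of sound in air [m/s]

def interference_magnitude(freq_hz, wall_distance_m, attenuation=0.5):
    """Idealized magnitude at one microphone when the direct chirp and
    its wall reflection overlap. The reflection travels roughly 2*d
    further, so the two signals interfere constructively or
    destructively depending on frequency."""
    phase = 2.0 * np.pi * freq_hz * (2.0 * wall_distance_m) / C
    return np.abs(1.0 + attenuation * np.exp(1j * phase))

# Sweeping a chirp from 1 to 5 kHz with a wall at 0.3 m: the magnitude
# oscillates with a period of C / (2 * d), here roughly 572 Hz, so the
# spacing of the peaks reveals the wall distance.
freqs = np.linspace(1000, 5000, 500)
pattern = interference_magnitude(freqs, wall_distance_m=0.3)
```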

Note that the ROS2 software stack is not limited to the Crazybat only — we have isolated the hardware-dependent components so that the audio-based navigation algorithms can be ported to any platform. As an example, we include results on the small wheeled e-puck2 robot in [4], which shows better performance than the Crazybat thanks to the absence of propeller noise and motion jitter.

This research project has taught us many things, above all an even greater admiration for the abilities of bats! Dealing with sound is pretty hard and very different from other prevalent sensing modalities such as cameras or lasers. Nevertheless, we believe it is an interesting alternative for scenarios with poor eyesight, limited computing power or memory. We hope that other researchers will join us in the quest of exploiting audio for navigation, and we hope that the tools that we make publicly available — both the hardware and software stack — lower the entry barrier for new researchers. 

1 The audio extension deck works in a "plug-and-play" fashion like all other extension decks of the Crazyflie. It has been tested in combination with the flow deck for stable flight in the absence of a more advanced localization system. The deck performs frequency analysis on the incoming raw audio data from the 4 microphones and sends the relevant information to the Crazyflie, where a custom driver packs it into the CRTP protocol and sends it to the base station for further processing in the ROS2 stack.

References

[1] Galambos, Robert. “The Avoidance of Obstacles by Flying Bats: Spallanzani’s Ideas (1794) and Later Theories.” Isis 34, no. 2 (1942): 132–40. https://doi.org/10.1086/347764.

[2] Fenton, M. Brock, Alan D. Grinnell, Arthur N. Popper, and Richard R. Fay, eds. “Bat Bioacoustics.” Springer Handbook of Auditory Research, 2016. https://doi.org/10.1007/978-1-4939-3527-7.

[3] Greif, Stefan, and Björn M Siemers. “Innate Recognition of Water Bodies in Echolocating Bats.” Nature Communications 1, no. 106 (2010): 1–6. https://doi.org/10.1038/ncomms1110.

[4] F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield and M. Vetterli, “Blind as a Bat: Audible Echolocation on Small Robots,” in IEEE Robotics and Automation Letters (Early Access), 2022. https://doi.org/10.1109/LRA.2022.3194669.

As you probably noticed already, this summer I experimented with ROS2, connecting the Crazyflie with the multi-ranger deck to several mapping and navigation nodes (see this and this blogpost). I first started with an experimental repo on my personal GitHub account called crazyflie_ros2_experimental, where I already managed to get some mapping and navigation working. In August we started porting most of this functionality to the Crazyswarm2 project, and that is what this blogpost is mostly about.

Crazyswarm goes ROS2

Most of you are already familiar with Crazyswarm for ROS1, a project that Wolfgang Hönig and James Preiss have maintained since its creation in 2017 at the University of Southern California. Since then, many have used and referred to this work, and the paper has been cited more than 260 times. Of all the Crazyflie papers at the latest ICRA and IROS conferences, 50% used Crazyswarm as their communication middleware. If you haven't heard about Crazyswarm yet, please check out the nice BAMdays talk Wolfgang gave last year.

Unfortunately, ROS1 will not be around forever: it will be phased out in 2025 and is not supported on Ubuntu 22.04 and up. Therefore, Wolfgang, now at the Intelligent Multi-robot Coordination Lab at TU Berlin, has already started the ROS2 port of Crazyswarm, namely Crazyswarm2. Here the same structure of a C++-based Crazyflie server and a Python wrapper has been implemented, along with the simple position-based simulation and teleop nodes. Mind that the name Crazyswarm2 is kept for historic reasons, but the package can just as well be used for individual Crazyflies. That is why the packages themselves are called crazyflie_*.

Porting the Summer Hack project to Crazyswarm2

The crazyflie_ros2_experimental repository was fun to hack around in, as it was (as the name suggests) experimental and I didn't need to worry about releases, bugfixes, etc. However, the problem with developing only there is that the further you go, the more work it becomes to make it official. That is when Wolfgang and I sat down and started talking about porting what I had done over the summer into Crazyswarm2. This was also a good opportunity for me to get more involved with the project, especially with so many Crazyfliers using ROS as well.

The first step was to write a second crazyflie_server node that relies on the Python CFlib. This means that many of the variables I used to hardcode in the experimental node needed to be defined within the parameter structure of ROS2. The crazyflies.yaml file is where anything relevant for the server (like the URIs and parameters) needs to be defined. Both the C++ backend server and the CFlib backend server use the same parameters. The functionality of the two servers is also pretty similar, except that logging is only possible in the CFlib version and uploading/following trajectories is only possible in the C++ version. An overview will be provided soon on the Crazyswarm2 documentation website.
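To give an idea of how this looks in practice, here is a hypothetical launch sketch (the package, executable and file names are illustrative, not the exact Crazyswarm2 launch file): both backends read the same crazyflies.yaml, so switching between them comes down to which server executable you start.

```python
import os
from launch import LaunchDescription
from launch_ros.actions import Node
from ament_index_python.packages import get_package_share_directory

def generate_launch_description():
    # Both servers are configured from the same crazyflies.yaml.
    config = os.path.join(
        get_package_share_directory('crazyflie'), 'config', 'crazyflies.yaml')
    backend = os.environ.get('CF_BACKEND', 'cflib')  # 'cflib' or 'cpp'
    server = Node(
        package='crazyflie',
        executable='crazyflie_server.py' if backend == 'cflib'
        else 'crazyflie_server',
        parameters=[config],
        output='screen',
    )
    return LaunchDescription([server])
```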

The second step was to make the crazyflie_server (CFlib) node suitable to be connected to the external packages that I worked with during the hack project. To that end, there are some special logging modes that enable the server to output not only topics based on logging, but also Pose/Odometry/LaserScan messages along with transforms. This allowed the SLAM toolbox to use the data from the Crazyflie itself to create a map, which you can see an example of in this tutorial.
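As an illustration of the LaserScan part of this (a sketch with illustrative names, not the actual server code), the four multi-ranger readings can be packed into a sparse, four-beam scan so that packages like the SLAM toolbox can consume them:

```python
import math
from sensor_msgs.msg import LaserScan

def ranges_to_scan(front, left, back, right, stamp, max_range=3.5):
    """Pack the four multi-ranger readings (in meters) into a sparse
    4-beam LaserScan: right, front, left, back at -90/0/90/180 degrees.
    Out-of-range readings become +inf ("no return"), following REP 117."""
    scan = LaserScan()
    scan.header.stamp = stamp
    scan.header.frame_id = 'base_link'       # illustrative frame name
    scan.angle_min = -math.pi / 2
    scan.angle_max = math.pi
    scan.angle_increment = math.pi / 2       # one beam per side
    scan.range_min = 0.01
    scan.range_max = max_range
    scan.ranges = [r if r <= max_range else float('inf')
                   for r in (right, front, left, back)]
    return scan
```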

Moreover, for the navigation it was important that incoming Twist messages, either from the keyboard or from a navigation toolkit, were handled properly. Most of these packages assume a 2D non-holonomic robot, but a quadcopter like the Crazyflie needs to first take off, stay in the air, and land. Therefore, in the examples a separate node (vel_mux.py) was written that receives incoming Twist messages, first has the Crazyflie take off via the high-level commander, and keeps sending hover commands to keep it in the air until a land service is called; a condensed sketch follows below.
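Here is that sketch; the topic names and take-off mechanics are simplified and illustrative rather than the exact vel_mux.py code.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class VelMux(Node):
    """Condensed sketch of the vel_mux idea; topic and service names
    are illustrative rather than the exact ones in the examples."""

    def __init__(self):
        super().__init__('vel_mux')
        self.create_subscription(Twist, 'cmd_vel', self.cmd_vel_cb, 10)
        self.hover_pub = self.create_publisher(Twist, 'cmd_hover', 10)
        self.latest = Twist()
        self.flying = False
        self.create_timer(0.1, self.send_hover)   # 10 Hz setpoint stream

    def cmd_vel_cb(self, msg):
        self.latest = msg
        if not self.flying:
            self.flying = True
            self.get_logger().info('First cmd_vel: requesting take-off')
            # the real node calls the server's take-off (high-level
            # commander) service here and waits until hover height

    def send_hover(self):
        # keep streaming hover setpoints so the Crazyflie stays airborne;
        # streaming stops once a land service request comes in
        if self.flying:
            self.hover_pub.publish(self.latest)

def main():
    rclpy.init()
    rclpy.spin(VelMux())

if __name__ == '__main__':
    main()
```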

What’s next?

As you probably noticed, the project is still under development, but at least it is now in a good enough state that we feel comfortable presenting it at the upcoming ROSCon :) We also want to include a more official simulation package, especially now that the Crazyflie has recently become part of the official release of Webots 2022b, but we are currently waiting on webots_ros2 to be released in the Ubuntu packages. Moreover, the idea is to provide multiple simulation backends so that, based on the requirements of the project (swarms, vision-based, etc.), the user can select the simulation most useful for their situation. We would also like to even out the missing items (trajectory handling, logging) in the CFlib and C++ backends of the crazyflie_server so that they can be used interchangeably. And since the experimental simple mapper node has been featured on social media, perhaps we should be converting that to Crazyswarm2 as well :)

Once we have gotten most of the above-mentioned issues out of the way, it will be time to start discussing the official release of a ROS2 Crazyflie package with its source code residing in the Crazyswarm2 repository. In the meantime, it would be awesome if anybody who is interested in ROS2, or wants to upgrade their Crazyswarm(1) packages to ROS2 soon, would give the package a whirl. The more people try it out and report bugs or propose fixes, the more stable it becomes and the closer it will come to an official release! Please join us and start any discussions on the Crazyswarm2 project GitHub repository.

Now it is time to give a little update about the ongoing ROS2-related projects. About a month ago we gave you a heads-up about the summer ROS2 project I was working on, and even though the end goal hasn't been reached yet, enough has happened in the meantime to write a blogpost about it!

Crazyflie Navigation

Last time I showed mostly the mapping of a single room, so currently I'm trying to map a bigger portion of the office. This was more difficult than initially anticipated: it worked quite well in simulation, but in real life the multi-ranger deck saw obstacles that weren't there. We later found out that this was due to this year-old issue with the multi-ranger driver's inability to handle out-of-range measurements properly (see this ongoing PR). With that, larger-scale mapping starts to become possible, which you can see here with the simple mapper node:

If you look at the video until the end, you can notice that the map starts to diverge a bit, since the position and orientation are solely based on the flow deck and gyroscopes. This is a big reason to get the SLAM toolbox to work with the multi-ranger. However, it is difficult to combine it with such a sparse 'lidar', so while that still requires some tuning, I've taken this opportunity to see how far I can get with the non-SLAM mapping and the NAV2 package!

As you can see in the video, the Crazyflie made it to the second hallway. Afterwards it was commanded to fly back based on a NAV2 waypoint in RViz2. In the beginning it seemed to do quite well, but around the door of the last room the Crazyflie got into a bit of trouble. The doorway entrance is already as small as it is, and that is also around the moment the mapping started to diverge: the new map covered the old map, blocking the original pathway back into the room. But still, it came pretty close!

The divergence of the map is currently the blocker for larger office navigation, so it would be nice to get some better localization working so that the map is not constantly changed by diverging position estimates. I'm pretty hopeful I'll be able to figure that out in the next few weeks.

Crazyflie ROS2 node with Crazyswarm2

Based on the poll we put out in the last blogpost, it seemed that many of you were mostly positive about work towards a ROS2 node for the Crazyflie! As some of you know, the Crazyswarm project, which many of you already use for your research, is currently being ported to ROS2 through the efforts of Wolfgang Hönig's IMRCLab on the Crazyswarm2 project. Instead of creating separate ROS2 nodes in parallel and just adding to the confusion in the community, we have decided together with Wolfgang to place all of the ROS2-related development into Crazyswarm2. The name of the project stays the same for historical reasons, but since this is meant to be the standard Crazyflie ROS2 package, the names of the individual nodes will become more generic upon the official release in the future.

To this end, we've pushed a CFlib Python version of the Crazyflie ROS2 node called crazyflie_server_py, partly based on my hackish efforts in the crazyflie_ros2_experimental version, such that users will have a choice of which communication backend to use for the Crazyflie. For now the node simply creates services for each individual Crazyflie and for the entire swarm for the take_off, land and go_to commands. Next up are logging and parameter handling, positioning support and a broadcasting implementation for the CFlib, so please keep an eye on this ticket to follow the progress.
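As a rough illustration of what using those services looks like from another node (the service type and fields are based on the crazyflie_interfaces definitions, but treat the exact names and namespaces as assumptions):

```python
import rclpy
from rclpy.node import Node
from crazyflie_interfaces.srv import Takeoff  # field names assumed

class TakeoffClient(Node):
    """Minimal client sketch: ask one Crazyflie to take off."""
    def __init__(self):
        super().__init__('takeoff_client')
        # per-Crazyflie service; '/cf1' is an illustrative namespace
        self.client = self.create_client(Takeoff, '/cf1/take_off')
        self.client.wait_for_service()

    def take_off(self, height=0.5, seconds=2.0):
        req = Takeoff.Request()
        req.height = height
        req.duration.sec = int(seconds)
        return self.client.call_async(req)

def main():
    rclpy.init()
    node = TakeoffClient()
    rclpy.spin_until_future_complete(node, node.take_off())

if __name__ == '__main__':
    main()
```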

So hopefully, once the summer project has been completed, I can start porting the navigation capabilities into the Crazyswarm2 repository, along with a nice tutorial :)

ROSCon talk

As mentioned in a previous blogpost, we'll be talking about the Crazyflie ROS2 efforts at ROSCon 2022 in Kyoto, in collaboration with Wolfgang. You can find the talk here in the ROSCon program, so hopefully I'll see you at the talk, or the week after at IROS!

In my first years at Bitcraze I focused mostly on embedded development and algorithmic design, like the app layer, controllers and estimators, but recently I have become quite interested in the robotic integration between the Crazyflie and other (open-source) projects and users. This means that I'll be dwelling more often in the space between Bitcraze and the community, which is something I really enjoy, as I noticed during the Grand Tour. It also initiated my work with simulators, which I think will be very useful for the community too. The summer fun project I've been working on is to integrate the Crazyflie with ROS2 and the standard navigation packages, which is the topic of this blogpost!

ROS2 Crazyflie Node

So first I worked on the ROS2 node that actually communicates with the Crazyflie directly. I think many of you are familiar with USC's Crazyswarm project, of which the ROS2 variant, Crazyswarm2, is already available with most functionalities. Even though the name says swarm, it can very easily be used for just one Crazyflie too. Crazyswarm2 is currently under active development by the IMRCLab of TU Berlin, so please take a look if you want to give it a go!

For now, while Crazyswarm2 is still under development, I used the Bitcraze Crazyflie Python library to make a more hackish node that just publishes exactly the information I want. I am focusing on the scenario with the STEM ranging bundle, aka the Crazyflie + Flow deck (optical flow + distance sensor) + Multi-ranger (5 distance sensors) combo, where the node logs the multi-ranger data and the odometry from the Flow deck over the Crazyradio and outputs them on the necessary /scan and /odom topics. Moreover, it also outputs several tf2 transforms that make it possible to visualize everything in RViz and/or connect it to other packages, and it reacts to incoming Twist messages as well.
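The gist of that node's data path, trimmed down to the cflib logging setup (the log variable names are the standard Crazyflie ones; the publish callbacks are left as stubs):

```python
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig

def publish_scan(timestamp, data, logconf):
    ...  # convert mm -> m and fill a sensor_msgs/LaserScan for /scan

def publish_odom(timestamp, data, logconf):
    ...  # fill a nav_msgs/Odometry for /odom and broadcast tf2 transforms

def setup_logging(cf: Crazyflie):
    # Multi-ranger distances, logged at 10 Hz, reported in millimeters.
    ranges = LogConfig(name='multiranger', period_in_ms=100)
    for var in ('range.front', 'range.left', 'range.back', 'range.right'):
        ranges.add_variable(var, 'uint16_t')

    # Flow-deck-based state estimate, used as odometry.
    odom = LogConfig(name='odometry', period_in_ms=100)
    for var in ('stateEstimate.x', 'stateEstimate.y',
                'stateEstimate.z', 'stateEstimate.yaw'):
        odom.add_variable(var, 'float')

    for logconf, callback in ((ranges, publish_scan), (odom, publish_odom)):
        cf.log.add_config(logconf)
        logconf.data_received_cb.add_callback(callback)
        logconf.start()
```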

Development with a Simulator

And of course… I went in head first and connected it directly to the SLAM toolbox. I have worked with ROS1 in the past, but I had my first experience with that package in the course Build Mobile Robots with ROS2 (by the Weekly Robotics newsletter's Mat Sadowski), so I couldn't wait to try it on a real platform like the Crazyflie. However, tuning this was more work than I thought, as the first map I got out of it was mostly a sparse collection of dots. Of course, the SLAM toolbox is meant for lidars and not for something that provides sparse range distances like the multi-ranger. So I decided to take one or two steps back and first connect a simulator to make tuning a bit easier.

Luckily, I had already started to look at simulators and was quite far along with the Webots integration of the Crazyflie. Actually… Webots' next release (2022b) will contain a Crazyflie as standard! Once it is out, I'll write a separate blogpost about that :). As luck has it, Webots also has good ROS2 integration, and it even won the 'Best ROS Software' award in The Construct's ROS Awards! Another reason is that I wanted to try out a different simulator for ROS2 this time, to complement what I learned in the ROS2 course mentioned earlier.

So I used the Webots driver node to write a simulated Crazyflie that outputs the same information as the real Crazyflie node, so that I can easily hack around and try out different things without constantly disturbing my cats from their slumber :). The rough shape of such a driver is sketched below. Anyway, I won't go into the simulator too much and will save that for another blogpost!
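For reference, a webots_ros2 driver plugin is essentially a Python class that the simulator steps; here is a stripped-down sketch of the shape I mean (details simplified and illustrative, not the actual simulated-Crazyflie code):

```python
import rclpy
from geometry_msgs.msg import Twist

class CrazyflieWebotsDriver:
    """Skeleton of a webots_ros2 plugin: Webots calls init() once and
    step() every physics step, so the ROS2 node is spun manually."""

    def init(self, webots_node, properties):
        self.robot = webots_node.robot       # Webots robot instance
        rclpy.init(args=None)
        self.node = rclpy.create_node('crazyflie_webots_driver')
        self.cmd = Twist()
        self.node.create_subscription(Twist, 'cmd_vel', self._on_cmd, 1)

    def _on_cmd(self, msg):
        self.cmd = msg                       # latest velocity setpoint

    def step(self):
        rclpy.spin_once(self.node, timeout_sec=0)
        # here the real driver turns self.cmd into motor velocities and
        # publishes the same /scan, /odom and tf2 data as the real node
```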

Simple Mapping

I decided to take another additional step before going full SLAM, which is to make a simple mapper node first! This takes the state estimate of the Crazyflie and the multi-ranger's range values and creates an occupancy-grid-type map out of them. I do have to give kudos to Marcus' cflib pointcloud script and Webots' simple mapper example, as I did look at them for reference. But even with the examples, integrating and connecting the dots is quite some work. Luckily I had the simulator to try things out with!
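The core idea of the simple mapper fits in a few lines. Here is a sketch (grid size, resolution and helper names are made up for illustration, not the actual node): each range measurement marks the cells along the beam as free and the end cell as occupied, using the Crazyflie's position and yaw estimate.

```python
import numpy as np

RESOLUTION = 0.05                    # meters per grid cell
ORIGIN = np.array([5.0, 5.0])        # world position of grid cell (0, 0)
grid = np.full((200, 200), -1, np.int8)  # -1 unknown, 0 free, 100 occupied

def to_cell(xy):
    return tuple(((xy + ORIGIN) / RESOLUTION).astype(int))

def update_map(pos_xy, yaw, sensor_angle, range_m, max_range=3.5):
    """Fuse one multi-ranger measurement into the occupancy grid."""
    if range_m > max_range:
        return                       # nothing seen: don't draw anything
    direction = yaw + sensor_angle   # beam direction in the world frame
    hit = pos_xy + range_m * np.array([np.cos(direction), np.sin(direction)])
    # coarse ray-trace: free space along the beam, obstacle at the end
    for f in np.linspace(0.0, 1.0, max(2, int(range_m / RESOLUTION))):
        grid[to_cell(pos_xy + f * (hit - pos_xy))] = 0
    grid[to_cell(hit)] = 100
```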

So first I put the Crazyflie in an apartment simulation, flew around, and checked whether any decent map came out of it, and it seemed it did! Of course, the simulated Crazyflie's 'odometry' comes from a near-perfect position estimate, so I didn't expect any problems there (and in such a situation you would actually not really need the localization part of SLAM). This still needs some improvements (for now, range measurements that don't see anything are simply excluded from drawing), but it was still pretty cool to map the virtual environment.

Then it was off to try it out on a real Crazyflie. In one of our meeting rooms, I had a Crazyflie take off, turned it around with a Twist message on the /cmd_vel topic, and made a map of the room I was in. The effect of the 4 range sensors rotating around and creating a map in one go reminds me of those retro video transitions. And the odometry drift does not seem bad enough to make this impossible, but I haven't mapped our entire office yet, so that might be different!

What’s next?

So I'm not stopping here for sure; I want to extend this functionality further and definitely get it to work with the SLAM toolbox properly! If the simple mapper can already produce such quality, I'm pretty sure that this can be done one way or the other. What I could also do is first generate a simple map and already have a go at the NAV2 package with that one… there are many roads to Rome here!

Currently I'm doing my work in the crazyflie_ros2_experimental repository on my personal GitHub account. Everything is still very much in development, hackish, and quite specific to one use case, but that is expected to change once things are working better, so please check the planning in the project's readme. In the meantime, you can indicate in this vote whether this is an interesting direction for us to go in. Not that it will stop me from continuing this project, since it is too much fun, but it is always good to know whether certain efforts are appreciated!