
As of this year, around March/April, we started with both Bitcraze developer meetings and Aerial-ROS meetings (the latter in collaboration with the Dronecode Foundation). Now that summer is here and our office is a bit empty, we have taken a bit of a summer break; however, we will start the meetings back up again soon! The next ROS-aerial meeting will be on the 16th of August, and a Bitcraze developer meeting is planned for Wednesday the 6th of September (keep an eye on our announcements in discussions). In this blog post we'd like to take the opportunity to give an overview of the meetings we have had so far.

Aerial-ROS meetings

In March we started a ROS community working group for aerial vehicles together with our friends at the Dronecode Foundation, aka Aerial-ROS! We have biweekly meetings, alternating between discussion meetings (with a set topic) and invited guest presentations.

Here are the discussion topic meetings we had:

And we had several guest speakers as well, like Miguel Fernandez-Cortizas from CAR-UPM talking about Aerostack2:

ROS-aerial Meeting guest presentation about Aerostack2

Then we had a guest presentation from Gerald Peklar from NXP talking about the Drones4Bats project:

ROS-aerial Meeting guest presentation about Drones4Bats

And the last one before the summer was from Alejandro Hernández Cordero (Open Robotics Consultant) about the ROS2 Vehicle Gateway project.

ROS-aerial Meeting guest presentation about Vehicle Gateway

The next ROS-aerial meeting is planned for the 16th of August. Keep an eye on the ROS Discourse forum.

Bitcraze Developer Meetings

We had already held a couple of developer meetings before, but since April we have started recording them. The first recorded one was about the Loco positioning system: we first gave a presentation about the system itself and the latest developments cooking in our pot, with time for questions afterwards.

Dev meeting about Loco positioning.

Then we had a meeting about the development of safety features in the Crazyflie in light of the Bolt developments:

Bitcraze Dev meeting about Safety features.

Then we had a meeting where Kristoffer highlighted the autonomous swarm demo we showed at ICRA 2023.

Bitcraze dev meeting about the autonomous swarm demo

And in the last one before the summer holiday, Kimberly explained the Crazyflie simulation model integrated into Webots.

Bitcraze Dev meeting about Simulation

We are still planning to have a developer meeting every first Wednesday of the month, starting with September 6th (keep an eye on our announcements in discussions).

EPFL Crazyflie 101 presentation

Oh yeah, by the way, we were also invited by the EPFL LIS lab to give another Crazyflie 101 presentation in Lausanne last April! We made a prerecording of it so you can check it out right here:

EPFL LIS Crazyflie 101 presentation.

See you all after summer!

This week's guest blog post is from Hanna Müller, Vlad Niculescu and Tommaso Polonelli, who are working with Luca Benini at the Integrated Systems Lab and Michele Magno at the Center for Project-Based Learning, both at ETH Zürich. Enjoy!

This blog post will give you some insight into our current work towards autonomous flight on nano-drones using a miniaturized multi-zone depth sensor. Here we will mainly talk about obstacle avoidance, as it is our first building block towards fully autonomous navigation. Who knows, maybe in the future we will have the honor of writing another blog post about localization and mapping ;)

A Crazyflie 2.1 with our custom multi-zone ToF deck, a Flow deck and a Vicon marker.

Obstacle avoidance on nano-drones is challenging, as the restricted payload limits the on-board sensors and computational power. Most approaches therefore use lightweight and ultra-low-power monocular cameras (such as the AI-deck) or 1D depth sensors (such as the Multi-ranger deck). However, both of those approaches have drawbacks: camera images need extensive processing, usually even neural networks, to detect obstacles, and neural networks additionally need training data and are prone to fail in completely new scenarios. The 1D depth sensors can reliably detect obstacles in their field of view (FoV); however, they give no information about the size or exact position of the obstacle.


On bigger drones, lidars or radars are usually used, but unfortunately, due to the limited weight and power budget, those cannot be carried and used on nano-drones. However, in 2021 STMicroelectronics introduced a new multi-zone Time-of-Flight (ToF) sensor: with a maximum resolution of 8×8 pixels, a range of up to 4 m (according to the datasheet), a small form factor and a low power consumption of only 286 mW (typical), it is ideal for use on nano-drones.


In the picture at the top, you can see the Crazyflie 2.1 with our custom ToF deck (open-sourced at https://github.com/ETH-PBL/Matrix_ToF_Drones). We described this deck for the first time in [1], together with a sensor characterization. From this we saw that we could use the sensor in different light conditions and on differently colored obstacles, but from 2 m onward the measurements started to get incomplete in all scenarios. However, as the sensor can flag invalid measurements (due to interference or obstacles being out of range), we can still rely on the measurements it reports as valid. In [2], we presented the system and some steps towards obstacle avoidance in a demo abstract, as you can see in the video below:
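
On the validity flags just mentioned: as a minimal illustration (a sketch written for this post, not the authors' code; the 8×8 array shapes and the status convention are assumptions), invalid zones can be masked out like this before any avoidance logic runs:

    import numpy as np

    # Hypothetical output of one multi-zone ToF frame: an 8x8 grid of
    # distances in mm plus a per-zone status flag. The real sensor reports
    # a status code per zone; here 1 simply stands for "valid".
    distance_mm = np.random.randint(100, 4000, size=(8, 8))
    status = np.random.randint(0, 2, size=(8, 8))

    # Trust only zones the sensor itself marked as valid; everything else
    # is treated as "unknown" rather than as free space.
    valid = status == 1
    if valid.any():
        print(f'Closest valid obstacle: {distance_mm[valid].min()} mm')
    else:
        print('No valid zones - treat the whole FoV as unknown')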

The next thing we did was to collect a dataset: we flew with different combinations of decks (Flow deck v2, AI-deck, our custom multi-zone ToF deck), sometimes even tracked by a Vicon system. These recordings amount to an extensive dataset with depth images, RGB images, internal state estimation, and position and attitude ground truth.


We then fed the recorded data into a Python simulation to develop an obstacle avoidance algorithm. We focused on the ToF data only (we are not fusing it with the camera in this project; we just provide the data for future work). We aimed for a very efficient solution, because we want it to run on board the STM32F405 with low latency and without occupying too many resources. Our algorithm is very lightweight but highly effective: we divide the FoV into different zones according to how dangerous obstacles in those areas are, and then use a decision tree to decide on a steering angle and velocity.
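
The preprint [3] has the authoritative details; purely to illustrate the zones-plus-decision-tree idea, a toy version could look like this (the zone boundaries and all thresholds below are made up):

    import numpy as np

    def avoid(distance_mm, valid):
        """Toy decision tree on an 8x8 depth grid: split the FoV into
        left/center/right zones, look at the nearest obstacle in each,
        and return (yaw_rate, forward_velocity)."""
        # Treat invalid zones as potentially very close obstacles.
        d = np.where(valid, distance_mm, 0)
        left, center, right = d[:, 0:3].min(), d[:, 3:5].min(), d[:, 5:8].min()

        if center > 1500:                    # path ahead is clear
            return 0.0, 0.5                  # fly straight at cruise speed
        if center > 700:                     # obstacle ahead: slow down, steer away
            return (0.5 if left > right else -0.5), 0.2
        return (0.8 if left > right else -0.8), 0.0  # very close: turn in place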


Using only 0.31% of the computational power and with 210 μs latency, we reached our goal of developing an efficient obstacle avoidance algorithm (210 μs per frame at the sensor's maximum 8×8 frame rate of 15 Hz works out to roughly 0.3% CPU load). Our system is also low-power: lifting the additional sensor with all accompanying electronics, as well as supplying it, costs less than 10% of the whole drone's power budget. On average, our system reaches a flight time of around 7 minutes. We refer to our preprint [3] for details on our various tests; they include flights over distances of up to 212 m with 100% reliability, and high agility at low speed in an office environment.

As our paper is currently submitted but not yet accepted, our code and dataset are not yet released; however, the hardware design is already accessible: https://github.com/ETH-PBL/Matrix_ToF_Drones

[1] V. Niculescu, H. Müller, I. Ostovar, T. Polonelli, M. Magno and L. Benini, “Towards a Multi-Pixel Time-of-Flight Indoor Navigation System for Nano-Drone Applications,” 2022 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), 2022, pp. 1-6, doi: 10.1109/I2MTC48687.2022.9806701.
[2] I. Ostovar, V. Niculescu, H. Müller, T. Polonelli, M. Magno and L. Benini, “Demo Abstract: Towards Reliable Obstacle Avoidance for Nano-UAVs,” 2022 21st ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), 2022, pp. 501-502, doi: 10.1109/IPSN54338.2022.00051.
[3] H. Müller, V. Niculescu, T. Polonelli, M. Magno and L. Benini, "Robust and Efficient Depth-based Obstacle Avoidance for Autonomous Miniaturized UAVs," submitted to IEEE, preprint: https://arxiv.org/abs/2208.12624

Before the summer vacation, I had the opportunity to spend some time working on AI deck improvements (blog post). One of the goals I set was to get CRTP over WiFi working and to fix issues along the way. The idea was to put together a small example where you could fly the Crazyflie using the keyboard and see the streamed image at the same time. This requires both CRTP to the Crazyflie (logging and commands) and CPX to the GAP8 for the images. Just before heading off on vacation I managed to get the demo working; this post is about the results and some of the things that changed.

Link drivers

When using the Crazyflie Python library you connect to a Crazyflie using a URI. The first part of the URI (i.e. radio or usb) selects which link driver to use for the connection. For example, radio://0/80/2M/E7E7E7E7E7 selects the radio link driver, USB dongle 0, channel 80, a data rate of 2 Mbit/s and the address E7E7E7E7E7.

While working on this demo, two major things changed in the link drivers. The first one was the implementation of the serial link (serial://), which now uses CPX for CRTP to the Crazyflie. The use case for this link driver is to connect a Raspberry Pi to the Crazyflie via a serial port on a larger platform.

The second change was a new link driver for connecting to the Crazyflie via TCP. Using this link driver it's possible to connect to the Crazyflie over the network. It's also possible to get hold of the underlying protocol object, the CPX object, for using CPX directly. This is used, for example, to communicate with the GAP8 to get images.

In the new TCP link driver the URI starts with tcp:// and contains either an IP address or a host name, followed by the port. Here are two examples:

  • tcp://aideck-AABBCCDD.local:5000
  • tcp://192.168.0.100:5000
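
To give an idea of how little changes on the application side, here is a minimal connection sketch using the library (the host name is just a placeholder, and the radio URI in the comment works the same way):

    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

    cflib.crtp.init_drivers()

    # Swap between Crazyradio PA and WiFi by changing only the URI:
    # uri = 'radio://0/80/2M/E7E7E7E7E7'
    uri = 'tcp://aideck-AABBCCDD.local:5000'  # placeholder host name

    with SyncCrazyflie(uri, cf=Crazyflie(rw_cache='./cache')) as scf:
        # From here on, CRTP works as usual: logging, parameters, commands...
        print('Connected over', uri)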

Comparison with the Crazyradio PA

So, can WiFi be used instead of the Crazyradio PA now? Well, it depends. WiFi gives you higher throughput, but you trade this for latency: in our tests the latency is both higher and quite unpredictable. In the demo I fly with the Flow V2 deck, which means latency isn't that much of an issue. But if you were to fly without positioning and just use a joystick, this would not work out.

The Demo!

Below is a video of some flying at our office; to try it out yourself, have a look at the example code here. Although the demo was mostly intended for improving CPX, we've made use of it at the office to collect training data for the AI deck.

The Crazyflie with AI deck during WiFi-controlled flight.
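
The actual keyboard demo lives in the example code linked above; as a flavor of what flying from the library looks like, here is a minimal sketch using the MotionCommander helper (key handling omitted, velocities arbitrary; it assumes a positioning deck such as the Flow V2 used in the demo):

    import time

    import cflib.crtp
    from cflib.crazyflie import Crazyflie
    from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
    from cflib.positioning.motion_commander import MotionCommander

    cflib.crtp.init_drivers()
    uri = 'tcp://aideck-AABBCCDD.local:5000'  # placeholder host name

    with SyncCrazyflie(uri, cf=Crazyflie(rw_cache='./cache')) as scf:
        # MotionCommander takes off on enter and lands on exit.
        with MotionCommander(scf, default_height=0.5) as mc:
            mc.start_linear_motion(0.3, 0.0, 0.0)  # e.g. mapped to a 'forward' key
            time.sleep(2.0)
            mc.stop()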

Improvements

Unfortunately I was a bit short on time and the changes for mDNS discovery never made it in. Because of this there's no way to "scan" for or discover AI decks, so to connect you will need to know the IP address or the host name. For now you can retrieve these by connecting to your AI-deck-equipped Crazyflie with the cfclient and looking at the console tab.

Apart from that, there are more improvements to be made, with a better structure for using CPX in the library (more like the CRTP stack, with functions) and more examples. There are also still a few bugs to iron out; for example, the FPS and WiFi throughput still need improvement.

IMAV 2022

Next week, from the 13th to the 16th of September, Barbara, Kristoffer and Kimberly will be present at the International Micro Air Vehicle Conference and Competition (IMAV), hosted by the MAVLab of TU Delft in the Netherlands. One of the competitions is called the nano quadcopter challenge, where teams program a Crazyflie + AI deck combo to navigate through an obstacle field, so we are excited to see what solutions will come out of that. If any of you happen to be at the conference/competition, drop by our table to say hello!