
Approximately two months ago we wrote a blog post presenting our planned master's thesis. Time flies, and we have now reached a state where the results are presentable and usable. We have used the Renode framework to create a platform for the Crazyflie 2.1. Bitcraze's GitHub account now hosts a Renode fork with a custom Renode-infrastructure submodule. To get Renode up and running on your computer, check the README found there.

On the crazyflie branch of the Renode fork there are two new REPL (Renode platform) files describing the platform. An example of the syntax is given below.

// I am a comment
peripheralname: Namespace.ClassName @ parent 0x08000000
    numericConstructorField: 0x100000
    stringConstructorField: "template"
    Interrupt -> interrupthandler@3

In cf2.repl all the external peripherals are connected, while stm32f405.repl contains the STM32F405's internal peripherals. Note that only the peripherals used by the current Crazyflie 2.1 have been added, since they are the only ones that can be tested using the Crazyflie firmware.

When Renode is run, a RESC (Renode script) file is loaded. There are currently two RESC files for the Crazyflie: one that only loads the Crazyflie platform, and one that can be used for testing. The latter automatically starts the simulation and also has a hook to exit Renode once the self test has passed.

Successful startup!

As mentioned, Renode is usable for both automatic and interactive firmware testing. The current version of automatic testing is based on the firmware passing its self test. The plan is to incorporate this into the CI pipeline.

When used interactively, it is possible to pause the emulation whenever the user wants to, either manually via a Renode command or by connecting with GDB. This allows reading from (as well as writing to) memory addresses. Want to read the DMA status at a specific line of code, or mess with the system by randomly flipping bits? Doable in the emulator, without risking your Crazyflie crashing.

The platform also contains our customized sensors, into which data can be loaded. The data can be loaded either manually or via a file and is then sent to the STM32. The scope of this master's thesis, however, has been firmware testing, not getting a simulated Crazyflie to fly in a virtual environment.

Emulation of hardware is not a trivial task; there are still improvements to be made, and the everlasting question of whether the emulation actually represents the real system remains.

One of the Crazyflie features that had to be simplified was the syslink connection to the nRF microcontroller, which in the emulation simply sends messages back to the STM32. The most exciting part of it currently is that, when a scan is executed, the STM32 receives a signal over the 1-wire bus that no expansion decks are connected. Further improvements would be to emulate the radio and power management, and to support expansion decks, either via an external program or through Renode.

Of course there are other things to improve as well; there will always be someone who thinks of better ways to implement features, and only time will tell how this emulator will evolve.

Josefine & Max

One crucial aspect of any research and development is recording and analyzing data, which can then be used for quantifying performance, debugging strange behavior, or guiding decision making. We have been trying to improve the way this is done to help all of you researchers out there with your work, so this blog post explains an alternative, better method to record whatever is happening on the Crazyflie in real time.

Example data collection of received Lighthouse angles over time. The y-axis contains a unique measurement ID (16 in total for 4 sensors * 2 base stations * 2 sweeps/base station). Thus, each dot represents the time when a certain kind of measurement was received. We do not receive all angles at fixed intervals, because of the interference between base stations.

Existing Logging Approaches

So far there have been two principal ways of recording data with the Crazyflie:

  1. Logging: In the firmware, one can define global variables that can be streamed out at a fixed frequency, using logging configurations. Variables have a name and a data type, and the list of all available variables can be queried from the firmware.
  2. Debug Prints: It is possible to add a DEBUG_PRINT(…) statement anywhere in the code, containing a string and possibly some variables. These are asynchronous, but not timestamped, so they are mostly useful for notifying the user of some status change. (A minimal sketch of both approaches is shown after this list.)
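
To make this concrete, here is roughly what the two approaches look like in the firmware (a minimal sketch; the variable, group and function names are our own):

#define DEBUG_MODULE "MYMOD"

#include "log.h"
#include "debug.h"

static float myValue;  // value streamed by a logging configuration

// Expose myValue to clients as the log variable "myGroup.myValue"
LOG_GROUP_START(myGroup)
LOG_ADD(LOG_FLOAT, myValue, &myValue)
LOG_GROUP_STOP(myGroup)

void someStateChange(int newState) {
  // Asynchronous and not timestamped
  DEBUG_PRINT("State changed to %d\n", newState);
}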

Both approaches have multiple backends. The logging variables can be streamed over USB, via the Crazyradio, or onto a Micro-SD card using a specialized deck. When using the radio or USB, the frequency is limited to 100 Hz, and only a few variables can be streamed at this speed due to bandwidth limitations. The Micro-SD-card deck, on the other hand, allows logging many more variables at speeds up to 1 kHz, making it possible to access high-speed sensor information, e.g., the IMU data. The Debug Prints can be used over the radio or USB, or, with a SEGGER J-Link and a Debug Adapter, over J-Link Real-Time Transfer (RTT). The former is very bandwidth limited, while the latter requires a physical wire connection, making it impossible to use while flying.

New Approach: Event-Based Logging

Event-based logging combines the advantages of the existing approaches and offers a third alternative for recording data. Similar to debug prints, a user can trigger an event anywhere in the code and include some mandatory variables that describe this event (the so-called payload). Similar to logging, these events are timestamped and have fewer bandwidth limitations. Currently there is just one backend, for the Micro-SD-card deck, but it would be possible to add support for radio and USB as well.
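
As a rough sketch of how an event and its payload can be declared and triggered from the firmware (based on the eventtrigger API; the event name and payload fields here are made up):

#include "eventtrigger.h"

// Declare an event "myEvent" with a two-field payload
EVENTTRIGGER(myEvent, uint8, sensor, uint16, value)

void onSweepReceived(uint8_t sensor, uint16_t value) {
  // Fill in the payload and emit the timestamped event
  eventTrigger_myEvent_payload.sensor = sensor;
  eventTrigger_myEvent_payload.value = value;
  eventTrigger(&eventTrigger_myEvent);
}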

One of our first test cases for event-based logging is to analyze the data we get from the Lighthouse deck. Here, we trigger an event whenever we receive a raw sweep. This lets us visually see the interference that happens between two LH2 base stations. We also use event-based logging for time synchronization between the Crazyflie and a motion capture system: when we enable the IR LEDs on the active marker deck, we record an event that contains the Crazyflie timestamp. On the PC side, we record a PC timestamp when the motion capture system first detects the IR LEDs. Clock drift can be computed by using the same logging mechanism when the IR LEDs are turned off.

Adding event-based logging had some other good side effects as well: the logging is now generally much faster, there is more user feedback about correct buffer-size usage, and the binary files are smaller. More details are in the documentation.

Future Work

We are working on adding event triggers to the state estimator. This will enable us to record all the sensor information during real flights with different positioning systems, so that we can tune and improve the Kalman state estimator. It will also be interesting to add support for events in CRTP, so that event-based logging can be used over USB and radio.

Hi!

Since the middle of January, Bitcraze has had two additional guests: us! We, Josefine & Max, are currently doing our master's thesis at Bitcraze during our final semester at LTH.

Unfortunately, the pandemic means remote working, which could have caused some difficulties with access to hardware and equipment. Fortunately for us, our goal with the thesis is to emulate the Crazyflie 2.1 hardware in the open-source software Renode. We can therefore do the work at home.

Since this is the first time either of us has tried to emulate hardware, it is exciting to see whether it will even be possible, especially as we do not know what limitations Renode might have. Thus far, four weeks in, there have been some hiccups and crashes, but also progress and success. One example was when we got the USART up and running and could start printing debug messages; another was when the LEDs were connected and it was possible to see when they were turned on and off.

Example of output from partially implemented hardware.

If all goes well, the emulated hardware could be used as a part of the CI pipeline to automatically hunt for bugs. Academically, it would be used to further study testing methods of control firmware.

Getting a glimpse into the workings at Bitcraze and the lovely people working here has been most interesting and we are looking forward to the time ahead. Until next time, hopefully with a working emulation, Josefine & Max.

The lithium polymer battery we use, like basically all rechargeable batteries, suffers from degradation. That means that with use, and as time goes by, its energy capacity as well as its performance will degrade. The performance, which is closely related to the battery's internal resistance, degrades such that the Crazyflie will not be able to produce the same maximum thrust or carry as much payload. The loss of capacity, due to ageing and charge cycles, means that the flight time will decrease. A common solution for monitoring degradation is a BMS, or Battery Management System, which constantly monitors battery health. For the small type of battery used in the Crazyflie this is not yet viable, but maybe there is something we can do to test part of the battery health anyway?

Theory

Since the internal resistance results in a larger voltage drop under load, we can exploit this property and measure it. We will, however, not only be measuring the battery's internal resistance but the resistance of the complete power path, as a result of the components we have at hand on the Crazyflie.

Power path block diagram

So what we do is activate the switch (MOSFET) so that the load (motor) pulls power from the battery. The power drawn will result in a voltage drop compared to a no-load situation, which we can measure and compare to a healthy setup. Since the measurement point is at the PCB traces, any of the components before that point can be causing the voltage drop; however, the battery and connector are the most likely culprits, as they are the most prone to ageing.

Implementation

The load is achieved by activating the motors at full thrust for a very short time. We don't want the Crazyflie to fly away, as that would be a bit inconvenient. Before activating the motors we measure the idle voltage, and during load we measure the minimum voltage, so that we can calculate the voltage drop. This is pretty easy to do; the problem is finding a good level at which we can distinguish a good battery from a bad one, which is why this feature is pretty experimental. We tested many batteries: good batteries tend to yield a voltage drop between 0.60 V and 0.85 V, while bad batteries go above 1.0 V. The current threshold is therefore set to 0.95 V, but it would be good to have more data, so if you use this feature please give us feedback if the level is wrong. The testing was run on a “stock” setup with the standard battery, propellers and motors, and it is for this setup the level is set. A different setup will probably not work well and will need a different threshold. Also keep in mind that the connector can be a “bad guy” too, as oxide can build up and result in a higher resistance. Often this can be solved with some solvent, e.g. WD-40, or by disconnecting and reconnecting the connector several times.
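
The logic of the test boils down to something like the following outline (not the actual firmware code; the helper functions here are made up for illustration):

float readBatteryVoltage(void);              // current battery voltage sample
void setMotorsFullThrust(void);
void stopMotors(void);
float trackMinimumVoltageDuringLoad(void);   // lowest sample seen while loaded

float measureBatterySag(void) {
  float idleVoltage = readBatteryVoltage();  // no-load voltage

  setMotorsFullThrust();                     // load the power path briefly
  float minVoltage = trackMinimumVoltageDuringLoad();
  stopMotors();

  // The drop includes battery, connector and PCB traces.
  // Stock setup: 0.60-0.85 V is healthy, above ~1.0 V is bad.
  return idleVoltage - minVoltage;           // compared against the 0.95 V threshold
}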

Usage

This is not in the 2021.01 release, so you will have to run the latest master branch of both crazyflie-firmware and crazyflie-clients-python. The simplest way to test this feature is to launch the cfclient, connect to the Crazyflie, open the console tab and press the battery-test button.

cfclient console tab after running battery test

When pressing the button, the propellers will spin briefly and there will be output in the console, as highlighted in red. If the sag value is below 0.95 V it will yield [OK], and if it is above it will say [FAIL].

A probably more useful use case is to run this test automatically before taking off with e.g. a swarm. This can be done by setting the parameter health.startBatTest to 1 and, after around 0.5 s, reading out the result in the log variable health.batteryPass to check that it is set to 1. The health.batterySag log variable will contain the latest sag (voltage drop) measurement. Hopefully this experimental feature will be a good way of increasing the reliability of flights.
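
From an onboard app this could look something like the sketch below, using the firmware's param and log APIs (untested; the one-second delay is our own margin):

#include <stdbool.h>

#include "FreeRTOS.h"
#include "task.h"
#include "param.h"
#include "log.h"

static bool batteryTestPassed(void) {
  paramVarId_t idStart = paramGetVarId("health", "startBatTest");
  logVarId_t idPass = logGetVarId("health", "batteryPass");

  paramSetInt(idStart, 1);           // kick off the test
  vTaskDelay(pdMS_TO_TICKS(1000));   // the test takes around 0.5 s
  return logGetInt(idPass) == 1;
}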

It has been a while since we have given an update on the AI-deck! However, as announced in the New Year post, we do have some important changes coming up for the next batch of production. These changes include an updated version of the GAP8 chip and a switch to a gray-scale camera. We will call this version the AI-deck 1.1, but its firmware should be compatible with the previous version. Currently we don't have a clear timeline for when we will receive the next batch, but it will be soon.

Newer version of GAP8

Together with our collaborators at GreenWaves Technologies, we decided to update the GAP8 chip to the latest version available. Since this required a chip change on the product, and to prevent any confusion, we decided to call this version the AI-deck 1.1.

The new version of the GAP8 chip includes several improvements, which resolve (1) a conflict between the UART and the HyperBus, and (2) a Cluster DMA issue when transferring data between the L2 and L1 memory. The latter is considered a corner case that will only occur if the AI-deck 1.0 is pushed to its limits with a large deep neural network. Although we expect that neither issue will affect current users, if you are experiencing any related problems with the previous version (AI-deck 1.0), please send us an email at contact@bitcraze.io.

Gray-scale Camera

In July we gave an update in which we discussed the color camera and how to process its output on the GAP8 chip. Even though at first we were pretty optimistic about the possibilities of the camera, after discussions with the community, ETH Zurich and GreenWaves Technologies, we have decided to switch back to a gray-scale camera.

This is because the examples in the gap_sdk repo mostly assume a gray-scale camera. And even though the color camera would be good for image processing involving color, it is less ideal for neural networks on edge machine-learning devices.

For those who have committed themselves to the color camera module, do not worry! We are still planning to offer the color camera module as a separate product in our store! Also, we will have a limited number of gray-scale cameras available for those who are unhappy with their AI-deck 1.0's color camera.

Firmware, Docs and Examples

Since the AI-deck is still in Early Access, we have all the code and documentation in this GitHub repository. It contains all the start-up guides and keeps track of all the bugs and fixes. Some months ago we managed to update all of this to the latest GAP SDK (3.8.1), which fixed the streamer issues we were having.

However, the quickest way to improve this Early Access product is to get feedback from its users. So if you are having any problems at all, do not hesitate to ask a question on the forum or to open up an issue in the repository!

This autumn, when we had our quarterly planning meeting, it was obvious that there would not be any conferences this year like in other years. This meant we would not meet you, our users, and hear about your interesting projects, but also that we would not be forced to create a demo. Sometimes we joke that we are working with Demo Driven Development and that it is what pushes us forward; even though it is not completely true, it is a strong driver. We decided to create a demo in our office and share it online instead. We hope you enjoy it!

The wish list for the demo was long, but we decided that we wanted to use multiple positioning technologies, multiple platforms and multiple drones in a swarm. The idea was also to let the needs of the demo drive development of other technologies, as well as stabilize existing functionality by “eating our own dogfood”. As a result of this work we have, for instance:

  • improved the app layer in the Crazyflie
  • added Lighthouse V2 support, including basic support for 2+ base stations
  • improved support for mixed positioning systems

First of all, let's check out the video:

We are using our office for the demo, and the Crazyflies essentially fly a fixed trajectory from our meeting room, through the office and kitchen, to finally land in the Arena. The Crazyflies are autonomous from the moment they take off and there is no communication with any external computer after that; all positioning is done on-board.

Implementation

The demo is mainly implemented as an app in the Crazyflie, with a simple python script on an external machine to start it all. The app is identical in all the Crazyflies, so the script tells them where to land and checks that all Crazyflies have found their position before they are started. Finally, it tells them to take off one by one, with a fixed delay in between.

The Crazyflie app

When the Crazyflie boots up, the app is started, and the first thing it does is prepare by defining a trajectory in the High Level Commander as well as setting data for the Lighthouse base stations in the system. The app uses a couple of parameters for communication, and at this point it waits for one of the parameters to be set by the python script.

When the parameter is set, the app uses the High Level Commander to take off and fly to the start point of the trajectory. At the starting point, it kicks off the trajectory, and while the High Level Commander handles the flying, the app goes to sleep. When reaching the end of the trajectory, the app once more goes into action and directs the Crazyflie to land at a position set through parameters during the initialization phase.
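
A condensed sketch of this flow is shown below (not the actual demo code; the coordinates and the parameter name are made up, and the trajectory upload and Lighthouse geometry setup are omitted):

#include "app.h"
#include "FreeRTOS.h"
#include "task.h"
#include "param.h"
#include "crtp_commander_high_level.h"

static uint8_t start = 0;

// Parameter set by the python script to start the flight
PARAM_GROUP_START(demo)
PARAM_ADD(PARAM_UINT8, start, &start)
PARAM_GROUP_STOP(demo)

void appMain() {
  // ... define the trajectory and set base station data here ...

  while (!start) {
    vTaskDelay(pdMS_TO_TICKS(100));   // wait for the script
  }

  crtpCommanderHighLevelTakeoff(1.0f, 2.0f);                        // height, duration
  vTaskDelay(pdMS_TO_TICKS(2000));
  crtpCommanderHighLevelGoTo(0.0f, 0.0f, 1.0f, 0.0f, 3.0f, false);  // to trajectory start
  vTaskDelay(pdMS_TO_TICKS(3000));
  crtpCommanderHighLevelStartTrajectory(1, 1.0f, false, false);

  // ... sleep until the end of the trajectory, then land ...
  crtpCommanderHighLevelLand(0.0f, 2.0f);
}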

We used a feature of the High Level Commander that is maybe not that well known but can be very useful for making the motion fluid. When the High Level Commander does a go_to, for instance, it plans a trajectory from its current position/velocity/acceleration to the target position in one smooth motion. This can be used when transitioning from a go_to into a trajectory (or from go_to to go_to) by starting the trajectory a little bit too early, thus never stopping at the end of the go_to but “sliding” directly into the trajectory. The same technique is used at the end of the trajectory to get out of the way faster and avoid being hit by the next Crazyflie in the swarm.

The trajectory

The main part of the flight is one trajectory handled by the High Level Commander. It is generated using the uav_trajectories project from whoenig. We defined a number of points we wanted the trajectory to pass through, and the software generates a list of polynomials that can be used by the High Level Commander. The generated trajectory passes through the points, but as a part of the optimization process it also chooses some (unexpected) curves; that could be fixed with some tweaking.

The trajectory is defined using absolute positions in a global coordinate system that spans the office.

Positioning

We used three different positioning systems for the demo: Lighthouse (V2), the Loco Positioning system (TDoA3) and the Flow deck. Different areas of the flight space are covered by different systems, either individually or overlapping. All decks are active all the time and pick up data when it is available, pushing it into the extended Kalman estimator.

In the meeting room, where we started, we used two Lighthouse V2 base stations, which gave us a very precise position estimate (including yaw) and a good start. When the Crazyflies moved out into the office, they relied only on the Flow deck, and that worked fine even though the errors potentially build up over time.

When the Crazyflies turned around the corner into the hallway towards the kitchen, we saw that the errors sometimes were too large; either the position or the yaw was off, which caused the Crazyflies to hit a wall. To fix that, we added 4 LPS nodes in the hallway, and this solved the problem. Note that all 4 anchors are on the ground, which is not enough to give the Crazyflie a good 3D position, but the distance sensor on the Flow deck provides Z information and the overall result is good.

The corner when going from the kitchen into the Arena is pretty tight, and again the build-up of errors made it problematic to rely on the Flow deck only, so we added a Lighthouse base station for extra help.

Finally, in the first part of the Arena, the LPS system has full 3D coverage, and together with the Flow deck it is smooth sailing. About halfway in, the Crazyflies start to pick up the Lighthouse system as well, and from there we use data from all three systems at the same time.

Obviously we were using more than 2 base stations with the Lighthouse system, and even though that is not officially supported, it worked with some care and manual labor. The geometry data, for instance, was manually tweaked to fit the global coordinate system.

The wall between the kitchen and the Arena is very thick, and it is unlikely that UWB can go through it, but we still got LPS data from the Arena anchors occasionally. Our interpretation is that it must have been packets bouncing off the walls into the kitchen. The stray packets were picked up by the Crazyflies, but since the Lighthouse base station provided a strong information source, the LPS packets did not cause any problems.

Firmware modifications

The firmware is essentially the stock crazyflie-firmware from GitHub; however, we did make a few alterations:

  • The maximum velocity of the PID controller was increased to make it possible to fly a bit faster and create a nicer demo.
  • The number of Lighthouse base stations was increased.
  • The PID controller was tweaked for the Bolt.

You can find the source code for the demo on GitHub. The important stuff is in examples/demos/hyperdemo.

The hardware

In the demo we used 5 x Crazyflie 2.1 and 1 x Bolt, very similar to the Li-Ion Bolt we built recently. The difference is that this version used a 2-cell Li-Po and lower-KV motors, but the Li-Ion Bolt would have worked just as well.

Hyperdemo drones and their configurations

To make all the positioning systems work at the same time we needed to add 3 decks: Lighthouse, Flow v2 and Loco deck. On the Crazyflie 2.1 this fits if the extra-long pin headers are used, with the Lighthouse mounted on top and the Loco deck underneath the Crazyflie 2.1, with the Flow v2 at the bottom. The same goes for the Bolt, but here we had to solder the extra-long pin header and the long pin header together to make them long enough.

There is one catch though… the pin resources for the decks collide. With some patching of the Loco deck this can be mitigated by moving its IRQ to IO_2 using the solder jumper. The RST needs to be moved to IO_4, which requires a small patch wire.

Some firmware configuration is also needed; this is added in the hyperdemo makefile:

CFLAGS += -DLOCODECK_USE_ALT_PINS
CFLAGS += -DLOCODECK_ALT_PIN_RESET=DECK_GPIO_IO4

The final weight of the Crazyflie 2.1 is on the heavy side, and we quickly discovered that fully charged batteries should be used, or else the crash probability increases a lot.

Conclusions

We’re happy we were able to set this demo up and that it was fairly straightforward. The whole setup was done in one or two days. The app layer is quite useful, and we tend to use it quite often when trying out ideas, which we interpret as a good sign :-)

We are satisfied with the results and hope it will inspire some of you out there to push the limits even further!

Li-Ion batteries have long packed more energy per gram than Li-Po batteries. The problem for UAV applications has been that Li-Ion can't deliver enough current, something that is starting to change. There are now cells in the 18650 series that are supposed to be able to deliver 30-35 A continuously, at least according to the specs. Therefore we thought it was time to do some testing, and decided to build a 1-cell Li-Ion drone using the Crazyflie Bolt as the base.

Since an 18650 battery is 18 mm in diameter and 65 mm long, the size would affect the design, but we still wanted to keep the drone small and lightweight. The battery is below 20 mm wide, which means we can run the deck connectors around it; that is nice. We chose to use our 3D printer to build the frame and to use off-the-shelf ESCs, motors and props. After a couple of hours of research we selected 3″ propellers, 1202.5 11500 KV motors and tiny 1-2S single ESCs for our first prototype.

Parts list:

  • 1 x Custom designed 130mm 3D printed frame
  • 1 x Crazyflie Bolt flight controller
  • 4 x Eachine 3020 propeller (2xCW + 2xCCW)
  • 4 x Flywoo ROBO RB 1202.5 11500 Kv motors
  • 4 x Flash hobby 7A 1-2S ESC
  • 1 x Li-Ion Sony 18650 VTC6 3000mAh 30A
  • Screws, anti vib. spacers, zipties, etc.

The custom designed frame was developed in iterations, and can still be much improved, but at this stage it is small, lightweight and rigid enough. We wanted the battery to be as central as possible while keeping it all compact.

Prototype frame designed in FreeCAD.

Assembly and tuning

The 3D-printed frame came out quite well and weighed in at 13 g. After soldering the Bolt connectors to the ESCs, attaching motors and props, adjusting the battery cable and soldering an XT30 to the Li-Ion battery, it all weighed ~103 g, of which the battery accounts for 45 g. It feels quite heavy compared to the Crazyflie 2.1, and we had a lot of respect when we test-flew it the first time. Before we took off we reduced the pitch and roll PID gains to roughly half, and luckily it flew without problems and quite nicely. Well, it sounds a lot, but that is kind of expected. After increasing the gains a bit we felt quite pleased with:

#define PID_ROLL_RATE_KP  70.0
#define PID_ROLL_RATE_KI  200.0
#define PID_ROLL_RATE_KD  2
#define PID_ROLL_RATE_INTEGRATION_LIMIT    33.3

#define PID_PITCH_RATE_KP  70.0
#define PID_PITCH_RATE_KI  200.0
#define PID_PITCH_RATE_KD  2
#define PID_PITCH_RATE_INTEGRATION_LIMIT   33.3

#define PID_ROLL_KP  7.0
#define PID_ROLL_KI  3.0
#define PID_ROLL_KD  0.0
#define PID_ROLL_INTEGRATION_LIMIT    20.0

#define PID_PITCH_KP  7.0
#define PID_PITCH_KI  3.0
#define PID_PITCH_KD  0.0
#define PID_PITCH_INTEGRATION_LIMIT   20.0

This would be good enough for what we really wanted to try: the endurance with a Li-Ion battery. A quick measurement of the current consumption at hover gave 5.8 A, from which we estimated up to ~30 min of flight time on a 3000 mAh Li-Ion battery (3000 mAh / 5800 mA ≈ 0.52 h ≈ 31 min). Wow! But first, a real test…

Hover test

For the hover test we used Lighthouse 2, which is starting to work quite well. We had to change the weight and thrust constants in estimator_kalman.c for the autonomous flight to work:

#define CRAZYFLIE_WEIGHT_grams (100.0f)

//thrust is thrust mapped for 65536 <==> 250 GRAMS!
#define CONTROL_TO_ACC (GRAVITY_MAGNITUDE*250.0f/CRAZYFLIE_WEIGHT_grams/65536.0f)

After doing that, we created a hover script that hovers at 0.5 m height and lands when the voltage reaches 3.0 V. We leaned back with excitement, behind a safety net, and started the script… After 19 min it landed: good, but not what we had hoped for, and quite far from the calculated 30 min. Maybe Li-Ion isn't that good when it needs to provide more current…? A quick internet search showed that Li-Ion can run all the way down to 2.5 V, but we have to stop at 3.0 V because of the electronics and the loss of thrust, so we are missing quite a bit of energy… Further investigation is needed.

Lighthouse 2 flight test

As a final test we launched some flight scripts to fly in a square and in a spiral, so we would get a feel for the Lighthouse 2 + Bolt + PID controller combination. We think it turned out quite nicely, and this with almost no optimization effort:

Summary

Li-Ion felt like it could be a game changer when it comes to flight time, but it was not as promising as we had hoped. That doesn't mean we can't get there, though; more research and development is required.

I am back from parental leave, and during this time I tried not to think too much about Crazyflie-related things, to get a little break. However, over time, while geeking around, I eventually ended up back at the Crazyflie and Crazyradio, designing a new channel-hopping communication protocol. This will likely be the subject of a future blog post, but for the time being I thought I could write a bit about how the current Crazyflie radio communication works.

Protocol layers

Like many protocols, the Crazyflie communication protocol is layered. This allows different elements to be plugged in at each level. When it comes to a Crazyflie client talking to a Crazyflie over the radio, the layering looks as follows:

Services are high-level functionalities, like log, which allows getting the values of Crazyflie variables at regular intervals. At this level there is essentially an API with commands like addLogBlock.

CRTP is a protocol that encapsulates the commands for each service. It multiplexes packets on the link using port numbers; this is very similar to TCP/UDP ports on a network: each service listens and sends packets on a pre-specified port.
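
A simplified view of a CRTP packet (following crtp.h in the firmware, slightly simplified here):

// 1 header byte + up to 30 bytes of payload
typedef struct {
  uint8_t size;       // payload length in bytes
  uint8_t header;     // [port: 4 bits][reserved: 2 bits][channel: 2 bits]
  uint8_t data[30];   // service payload
} crtpPacket_t;

// Extracting the multiplexing information from the header
#define CRTP_PORT(header)    (((header) >> 4) & 0x0f)
#define CRTP_CHANNEL(header) ((header) & 0x03)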

Finally, the radio link only deals with raw packets. The role of the radio link is to deliver packets from the PC to the Crazyflie and vice versa. At this level we have many link implementations; the most used are the radio and USB links, but there is also a serial link that uses the Crazyflie's serial port.

Radio link

The radio link is currently implemented by the Crazyradio (PA) on the PC side and the Crazyflie on the other side. The Crazyradio uses a Nordic Semiconductor nRF24LU1 USB/radio chip, and the Crazyflie an nRF51822 MCU/radio. This is important since, while the nRF51 has quite a flexible radio, the nRF24 uses a standalone SPI radio that has most of the packet handling hard-coded to a protocol that Nordic calls Enhanced ShockBurst (ESB).

The ESB protocol handles sending packets and receiving acknowledgements automatically. A packet is sent on a radio channel using a 5-byte address, and when this packet is received on the other side, an acknowledgement is sent back using the same address on the same channel. Both the original packet and the acknowledgement can contain a payload.

To implement a bi-directional radio link, the Crazyradio is the one sending packets, while the Crazyflie continuously listens and, when receiving a packet, sends back an ack. We use the payload capability of both the packets we send and of the acks to implement an uplink (PC->Crazyflie) and a downlink (Crazyflie->PC) data link.

Of course, one problem with this setup is that, while the PC can decide to send a packet at any moment, the Crazyflie needs to wait for the PC to send a packet to have a chance to send one back in an ACK. To make sure the Crazyflie has enough opportunities to send packets back, we send packets to the Crazyflie at regular intervals even if there is no data to transmit. This polling makes a continuous downlink possible.

The most important thing to see here is that the radio link gives the upper layer, the CRTP layer, the illusion of a full-duplex link. On the radio side this is implemented by polling regularly for downlink packets.
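
In code, the polling loop on the radio side looks something like this sketch (not the actual Crazyradio firmware; the types and helper functions are made up for clarity):

#include <stdbool.h>
#include <stdint.h>

typedef struct { uint8_t length; uint8_t data[32]; } packet_t;
typedef struct { bool received; uint8_t length; uint8_t data[32]; } ack_t;

// Provided elsewhere in this sketch's imaginary radio driver:
bool uplinkQueuePop(packet_t *p);   // false if the queue is empty
ack_t esbSendAndWaitAck(uint8_t channel, const uint8_t address[5], const packet_t *p);
void downlinkQueuePush(const uint8_t *data, uint8_t length);

void radioPollingLoop(uint8_t channel, const uint8_t address[5]) {
  for (;;) {
    packet_t up = { .length = 0 };   // empty "null" packet, used when idle
    uplinkQueuePop(&up);             // replaced by a real packet if one is queued

    // ESB: transmit and wait for the ACK from the Crazyflie
    ack_t ack = esbSendAndWaitAck(channel, address, &up);

    if (ack.received && ack.length > 0) {
      downlinkQueuePush(ack.data, ack.length);  // the downlink rides on the ACK
    }
  }
}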

Communicating with multiple Crazyflies

In order to communicate with multiple Crazyflies, we just send packets to each Crazyflie one after the other. This way we give each Crazyflie an equal chance to send back packets, and in doing so we divide the available bandwidth between them.

The main advantage of the polling protocol over a more traditional P2P protocol, where the Crazyflies would send whenever they want, is that with polling the Crazyradio is the master, and we can guarantee that we are not introducing any packet collisions when we communicate.

Limitations and the future

One major limitation of the current protocol is that it communicates on a single channel and requires the user to manually set the channel and address for each Crazyflie. This means that the user has to tinker with parameters to find a good channel and has to handle all addressing manually.

Another limitation is that the polling is done over USB, by python or, in the case of ROS/Crazyswarm, in C++. This adds USB latency to the equation and complicates the client implementation.

I have been working on defining a new protocol that could be implemented efficiently in a Crazyradio and that would handle addressing and channel hopping. The idea is to get closer to a connection style more like Bluetooth Low Energy, where you do not have to care about channels and setting addresses; you just connect your device. Unlike BLE, though, this protocol will be optimized for low latency. Stay tuned, we will likely talk about this more in future posts :-).

It has been about a month since the AI-deck became available in Early Access. Since then, quite a few of you now own an AI-deck yourselves. A new development we would like to share: we previously thought that we had selected a gray-scale image sensor. However, it came to our attention that the camera actually contains a color image sensor, which on second viewing of the video presented in this blog post is pretty obvious in hindsight (thanks PULP project ETH Zurich for letting us know!).

A color image from the AI-deck

This came as a little surprise, but a color camera also adds some new possibilities, like making the Crazyflie follow an orange ball, or training CNNs to incorporate color in their classification as well. The only catch is that it requires an extra preprocessing step in order to retrieve the color image, which is explained in the next section.

Demosaicing

Essentially all CMOS image sensors are gray-scale by definition. In order to retrieve color from a scene, manufacturers add a Bayer filter on top of the image sensor, which filters red, green and blue at each pixel. Such a color filter array does not need to be RGB, it can use all kinds of colors, but here we will only talk about the Bayer filter. If the pattern of the filter is known, the pixels related to a certain color can be interpolated with each other to fill in the gaps in between. This process is called demosaicing, and it creates the RGB channels that are combined into a color image.

Process of demosaicing with a Bayer filter

Currently we have only implemented a simple nearest-neighbor interpolation scheme for demosaicing, which is fine for demonstration purposes but is not the best technique out there. Such a simple interpolation is not very ‘edge and detail’ aware and can therefore cause artifacts, like the Moiré effects seen below. Anyway, we are still experimenting with how to get a better image and how to translate that to all the examples of the AI-deck example repository (see this issue if you would like to follow or take part in the discussion).

Moiré effect
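
For reference, nearest-neighbor demosaicing of an RGGB Bayer pattern can be sketched like this (our own illustration, assuming even image dimensions; the actual sensor pattern and the AI-deck example code may differ):

#include <stdint.h>

// raw: w*h single-channel Bayer input, rgb: w*h*3 interleaved RGB output
void demosaicNearest(const uint8_t *raw, uint8_t *rgb, int w, int h) {
  for (int y = 0; y < h; y++) {
    for (int x = 0; x < w; x++) {
      // Top-left corner of the 2x2 Bayer cell this pixel belongs to:
      //   R G
      //   G B
      int cx = x & ~1;
      int cy = y & ~1;

      uint8_t *out = &rgb[3 * (y * w + x)];
      out[0] = raw[cy * w + cx];              // nearest R sample
      out[1] = raw[cy * w + cx + 1];          // nearest G sample
      out[2] = raw[(cy + 1) * w + cx + 1];    // nearest B sample
    }
  }
}

Because each output pixel simply copies the nearest sample of each color instead of averaging over neighbors, edges and fine patterns alias, which is exactly what produces Moiré artifacts like the one above.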

So technically, once we have the color image, it can be converted to a gray-scale image, which can be used by the examples as-is. However, there is a reduction in quality, since the full pixel resolution is not used for obtaining the gray-scale image. We are currently discussing whether it would be useful to get the gray-scale version of this camera and make it available as well, so let us know if you would be interested!

Feedback and Early Access

Like we said before, there are now quite a few of you out there who have an AI-deck in your possession. As it is in Early Access, the software part is still in full development. However, since we have not received any negative feedback from you, we believe that everything is fine and peachy!

Just kidding ;) we know that the AI-deck is quite a challenging deck to work with, and we know for sure that many of you probably have questions or have something to say about working with it. Buying an Early Access product also comes with a little bit of responsibility. The more feedback we get from you, the more we can tailor the software and support to help you and others, thereby advancing the product and getting it out of the Early Access phase.

So please let us know if you are having any trouble starting up, by posting a thread on the forum (we have a special AI-deck group!), or if there are any issues with the examples or the documentation in the AI-deck repo. We and our collaborators at GreenWaves Technologies (makers of the GAP8 chip) are more than happy to help out. That is what we are here for :)

The AI-deck is now available in our online store! Super-edge-computing is now possible on your Crazyflie thanks to the GAP8 IoT application processor from GreenWaves Technologies. GAP8 delivers over 10 GOPS of compute power at exceptionally low power consumption, enabling complex tasks such as path-finding and target following on the Crazyflie while consuming less than 0.1% of the total energy.

The AI-deck can host artificial intelligence-based workloads like Convolutional Neural Networks onboard. This will open up many research areas focusing on fully onboard autonomous navigation of tiny MAVs, like ETH Zurich’s PULP-Dronet. Moreover, there is also ESP32 WiFi connectivity with the possibility to stream the images to your personal computer.

We are happy that we managed to get everything ready so soon after our last update. The AI-deck is in Early Access, which means the hardware design is now finalized and full support for building, running and debugging applications (including GreenWaves' GAPflow tools for porting neural networks to GAP from TensorFlow) is available; however, only a limited number of examples of specific AI-deck applications have been developed so far. Read more about Early Access here. Even though there is still some work to be done, there are already some examples you can try out, which we will explain in this blog post. Also, we aim to have all AI-decks pre-flashed with the WiFi streamer example so that you can check right away that your AI-deck is working.

Beware that you need a JTAG-enabled programmer/debugger in order to develop for the AI-deck!

Technical specifications

The AI-deck comes with two elongated male pin headers, which enable the user to connect it to the Crazyflie together with an additional deck. There are two 10-pin JTAG connectors soldered on, which enable connection with a JTAG-enabled programmer. This will be the main way to program the GAP8 chip and the ESP-based NINA module while the product is still in Early Access.

Getting started

When you first receive your AI-deck, it should be flashed with a WiFi streamer example for the camera image stream. Once the AI-deck is powered up by the Crazyflie, it will automatically create a hotspot called ‘Bitcraze AI-deck Example’. In this repo, in the folder named ‘NINA’, you will find a file called viewer.py. If you run this with Python (preferably version 3), you will be able to see the camera image stream on your computer. This confirms that your AI-deck is working.

The next step is to go to the docs folder of the AI-deck examples repository. Try out the WiFi demo and set up your development environment with the getting-started guide. This guide contains links to the GAP SDK documentation from GreenWaves Technologies. You can read more about the face-detector example that we demonstrated in this blog post.