Category: Frontpage

Happy holidays to all our users, community members and friends! We are happy to announce our 2019 Christmas video, which we made in collaboration with Ben Kuper! It stars 7 Crazyflies, the lighthouse positioning system, our office Christmas tree and a whole lot of holiday spirit, so go ahead and take a look!

Here are some words from Ben about what it was like to work on this year’s Christmas video at our office:

Coming to Bitcraze’s HQ and working with them has been once more a wonderful experience, both technically and on a human level! The main goal of this session was to test and implement the new lighthouse tracking system in the tool suite I’m creating, and it was an amazing surprise to witness for real the uncanny stability of the drones when they’re on lighthouse tracking!

Of course, my first reaction was to push the limits and see what can be done with this new power. This is why I created this choreography: to see what can be done in a limited amount of time (a day and a half to create the full choreography, the official video shows the first part only), and to push the limits of what is currently possible. As the team was working on occlusion recovery, we decided to have the drone fly around the tree as a fun test, and it works!

In the new year we will have a follow-up blog post going into detail on how exactly we made this video. Until then, happy holidays and have an awesome new year!

We are currently finishing the production-test design for a couple of expansion decks, and we realized we have never written about it, or about the more general board production process. In this blog post we want to talk a bit about how we test boards in the production phase, taking the forthcoming active marker deck as an example.

The active marker deck

When finalizing an electronic board, we send the manufacturer documentation that allows them to manufacture and assemble a, hopefully, functional board. Although we can assume that the individual components are in working order, the assembly is not always perfect, so we need to check that everything we do is actually working. This is the problem the production test solves.

The first thing is to find out what to test, and for that we need a strategy. The strategy we have been using is to test everything we have modified or worked on: for example, we will test all the connections we have soldered in the manufacturing process. We will normally not test all the functionality of a ready-made module. Following this strategy, we will usually test all the communication interfaces we have wired up, but we will not test all the functionality of a microcontroller we solder onto the board; that is deemed to have already been tested by the microcontroller manufacturer. This step usually ends up with an annotated schematic:

Annotated schematic of the ActiveMarker Deck

Once we know what to test and roughly how to test it, we specify a test rig that will be able to run the tests automatically. Some tests are generic and applicable to all our boards: for example, we test voltages with a multimeter on every board that has a regulator. Some tests are very board-specific. For the active marker deck we want to test the IR LEDs and the IR detector, so we define a test rig that has a reflector to bounce the LED light back to the detector, and we use the onboard detector to test the LEDs:

Simple block diagram of the test rig for the ActiveMarker Deck

We normally use a Crazyflie in all our test rigs, since it is usually possible to test all the functionality from the deck port. We also try, as much as possible, to integrate the test software into the real software. For the active marker deck this meant adding a 38 kHz modulated output mode to the LEDs in order to emit a signal detectable by the detector, a mode that will make it into the final firmware. Finally, we have test software, running on the test computer, that uses the Crazyflie python lib to talk to the Crazyflie and run the tests. The last step of the test is to write the deck’s One Wire identification memory so that the deck can be detected by a Crazyflie.

Screenshot of the test program for the test engineer
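To make the 38 kHz output mode a bit more concrete, here is a minimal sketch of how such a carrier can be generated with a hardware timer in PWM mode. This is an illustration only: the timer, output channel, duty cycle and how it hooks into the deck firmware are assumptions, not the actual ActiveMarker implementation.

```c
/* Illustrative sketch: generate a 38 kHz IR carrier with an STM32 timer in
 * PWM mode (StdPeriph API). The timer/channel choice and the 50% duty cycle
 * are assumptions; GPIO alternate-function setup is omitted for brevity. */
#include "stm32f4xx.h"

#define IR_CARRIER_HZ  38000
#define TIM_CLOCK_HZ   84000000  /* APB1 timer clock on the STM32F405 */

void irCarrierInit(void)
{
  TIM_TimeBaseInitTypeDef timeBase;
  TIM_OCInitTypeDef oc;

  RCC_APB1PeriphClockCmd(RCC_APB1Periph_TIM2, ENABLE);

  /* One PWM period = 84 MHz / 38 kHz = ~2210 timer ticks */
  TIM_TimeBaseStructInit(&timeBase);
  timeBase.TIM_Prescaler = 0;
  timeBase.TIM_Period = (TIM_CLOCK_HZ / IR_CARRIER_HZ) - 1;
  timeBase.TIM_CounterMode = TIM_CounterMode_Up;
  TIM_TimeBaseInit(TIM2, &timeBase);

  /* 50% duty cycle on channel 1, driving the IR LED driver */
  TIM_OCStructInit(&oc);
  oc.TIM_OCMode = TIM_OCMode_PWM1;
  oc.TIM_OutputState = TIM_OutputState_Enable;
  oc.TIM_Pulse = timeBase.TIM_Period / 2;
  TIM_OC1Init(TIM2, &oc);

  TIM_Cmd(TIM2, ENABLE);
}
```

The reason for modulating at all is that IR detectors of the kind used in remote-control receivers only react to light modulated at their carrier frequency (often around 38 kHz), so a plain “LED on” test would not register on the detector.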

From this specification, the manufacturer can then build a test rig and start testing boards; boards that do not pass are re-worked until they pass, or discarded.

Test rig for the Multi-ranger expansion deck

What we have learned in our years at Bitcraze is that the testing phase is one of the most important parts of the development process of a PCB. The earlier we start thinking about the production tests during board design, the smoother the final production phase of our new products will be.

After a couple of delays we are happy to announce that the Crazyflie Bolt is now stocked and ready to ship. For those of you that are new to the Bolt, it is basically a Crazyflie 2.1 control board, but built to fit a bigger package. We have blogged about it a couple of times before, so if you would like to catch up you can start from the first idea, to maturing, and finally changing name from RZR to Bolt. Another way to describe the Bolt is: a Crazyflie 2.1 + Big-quad deck in one, which doesn’t hog any deck expansion pins. Thus combinations such as Bolt + Led-ring + Lighthouse-4, or e.g. Bolt + Flow v2 + LPS, are now possible.

Keep in mind that the Bolt is an early access product, so you will most likely have to dig into the code to hard-code PID tuning parameters etc. Also, a word of warning: heavier drones can be very dangerous, so be sure to keep safe!
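To give an idea of what that digging can look like: the default attitude- and rate-controller gains are compile-time constants in the firmware (in src/modules/interface/pid.h at the time of writing), so re-tuning for a heavier frame means editing them and rebuilding. A hypothetical sketch, with placeholder values rather than a recommended tuning:

```c
/* Hypothetical re-tuning of the roll/pitch rate gains for a bigger frame.
 * The macro names follow the firmware's pid.h, but the values below are
 * placeholders for illustration; expect careful, incremental testing. */
#define PID_ROLL_RATE_KP   70.0
#define PID_ROLL_RATE_KI  100.0
#define PID_ROLL_RATE_KD    1.0

#define PID_PITCH_RATE_KP  70.0
#define PID_PITCH_RATE_KI 100.0
#define PID_PITCH_RATE_KD   1.0
```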

The Crazyflie Bolt is delivered as a standalone control board; frame, motors, propellers and battery need to be added. For details, check out the wiki. Unfortunately we don’t have a good reference kit to recommend at the moment, so if you happen to have built a good one, please share!

This week we have a guest blog post from Joseph La Delfa.

DroneChi is a human-drone interaction experience that uses the Qualisys motion capture system to enable the Crazyflie to react to the movements of your body. At the Exertion Games Lab in Melbourne, Australia, we like to design new experiences with technology where the whole body can be the controller and is involved in the experience.

When we first put these two technologies together we realised two things. 

  1. It was super easy to keep your attention on the drone as it flew around the room reacting to your movements.
  2. As a result it was also really easy to reflect on and refine one’s own movements.

We thought this was like meditation mediated by a drone, and wanted to investigate how to further enhance this experience through design. We thought the smooth movements were especially mesmerising, so I decided to take beginner Tai Chi lessons to get an appreciation of what it felt like to move like a Tai Chi student.

We undertook an 8 month design program where we simultaneously designed the form and the interaction of the Crazyflie. The initial design brief was pretty simple: make it look and feel light, graceful and inspired by nature. In Tai Chi you are constantly asked to imagine a flower, the sea or a bird as you embody its movements; we wanted to emulate these experiences, but without verbal instruction. Could a drone facilitate these sorts of experiences through its design?

We will present a summarised version of how the form and the interaction came about. Starting with a mood board, we collated radially symmetrical forms from nature to match a drone’s natural weight distribution.

We initially went with a jellyfish, hoping to emulate their “push gliiide” movement by articulating laser-cut silhouettes (see fig c). This proved incredibly difficult: after searching high and low for a foam that was light enough for the Crazyflie to lift, we just could not get it to fly stably.

However, we serendipitously fell into the flower shape while trying to improve how we joined the carbon rods together in a loop (fig b below). By joining them to the main hull we realised it looked like a petal! This set us down the path of the flower; we even flipped the chassis so that the LED ring faced upwards (cheers to Tobias for that firmware hack).

Whilst this was going on, we were experimenting with how to actually interact with the drone. Considering the experience was to be demonstrated at a major conference, we decided to track only the hands, which allowed quick change-overs between participants. We started with cardboard pads and experimented with gloves, but settled on some floral-inspired 3D printed pads. We were tempted to include the articulation of the fingers, but decided against it to avoid scope creep! Further to this, we curved the final hand pads (fig d) to promote the idea of holding the drone, inspired by a move in Tai Chi called “holding the ball”.

As a beginner practicing Tai Chi, I was sometimes overwhelmed by the number of aspects of my movement that constantly needed monitoring: palms out, heel out, elbow slightly bent, step forward, etc. However, in brief moments it all came together and I was able to appreciate the feeling of these movements, as opposed to consciously monitoring them. We wanted this kind of experience when learning DroneChi, so we devised a way of mapping the drone to the body to emulate it. After a few iterations we settled on the “mid point” method as seen below.

The drone only followed the midpoint (blue dot above) if it was within 0.2 m of it. If it was outside of this range, it would slowly float away from the participant. This may seem like a generous margin, but with little in the way of visual guidance (e.g. a laser pointer or an augmented display), a person can only rely on the proprioceptive feedback from their own body. We used the onboard LED ring to let the person know when they were close, but that is all the help they got. As a result, this takes a lot of concentration to get right!
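For the curious, the gating logic can be summarised in a few lines of code. This is a simplified reconstruction of the behavior described above, not the actual DroneChi source; all names, types and gains are made up for illustration:

```c
/* Sketch of the "mid point" mapping: the drone tracks the midpoint between
 * the two hand pads while it stays within 0.2 m; otherwise it drifts away.
 * All types and gains here are hypothetical. */
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dist(vec3 a, vec3 b) {
  return sqrtf((a.x - b.x) * (a.x - b.x) +
               (a.y - b.y) * (a.y - b.y) +
               (a.z - b.z) * (a.z - b.z));
}

/* Returns a velocity setpoint for the drone given the two tracked hands. */
vec3 droneChiStep(vec3 leftHand, vec3 rightHand, vec3 dronePos) {
  const float followRange = 0.2f;  /* meters */
  const float followGain = 1.0f;   /* hypothetical P gain */
  const float driftSpeed = 0.05f;  /* slow drift away, m/s */

  vec3 mid = { (leftHand.x + rightHand.x) / 2.0f,
               (leftHand.y + rightHand.y) / 2.0f,
               (leftHand.z + rightHand.z) / 2.0f };
  float d = dist(mid, dronePos);
  vec3 v;

  if (d <= followRange) {
    /* Within range: move toward the midpoint of the hands */
    v.x = followGain * (mid.x - dronePos.x);
    v.y = followGain * (mid.y - dronePos.y);
    v.z = followGain * (mid.z - dronePos.z);
  } else {
    /* Out of range: float away slowly from the participant */
    v.x = driftSpeed * (dronePos.x - mid.x) / d;
    v.y = driftSpeed * (dronePos.y - mid.y) / d;
    v.z = driftSpeed * (dronePos.z - mid.z) / d;
  }
  return v;
}
```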

In the end we were super happy with the final experience. In the study, participants reported tuning into their bodies when using the drone, as well as experiencing a unique sort of relationship with it: not entirely like a pet, yet also like an extension of the body. We will be investigating both findings from the study through the design and testing of a new system on the Crazyflie. We see this work contributing to more intimate designs for human-drone interactions, as well as being applicable to health contexts such as rehabilitation.

We have been traveling a lot this autumn and have been talking to many Crazyflie users. It is always great to talk to our users and to get feedback about how what we make is being used.

The Swarm bundle

One particular subject that stood out was the Loco positioning system (LPS): the LPS seems to be used by a lot of people, and we have gotten quite a lot of feedback about it, some about things that work, but also about things that could be improved. This is interesting, because we normally do not get that much feedback from people using the LPS.

We released the LPS about 2.5 years ago with Two-Way-Ranging support for a single Crazyflie, and it has been improved regularly since then, among other things by adding 2 TDoA modes that support multiple Crazyflies, and by releasing the Roadrunner board, a standalone LPS tag.

If you are using the LPS, it would be great to have some feedback about what you are using it for, what works, what does not work and (even better ;) any improvements that can be pushed back to the community. Do not hesitate to comment on this blog post, post on the forum, or open issues or pull requests in the LPS-node or Crazyflie-firmware github projects.

Hey there, my name is Zhouxin. I was born in the Netherlands and I still live there, but not for the upcoming four months, since I am going to be an intern here at Bitcraze. I am really looking forward to contributing and being part of this team! In this blog post I will share my motivation for interning here and something about myself.

I am doing this internship as part of my studies at TU Delft, a university in the Netherlands, but the main reason is that I like the technical challenges related to Bitcraze’s products and the dynamic work environment. I am convinced that I can learn a lot here, both about practical things like working in a tech company, and about the technical challenges of developing code for real applications such as the Crazyflie 2.1.

As mentioned before, this internship is part of my studies; at the moment I am studying for my Master’s degree in Aerospace Engineering. Within this degree there are profiles, and I chose the profile Control and Simulation, which is mainly focused on control and navigation systems in aviation. This still sounds quite general, so I will give a few examples. A graduate from this profile might work on the autopilot of an aircraft, on simulators for pilot training, on air traffic management systems, or on autonomous micro air vehicles. The latter is something I am interested in, and it is one of the reasons I am doing my internship at Bitcraze.

As a child I was always intrigued by how birds can fly, which led to my own desire to fly. I tried flying several times by wearing a cape and jumping off the couch, trying to optimize the airtime by flapping my cape. This gave me an adrenaline boost, but I was never able to fly. Later I discovered that the only way for humans to fly is to become a pilot, and this became my new dream. When I was about 8 years old I started to practice flying by controlling RC airplanes. This got me interested in electronics and technology, which later translated into pursuing my degree in Aerospace Engineering.

Besides my academic interests I also occupy myself with other activities. About two years ago I took a gap year and went traveling in East Asia, where I discovered my enjoyment of nature and of exploring cultures. I also love to snowboard in the French Alps during the winter holidays and to share these wintersport adventures with friends. When I am not traveling or abroad, I enthusiastically play field hockey or tennis.

As we talked about in a previous post, it is more than time to implement a higher abstraction layer in the Crazyflie firmware, to make it easy to implement custom automation and programs on top of the flying platform. In this post we will try to explain the state of the art and where we are thinking of heading. This is mostly a request for comments, and we have created a github ticket to discuss it.

DELFT – Swarm of drones at TU Delft. – PHOTO GUUS SCHOONEWILLE

The out-of-tree build and P2P API presented in the previous post are a great start: they make it possible to build projects on top of the Crazyflie firmware that can easily be maintained over time, and to communicate directly between Crazyflies without having a PC in the loop. However, we have not yet completely designed or documented the API that can be called by programs written on top of the Crazyflie; this is what the app layer is supposed to provide.

The current plan for the app layer is to make the same functionality that is available in the Python crazyflie lib API accessible from within the Crazyflie firmware, using similar API calls. This way we get the possibility of prototyping functionality in python code on a remote machine and, when it is working, easily converting it to an app onboard. This is already implemented, in part, for the log and param API as well as for the low-level parts of the commander. It has enabled us to write programs like the multiranger push demo and SGBA from Kimberly’s paper. The API is not yet documented properly and the function calls do not yet look like the ones on the python lib side, but our intention is to converge the APIs over time.
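As a taste of what this looks like today, here is a sketch in the spirit of the multiranger push demo: an app that reads a log variable and feeds velocity setpoints to the commander, entirely from within the firmware. The calls shown exist in the current firmware, but, as noted above, they may still change as the APIs converge:

```c
/* Sketch of the current in-firmware API: read the front multiranger
 * distance from the log framework and back away from close obstacles by
 * sending velocity setpoints, all from an onboard app. */
#include <stdint.h>

#include "app.h"
#include "FreeRTOS.h"
#include "task.h"
#include "log.h"
#include "commander.h"

void appMain() {
  logVarId_t idFront = logGetVarId("range", "front");
  setpoint_t setpoint = {0};

  while (1) {
    vTaskDelay(M2T(100));

    uint16_t frontMm = logGetUint(idFront);  /* distance in mm */

    /* Back away at 0.2 m/s when an obstacle is closer than 30 cm */
    setpoint.mode.x = modeVelocity;
    setpoint.velocity.x = (frontMm < 300) ? -0.2f : 0.0f;
    setpoint.velocity_body = true;
    commanderSetSetpoint(&setpoint, 3);
  }
}
```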

We think that having the same level of functionality for Log, Param and Commander within a Crazyflie app as in the python API will already make it possible to implement a lot of onboard programs much more easily than before. If there is anything else you think would be interesting to develop in this field, do not hesitate to drop a comment on this post or in the github issue.

This week we are exhibiting at IROS in Macau. We are running our fully autonomous demo based on the Lighthouse positioning technology and charging pads. We have also brought some prototypes to show, for instance the Crazyflie Bolt, the AI deck and the Active marker deck. You can read more about the demo on the IROS 2019 page.

We’d love to hear what you are working on, discuss issues, possibilities or new products. If you are at IROS, drop by our booth (B34) and say hi!

Lighthouse yaw

We have not only been preparing for IROS, we have also been working on improving the lighthouse positioning system. Recently we added a (slightly hackish) solution for updating the yaw with data from the Lighthouse deck. This means that it is no longer necessary to start the Crazyflie facing the positive X direction when using the Lighthouse deck: the Crazyflie will know its heading and act accordingly.

Two Crazyflies facing a random direction, take off and rotate to yaw=0.

We are also working on integrating the Lighthouse deck in a better way into the Kalman filter. If everything goes according to plan, this will enable a Crazyflie to fly with only one base station, and be more robust when using two base stations.

During the last four years of my PhD at the TU Delft and the MAVlab, we were determined to figure out how to make a swarm/group of tiny quadcopters fly through and explore an unknown indoor environment. This was not easy, as there were many sub-challenges that needed to be solved first. However, we are happy to say that we were able to show a proof of concept in the latest Science Robotics issue! Here you can see the press release from the TU Delft for general information about the project.

Since we used the Crazyflie 2.0 to achieve this result, in this blog post we want to highlight the technical side of the research: the achievements and the challenges we had to face. Moreover, we will also explain the updated code, which uses the new features of the Crazyflie firmware as explained in the previous blog post.

A swarm of drones exploring the environment, avoiding obstacles and each other. (Guus Schoonewille, TU Delft)

Hardware

In the paper, we presented a technique called the Swarm Gradient Bug Algorithm (SGBA), which borrows (as the name suggests) navigational elements from the path-planning techniques called ‘bug algorithms’ (see this paper for an overview). The basic principle is that SGBA is a state machine with several simple behavior presets, such as ‘going to the goal’, ‘wall-following’ and ‘avoiding other Crazyflies’. Below you can see all the modules that were used. For the main experiments (on the left), the Crazyflie 2.0s were equipped with the Multiranger and the Flow deck (here we used the Flow deck v1). On the right you see the Crazyflies used for the application experiment, where we made a custom Multiranger deck (with four VL53L0x sensors) and added a Hubsan camera module. For both we used the Turnigy nanotech 300 mAh (1S 45-90C) LiPo battery to increase the flight time to 7.5 min.

Hardware used in the experiments. Adapted from the Science Robotics paper.

Experiments

With this, we were able to have 6 Crazyflies explore an empty office floor in the faculty building of Aerospace Engineering. They started out in the middle of the test environment and all flew off in different preferred directions, which they maintained using their internally estimated yaw angle. With the Multirangers, they managed to detect walls in their path and followed the wall’s border until the way was clear again to continue in their preferred direction. Based on their local odometry measurements with the Flow deck, the Crazyflies detected if they were flying in a loop, in order to get out of rooms or other trapping situations.

A little before halfway through their battery life, they would try to get back to their initial position, which they did by measuring the Received Signal Strength Indication (RSSI) of the Crazyradio PA home beacon, which was located at their initial starting position. During wall-following, they measured the gradient of the RSSI to determine in which direction it increased or decreased, in order to estimate the direction back to the goal.
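In pseudo-C, the gradient estimation boils down to comparing a smoothed RSSI now against the smoothed RSSI a moment ago. This is a simplified reconstruction for illustration; the actual SGBA implementation (linked in the updated code section below) differs in its details:

```c
/* Simplified reconstruction of RSSI-gradient homing: low-pass filter the
 * home beacon's RSSI and use the sign of its change to decide whether the
 * current wall-following direction leads toward or away from the beacon.
 * Here a larger value is assumed to mean a stronger signal. */
#include <stdint.h>

static float rssiFiltered;
static float rssiPrevious;

/* Call for every packet received from the home beacon */
void rssiSample(uint8_t rawRssi)
{
  const float alpha = 0.1f;  /* smoothing factor, illustrative value */
  rssiFiltered = (1.0f - alpha) * rssiFiltered + alpha * (float)rawRssi;
}

/* Call periodically, e.g. once per second: +1 means we seem to be getting
 * closer to the beacon, -1 means we are moving away and should turn. */
int rssiGradientSign(void)
{
  float gradient = rssiFiltered - rssiPrevious;
  rssiPrevious = rssiFiltered;
  return (gradient >= 0.0f) ? +1 : -1;
}
```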

While they were navigating, they were also communicating with each other by means of broadcast messages. Based on the RSSI of those messages, they could sense other Crazyflies approaching, which they used first of all for collision avoidance (by letting the low-priority CFs move out of the way of the high-priority CFs). Second of all, during the initial exploration phase, they also communicated their preferred directions, so that a Crazyflie could change its exploration behavior to not conflict with the others. This way, we tried to maximize the area explored by the Crazyflies.

For a better understanding, one of those experiments with 6 Crazyflies can be seen in this video:

We also performed an application experiment where 4 Crazyflies with the camera modules searched for 2 dummies in the same environment.

Challenges

In order to get the results presented above, there were many challenges to overcome during the development phase. Here is a list that explains a couple of the elements that needed to work flawlessly:

  • Single CF robustness: We used the Flow deck v1 for the ‘deadlock’ detection and the basic velocity control, which was challenging in the testing environment because of the low lighting conditions and sparse floor texture. Therefore the Crazyflies flew at 0.5 meters altitude in order to ensure robustness. The wall-following was performed solely using the Multiranger. This was tested in many situations and it was able to handle a lot of types of obstacles without any problem. However, the limited FOV of the laser range finders cannot detect all types of obstacles, for instance thin or irregular ones such as plants. Luckily these were not encountered in the environment the Crazyflies flew in, but to increase robustness we will need to consider adding a camera to the navigation as well.
  • Communication base-station: In essence, SGBA only needs one Crazyradio PA base station, since all the behavior runs completely on board. However, in order to gather results for the paper, the CFs needed to communicate information back, like odometry and state. As this was two-way communication (the CFs needed the RSSI to get back home), each Crazyflie needed its own base station. They also all needed to be on different channels to avoid packet collisions and RSSI accumulation.
  • Communication peer-to-peer: At development time, P2P didn’t exist yet, so we had to implement broadcast communication between the Crazyflies ourselves. Since the previous point required them to listen on different channels, the NRF had to be configured to send separate broadcast messages on all those channels as well. In order to time this properly, the home beacon had to sync the Crazyflies by sending out a timer signal. Even so, the avoidance maneuvers were done very conservatively to try to prevent inter-drone collisions.

Many of the issues, especially the communication challenges, will be solved with the updated code implementation as explained in the next section.

Updated code

The firmware that the Crazyflies used to fly in the experiments shown in the paper can all be found in this public repository. However, the code is based on quite an old version of the Crazyflie firmware, as it was forked almost a year ago. The implementations of the SGBA state machine and the P2P broadcasting were not generic enough to integrate back into the development cycle, therefore the current code is only suitable for the old Crazyflie 2.0.

Therefore, we developed two major changes to the latest firmware which will make it much easier for me (and others as well, we hope!) to implement SGBA and the P2P communication in a way that should be compatible with any version of the firmware (and hardware) from here on. We implemented SGBA as an app-layer application and handled all the broadcast messaging directly from this layer as well. Please check out this Github repository with the new app-layer implementation of SGBA.

Over time the scope of the Crazyflie has changed a lot. At first, the Crazyflie was “just flying”, with the only possible control being attitude (roll, pitch, yaw) and thrust setpoints sent from the radio. Soon after, autonomous flight was investigated, first by implementing a position controller outside the Crazyflie and then, over time, by moving position control onboard and sending position or trajectory setpoints to the Crazyflie. Now that the Crazyflie has good position control, the next step is to implement autonomous behavior, and until now the most practical way has been to do this from code running on an external computer. Similarly to what happened in the history of position control (first off-board and now onboard), it needs to be possible to implement autonomous behavior in the Crazyflie itself. This blog post is about two newly implemented capabilities that allow automation to be implemented in the Crazyflie firmware in an easy and maintainable way, namely the ‘App layer’ and P2P communication.

App layer

The “App layer” is a term we have been using internally at Bitcraze to describe a set of functionalities that allow users to implement their own code in the Crazyflie. This includes the infrastructure to compile and maintain external code running in the Crazyflie, as well as a set of APIs to control flight and behavior from C code rather than from radio communication.

Last week we implemented the first step of the App layer: the infrastructure part. It is now possible to build the Crazyflie firmware out-of-tree. This means that a project can point to the Crazyflie firmware folder and compile a firmware from the project folder, without touching the Crazyflie firmware folder itself. Practically, it allows you to create a git repository implementing custom firmware code that has the Crazyflie firmware as a sub-repository. This makes the maintenance of custom firmware code much easier than maintaining a branch of the Crazyflie firmware, as was previously required.

A second piece that has been implemented is the app entry point. It allows code to start running by just defining an “appMain()” function. The function will be called from a dedicated FreeRTOS task after the Crazyflie has initialized and started. This should make it much easier to get started.
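A minimal app now looks like this (closely following the hello-world example in the firmware repository):

```c
/* Minimal app using the new entry point: appMain() runs in its own
 * FreeRTOS task once the Crazyflie has initialized and started. */
#include "app.h"
#include "FreeRTOS.h"
#include "task.h"
#include "debug.h"

#define DEBUG_MODULE "HELLOWORLD"

void appMain() {
  DEBUG_PRINT("Waiting is over, the app is running!\n");

  while (1) {
    vTaskDelay(M2T(2000));  /* M2T converts milliseconds to OS ticks */
    DEBUG_PRINT("Hello World!\n");
  }
}
```

The out-of-tree part is handled by the project’s Makefile, which essentially just points CRAZYFLIE_BASE at the firmware folder and includes the firmware’s own Makefile.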

As a complete example, we have extracted the multiranger push demo into a standalone git repository. It demonstrates the implementation of autonomous behavior using this new infrastructure.

Peer to Peer communication

The Crazyflie has been used for a lot of research related to swarming; some examples are the Crazyswarm project or the work done by Carnegie Mellon University. However, it is now time to turn it up a notch. On the forum and on the Github repository there have been several requests to enable direct peer-to-peer communication between Crazyflies. Now we have finally found time to work on it and implement some basic functionality on the NRF and STM sides of the firmware.

Currently, it is possible to send and receive a P2P packet in broadcast mode directly from the STM (see how to do this in the documentation). This enables data to be sent from one Crazyflie to another with a maximum payload of 60 bytes. We were able to stress-test this with our test rig by sending broadcast messages in a round-robin fashion, where the broadcast message was relayed through 10 Crazyflies in 10-20 ms. Even though the current implementation is very minimal for now, we were able to fix some existing issues in the radiolink framework along the way.
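As a sketch of how the STM-side API is used (the function and type names below follow the current documentation, but check it for the up-to-date signatures):

```c
/* Sketch of STM-side P2P broadcast: register a callback for incoming
 * packets and send a small broadcast payload (up to 60 bytes). */
#include <string.h>
#include "radiolink.h"

static void p2pCallback(P2PPacket *p)
{
  /* p->port, p->size and p->data describe the received broadcast;
   * p->rssi gives an indication of how close the sender is. */
}

void p2pExampleInit(void)
{
  p2pRegisterCB(p2pCallback);
}

void p2pExampleSend(void)
{
  P2PPacket pk;
  pk.port = 0;  /* application-chosen port number */
  const char msg[] = "hello";
  memcpy(pk.data, msg, sizeof(msg));
  pk.size = sizeof(msg);
  radiolinkSendP2PPacketBroadcast(&pk);
}
```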

We will not stop there, as we are hoping to implement a communication system similar to how the CRTP protocol has been implemented. We are getting a lot of help from our active community members, so check out this github issue to stay up to date with the current discussion.