It’s time for a new compilation video about how the Crazyflie is used in research! The last one already featured a lot of awesome work, but a lot has happened since then, both in research and at Bitcraze.
As usual, the hardest part about making these videos is choosing the work we want to feature – if every cool video of the Crazyflie were in there, it would last for hours! So it’s just a selection of the most videogenic projects we’ve seen. You can find a more extensive list of our products used in research here.
We’ve seen a lot of projects that used the modularity of the Crazyflie to create awesome new features, like a catenary robot, wall tracking, or landing upside down. The Crazyflie board was even made into a revolving-wing drone. New sensors were used to sniff out gas leaks (the Sniffy Bug, as seen in this blog post) or to allow autonomous navigation. Swarms are also a research topic where we see a lot of the Crazyflie, this time for collision avoidance or path planning. We also see more and more simulators, which are used for huge swarms or physics tests.
Once again, we were surprised and awed by all the awesome things that the community did with the Crazyflie. Hopefully this will inspire others to think of new things to do as well. We hope that we can keep helping you make your ideas fly, and don’t hesitate to share the awesome projects you’re working on!
Here is a list of all the research that has been included in the video:
This year, the traditional Christmas video was overtaken by a big project that we had at the end of November: creating a test show with the help of CollMot.
First, a little context: CollMot is a show company based in Hungary that we’ve partnered with on a regular basis, brainstorming about show drones and discussing possibilities for indoor drone shows in general. They developed Skybrush, an open-source software suite for controlling swarms. We had wanted to work with them for a long time.
So, when the opportunity came to rent an old train hall that we visit often (because it’s right next to our office and hosts good street food), we jumped on it. The place itself is huge, with massive pillars, pits for train maintenance, a high ceiling with metal beams and a really funky industrial look. The idea was to do a technology test and find out whether we could scale up the Loco Positioning system to a larger space. This was also the perfect time to invite the guys at CollMot for some exploring and hacking.
The train hall
The Loco system
We added the TDoA3 Long Range mode recently, and experiments in our test lab indicated that the Loco Positioning system should work in a bigger space with up to 20 anchors, but we had not actually tested it in a larger space.
The maximum radio range between anchors is probably around 40 meters in Long Range mode, but we decided to set up a system that was only around 25×25 meters, with 9 anchors in the ceiling and 9 anchors on the floor, placed in 3×3 matrices. The reason we did not go bigger is that the height of the space is around 7-8 meters, and we did not want to end up with a system that is too wide in relation to its height, as this would reduce Z accuracy. This setup gave us 4 cells of 12×12×7 meters, which should be OK.
Finding a solution to get the anchors up to the 8-meter ceiling – and down again easily – was also a head-scratcher, but with some ingenuity (and meat hooks!) we managed to create a system. We only had the hall for 2 days before filming at night, and setting up the anchors on the ceiling took a big chunk out of the first day.
Drone hardware
We used 20 Crazyflie 2.1s equipped with the Loco deck, LED rings, thrust upgrade kit and Tattu 350 mAh batteries. We soldered the pin headers to the Loco decks for better rigidity, but also because it adds a bit more height adjustability for the 350 mAh battery, which is a bit thicker than the stock battery. To make the LED ring more visible from the sides we created a diffuser that we 3D-printed in white PLA. The full assembly weighed in at 41 grams. With the LED ring lit up almost all of the time, we concluded that the show flight should not be longer than 3-4 minutes (with some flight-time margin).
The show
CollMot, on their end, designed the whole show using Skyscript and Skybrush Studio. The aim was to have relatively simple and easily changeable formations, to be able to test a lot of different things, like the large area, speed, and synchronicity. They joined us on the second day to implement the choreography and share their knowledge about drone shows.
We got some time afterwards to discuss a lot of things, and enjoy some nice beers and dinner after a job well done. We even had time on the third day, before dismantling everything, to experiment a lot more in this huge space and got some interesting data.
What did we learn?
Initially we had problems with positioning: we got outliers and sometimes lost tracking. We finally managed to trace the problems to the outlier filter. The filter was written a long time ago, and the current implementation was optimized for 8 anchors in a smaller space, which did not really work in this setup. After some tweaking the problem was solved, but in the future we need to improve the filter to generically support different system setups.
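To give a flavor of what such a filter does (a rough, hypothetical Python illustration only – the real filter lives in the C firmware and is more elaborate):

```python
import math

# Hypothetical sketch: reject a TDoA measurement when its innovation (the
# measured difference of distances minus the difference predicted from the
# current position estimate) exceeds a threshold. A threshold tuned for a
# small 8-anchor space can wrongly reject valid measurements in a 25x25 m
# setup, which is the kind of problem we ran into.
def is_outlier(tdoa_measured, anchor_a, anchor_b, est_pos, max_error_m=1.5):
    predicted = math.dist(est_pos, anchor_b) - math.dist(est_pos, anchor_a)
    return abs(tdoa_measured - predicted) > max_error_m
```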
Another problem we observed is that the Z estimate tends to get an offset that “sticks” and is not corrected over time. We do not really understand this yet, and it will require more investigation.
The outlier filter was the only major problem that we had to solve; otherwise the Loco system mainly performed as expected, and we are very happy with the result! The changes to the firmware are available in this slightly hackish branch.
We also spent some time testing maximum velocities. For horizontal velocities, the Crazyflies started losing positioning above 3 m/s. They could probably go much faster, but the outlier filter started having problems at higher speeds. The overshoot also became larger the faster we flew, which most likely could be solved with better controller tuning. For vertical velocity, 3 m/s was also the maximum, limited by the deceleration when coming downwards. Some improvements can be made here.
The conclusion is that many things work really well, but there are still some optimizations and improvements that could be made to make it even more robust and accurate.
The video
But enough talking – here is the never-seen-before New Year’s Eve video:
And if you’re curious to see behind the scenes:
Thanks to CollMot for their presence and valuable expertise, and InDiscourse for arranging the video!
And with the final blogpost of 2022 and this amazing video, it’s time to wish you a nice New Year’s Eve and a happy beginning of 2023!
Tiny quadcopters like the Crazyflie can be operated in narrow, cluttered environments and in proximity to humans, making them the perfect candidate for search-and-rescue operations, monitoring crops in a greenhouse, or performing inspections in places other flying robots cannot reach. All these applications benefit from autonomy, allowing deployment without a base station or human operator nearby and permitting swarming behavior.
Achieving autonomous navigation on nano quadcopters is challenging given the highly constrained payload and computational power of the platform. Most attention has been given to monocular solutions; the camera is a lightweight and energy-efficient passive sensor that captures rich information about the environment. One of the most important monocular visual cues is optical flow, which has been exploited on MAVs with higher payload capacity for obstacle avoidance [1], depth estimation [2] and several bio-inspired methods for autonomous navigation [3–7].
Optical flow describes the apparent visual motion caused by relative motion between an observer and their surroundings. This rich visual cue contains entangled information about velocity and depth. However, calculating optical flow is expensive. The field of optical flow estimation is, and has been for a couple of years, dominated by convolutional neural networks (CNNs). Despite efforts to find architectures of reduced size and latency [8-10], these methods are still highly computationally expensive, running at several to tens of FPS on modern desktop GPUs and requiring millions of parameters, rendering them incompatible with edge hardware.
To this end, we present “NanoFlowNet: Real-Time Dense Optical Flow on a Nano Quadcopter”, submitted to an international robotics conference, which introduces NanoFlowNet, a CNN architecture designed for real-time, fully on-board, dense optical flow estimation on the AI-deck.
CNN architecture
We adopt the semantic segmentation CNN STDC-Seg [11] and modify it for optical flow estimation. While the resulting CNN architecture may be considered “real-time” on desktop hardware, for deployment on edge devices such as a nano quadcopter the network must be shrunk significantly. We improve the latency of the architecture in three ways.
First, we redesign the key convolutional module of the architecture, the Short-Term Dense Concatenate (STDC) module. By reordering the operations within the strided variant of the module, we save, depending on the location of the module within the architecture, from over 10% to over 50% of the MAC operations per module, while increasing the number of output filters with a large receptive field size. A large receptive field size is desirable for optical flow estimation.
Second, inspired by MobileNets [12], we globally replace ‘regular’ convolutions with depthwise separable convolutions. Depthwise separable convolutions factorize a convolution into a depthwise and a pointwise convolution, greatly reducing the computational expense at a cost in representational capacity.
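As a generic illustration (a PyTorch sketch, not our actual GAP8 implementation), a depthwise separable replacement for a regular convolution looks like this:

```python
import torch.nn as nn

# A KxK convolution applied per input channel (depthwise), followed by a
# 1x1 convolution that mixes channels (pointwise). For K=3 this typically
# costs roughly 8-9x fewer MACs than a regular 3x3 convolution.
class DepthwiseSeparableConv(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size, stride=stride,
                                   padding=kernel_size // 2, groups=in_ch,
                                   bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))
```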
Third, we reduce the input dimensionality. We train and run the network on grayscale input images, reducing the on-board memory required for storing images by two thirds. Any memory saved in the AI-deck’s L2 memory can be handed to the AutoTiler for storing the CNN architecture, speeding up on-board execution. Since more of a speed-up was required, we run the CNN on board at a reduced input resolution of 160×112 pixels. Besides the speed-up through saved L2 memory, reducing the input resolution makes all operations throughout the network cheaper. We downscale the training data to closely match the target resolution. Both of these changes come at a loss of input information: we will miss small objects and small displacements that are not captured at this resolution.
To give some intuition of the available memory: estimating optical flow requires two input images. Storing two color input images at full resolution requires (2 × 324 × 324 × 3 =) 630 kB, while the AI-deck has only 512 kB of L2 memory available.
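For comparison, two grayscale frames at the reduced 160×112 resolution occupy only (2 × 160 × 112 =) roughly 36 kB under the same one-byte-per-pixel assumption, leaving the bulk of the L2 memory for the network itself.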
Motion boundary detail guidance
Inspired by STDC-Seg, we guide the training of optical flow with a train-time-only auxiliary task to promote the encoding of spatial information in the early layers. Specifically, we introduce a motion boundary prediction task to the net. The motion boundary ground truth can be found in the optical flow datasets. This improves performance by 0.5 EPE on the MPI Sintel clean (train) benchmark, at zero cost to inference latency.
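Conceptually (a hypothetical sketch, not the actual training code), the combined objective looks like this; the boundary head is discarded at inference, so on-board latency is untouched:

```python
import torch
import torch.nn.functional as F

# Flow supervision (average end point error) plus an auxiliary motion
# boundary prediction loss, weighted by w. Only used at training time.
def training_loss(flow_pred, flow_gt, boundary_logits, boundary_gt, w=1.0):
    epe = torch.norm(flow_pred - flow_gt, dim=1).mean()
    aux = F.binary_cross_entropy_with_logits(boundary_logits, boundary_gt)
    return epe + w * aux
```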
Performance on MPI Sintel
Given the scaling and conversion to grayscale of the input data, our network is not directly comparable with results reported by other works. For comparison, we retrain one of the fastest networks in the literature, FlowNet2-s [13], on the same data. Given the reduction in resolution, we drop the deepest two layers to maintain a reasonable feature size and name the resulting model FlowNet2-xs.
We benchmark the performance of the architecture on the MPI Sintel optical flow dataset. NanoFlowNet performs better than FlowNet2-xs, despite using less than 10% of the parameters. NanoFlowNet achieves 5.57 FPS on the AI-deck; FlowNet2-xs does not fit on the AI-deck due to its network size. To put the achieved latency of NanoFlowNet in perspective, we execute FlowNet2-xs’ first two convolutions and the final prediction layer on the GAP8. This three-layer architecture achieves 4.96 FPS, which is slower than running the entire NanoFlowNet. On a laptop GPU, the two architectures achieve similar latency.
Method      | MPI Sintel clean (train) [EPE] | MPI Sintel final (train) [EPE] | GPU frame rate [FPS] | GAP8 frame rate [FPS] | Parameters
FlowNet2-xs | 9.054                          | 9.458                          | 150                  | –                     | 1,978,250
NanoFlowNet | 7.122                          | 7.979                          | 141                  | 5.57                  | 170,881
Performance on the MPI Sintel (train) subset. (Average) End Point Error (EPE) describes how far off the estimated flow vectors are on average; lower is better.
Qualitative comparison of optical flow estimates by NanoFlowNet(-s) and FlowNet2-xs on the MPI Sintel (train) clean pass: input frame, ground truth optical flow, NanoFlowNet (ours), and FlowNet2-xs. NanoFlowNet and NanoFlowNet-s pick up on smaller moving objects, such as the person in the bottom row.
Obstacle avoidance implementation
We demonstrate the effectiveness of NanoFlowNet by implementing it in a simple, proof-of-concept obstacle avoidance application on an AI-deck-equipped Crazyflie. We let the quadcopter fly forward at a constant velocity and implement the horizontal balance strategy [14], [15], in which the quadcopter balances the optical flow in the left and right half-planes by yawing.
We equip a Crazyflie with the Flow deck for positioning only. The total flight platform weighs 34 grams.
We augment the balance strategy by implementing active oscillations (a cyclic up-down movement), which generate additional optical flow across the field of view. This is particularly helpful for avoiding obstacles in the direction of horizontal travel, since no optical flow is generated at the focus of expansion.
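In pseudo-Python (hypothetical variable names, not the code from the paper), the resulting yaw command boils down to comparing the total flow in the two half-planes:

```python
import numpy as np

# Horizontal balance strategy: yaw away from the image half that contains
# more optical flow, i.e. the side where obstacles appear closer.
def balance_yaw_rate(flow_magnitude, gain=1.0):
    # flow_magnitude: HxW array of per-pixel optical flow magnitudes
    half = flow_magnitude.shape[1] // 2
    left = float(flow_magnitude[:, :half].sum())
    right = float(flow_magnitude[:, half:].sum())
    # Positive output -> yaw right (away from the left half), and vice versa
    return gain * (left - right) / max(left + right, 1e-6)
```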
The obstacle avoidance implementation is demonstrated in an open and a cluttered environment in ‘the Cyber Zoo’, an indoor flight arena at the faculty of Aerospace Engineering at the Delft University of Technology. The control algorithm is most robust in the open environment, with the quadcopter managing to drain a full battery without crashing. In the cluttered environment, performance is more variable. Especially when obstacles are close to one another, the quadcopter tends to avoid the first obstacle successfully, only to turn straight into the second and crash. Adding head-on collision detection based on focus-of-expansion (FOE) detection and divergence estimation (e.g., [7]) should help avoid obstacles in these cases.
Successful run in a cluttered environment in the Cyber Zoo. The Crazyflie manages to avoid collision until the battery is drained.
All in all, we consider the result a successful demonstration of the optical flow CNN. In future work, we expect to see applications that take more advantage of the resolution of the flow information.
Citation
Bouwmeester, R. J., Paredes-Vallés, F., De Croon, G. C. H. E. (2022). NanoFlowNet: Real-time Dense Optical Flow on a Nano Quadcopter. arXiv. https://doi.org/10.48550/arXiv.2209.06918
References
[1] Gao, P., Zhang, D., Fang, Q., & Jin, S. (2017). Obstacle avoidance for micro quadrotor based on optical flow. Proceedings of the 29th Chinese Control and Decision Conference, CCDC 2017, 4033–4037. https://doi.org/10.1109/CCDC.2017.7979206
[2] Sanket, N. J., Singh, C. D., Ganguly, K., Fermuller, C., & Aloimonos, Y. (2018). GapFlyt: Active vision based minimalist structure-less gap detection for quadrotor flight. IEEE Robotics and Automation Letters, 3(4), 2799–2806. https://doi.org/10.1109/LRA.2018.2843445
[3] Conroy, J., Gremillion, G., Ranganathan, B., & Humbert, J. S. (2009). Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Autonomous Robots, 27(3), 189–198. https://doi.org/10.1007/s10514-009-9140-0
[4] Zingg, S., Scaramuzza, D., Weiss, S., & Siegwart, R. (2010). MAV navigation through indoor corridors using optical flow. Proceedings – IEEE International Conference on Robotics and Automation, 3361–3368. https://doi.org/10.1109/ROBOT.2010.5509777
[5] De Croon, G. C. H. E. (2016). Monocular distance estimation with optical flow maneuvers and efference copies: A stability-based strategy. Bioinspiration and Biomimetics, 11(1). https://doi.org/10.1088/1748-3190/11/1/016004
[6] Serres, J. R., & Ruffier, F. (2017). Optic flow-based collision-free strategies: From insects to robots. Arthropod Structure and Development, 46(5), 703–717. https://doi.org/10.1016/j.asd.2017.06.003
[7] De Croon, G. C. H. E., De Wagter, C., & Seidl, T. (2021). Enhancing optical-flow-based control by learning visual appearance cues for flying robots. Nature Machine Intelligence, 3(1), 33–41. https://doi.org/10.1038/s42256-020-00279-7
[8] Ranjan, A., & Black, M. J. (2017). Optical flow estimation using a spatial pyramid network. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 2720–2729. https://doi.org/10.1109/CVPR.2017.291
[9] Hui, T. W., Tang, X., & Loy, C. C. (2018). LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8981–8989. https://doi.org/10.1109/CVPR.2018.00936
[10] Sun, D., Yang, X., Liu, M. Y., & Kautz, J. (2017). PWC-Net: CNNs for Optical Flow Using Pyramid, Warping, and Cost Volume. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 8934–8943. https://doi.org/10.1109/CVPR.2018.00931
[11] Fan, M., Lai, S., Huang, J., Wei, X., Chai, Z., Luo, J., & Wei, X. (2021). Rethinking BiSeNet For Real-time Semantic Segmentation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 9711–9720. https://doi.org/10.1109/CVPR46437.2021.00959
[12] Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., & Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. In arXiv. arXiv. http://arxiv.org/abs/1704.04861
[13] Ilg, E., Mayer, N., Saikia, T., Keuper, M., Dosovitskiy, A., & Brox, T. (2017). FlowNet 2.0: Evolution of optical flow estimation with deep networks. Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition, 1647–1655. https://doi.org/10.1109/CVPR.2017.179
[14] Souhila, K., & Karim, A. (2007). Optical flow based robot obstacle avoidance. International Journal of Advanced Robotic Systems, 4(1), 2. https://doi.org/10.5772/5715
[15] Cho, G., Kim, J., & Oh, H. (2019). Vision-based obstacle avoidance strategies for MAVs using optical flows in 3-D textured environments. Sensors, 19(11), 2523. https://doi.org/10.3390/s19112523
Christmas is just around the corner, and it’s time for the traditional Christmas video! This year we wanted to use the AI deck, as we’ve been working hard on this deck for some time now, and showcasing its new features in this festive video seemed like the best idea.
Santa this year needed help to find and get the presents delivered, so he asked for help from Bitcraze! Let’s see how it played out:
I’m sure you’re wondering how we managed to set this up, so let’s discuss how we did it!
Picking up the packages
The goal was to pick up some packages and place them in the sleigh, and it all worked out pretty well. All the flying in the video is scripted using the Python lib, and positioning is done using Qualisys’ motion capture system with Active marker decks. The trajectories are hard-coded, and with some careful adjustments we managed to lift and fly the 4 Crazyflies attached to the same present, even though it was a bit wobbly from time to time. Getting the present into the sleigh was not as easy, and we might have taken some shortcuts here, as well as when attaching/removing lines.
Sometimes detangling the wires was a team effort
Picking up the second package required some precision, but it went incredibly well – on the first try! The present was very light and needed someone to hold it, to prevent it from moving when the Crazyflie approached.
Getting the sleigh off
Our first tries included 5 strings attached directly to the sleigh, but it was difficult to get the right tension at the right time to have the UAVs actually pull the weight. Here came the “rubber band solution”: we just attached all the strings to a rubber band that was itself attached to the sleigh. That way, the tension could even out once all the Crazyflies were in the air and ready to pull the sleigh.
The rubber band
The AI deck camera/streamer
When we started this project, the intention was to run a neural network on the AI-deck to identify or classify the presents. We did not manage to get to a point where it actually added value to the story, so we settled for just streaming the video from the AI-deck camera on the scouting drone instead.
The AI-deck example with color camera viewer is still under development, but if you want to give it a try you can take a look at the readme in the github repo.
Bloopers
And finally, as an added bonus, if you ever wonder how many tries it takes to make 5 Crazyflies pull a sleigh, here is a little behind-the-scenes video too!
The last week was epic. We had 3 days of our online conference, the BAM days – I’m sure you’ve heard of them by now.
We are really happy with how everything went down. During those 3 days, 142 people attended, which is a higher number than we could have expected. The Welkom platform we used was stellar, allowing us to use Mibo rooms for very fruitful discussions after each talk.
Quiz and community Q&A
We took the opportunity to talk to our community, which is something we hadn’t had the opportunity to do in a long time. Your insights and feedback were greatly appreciated, and in the coming weeks we have a lot to think about regarding how to best use all the remarks we got.
We also had a short quiz about Bitcraze, and we were quite impressed with how you performed! Interestingly, the hardest question for you was how many decks we sell (it’s 16, if you want to cheat on our hypothetical next quiz).
As I’ve mentioned, after each event we gathered in Mibo rooms. Even though attendance there was not as high as we would have liked, we still got quality time with community members, speakers, collaborators, and even first-timers that were interested in the Crazyflie. We really love this platform, which makes us feel almost like we’re meeting in real life. We even had some karaoke in Mibo during the closing party (which MAY be a good excuse to end the day for those who listened).
Content
Our external speakers presented a lot of interesting work. It was a great pleasure and honor to welcome every one of them as they explained their latest work. I have to admit that it’s rewarding to see such smart people doing awesome and cool research with our products.
We did our share too, with workshops and demos. Kristoffer’s autonomous demo using distributed consensus required a lot of work but worked perfectly in the end. Here is a small excerpt:
What now?
Now we’re feeling the way everyone feels the day after a party: exhausted, happy, and wondering what to do next. Luckily, we have some plans for that!
If you missed the conference, we created a YouTube playlist where you can watch everything that you missed. Over the next few days we’ll update the event page with BAM’s presentations too, so you will have the opportunity to catch up. Some of our workshops will also turn into tutorials or documentation of some kind, but we’re still cleaning up.
We are so happy with how everything went that we are already thinking about a future BAM. This one was exceptional, of course, since it was originally meant to celebrate our 10-year anniversary (and I have to admit that we’re all a little bit tired after 3 intense days), so we’re not going to be able to top that. But we are considering making BAM a fixed point in our agenda (and yours, let’s hope). We don’t know how, we don’t know when, but one thing is sure: BAM is just beginning.
Kimberly on a different continent
On a totally different note, Kimberly is flying to the US this week: if any of you based in America want to take the opportunity to have a more time-zone-appropriate conversation with one of us, you will have a few weeks to make it possible!
It’s that time of the year again! As the days get darker and darker here in Sweden, we’re happy to be getting some time off to share some warmth with our families.
And to kick off the holiday season, we prepared a little treat for you! We enjoyed making a Christmas video that tested how we could use the Crazyflie at home. Since we’re not at the office anymore, we decided to fly in our homes, and this video shows the different ways to do so. First, take a look at what we’ve done:
Now let’s dig into the different techniques we used.
Tobias decided to fly the Bolt manually. His first choice was to land in the Christmas sock, but that was too hard, hence the hard landing on top of the tree. We were not sure who would survive: the tree or the Bolt!
Kimberly installed two base station V2s and, after setting up, determined some waypoints by holding the Crazyflie in her hand. She then generated a trajectory with the uav_trajectories project (like in the hyper demo), used the cflib to upload this trajectory, and made the Crazyflie fly all the way to the basket. Her two cats could have looked more impressed, though!
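For those curious, here is a rough sketch of that upload step, based on the cflib high-level commander examples (the URI, file name and timing are placeholders, and API details may vary between cflib versions):

```python
import time
import numpy as np
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.mem import MemoryElement, Poly4D
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'  # placeholder radio URI

# Each row from the uav_trajectories output: a duration followed by 8
# polynomial coefficients each for x, y, z and yaw (33 values per row).
rows = np.loadtxt('trajectory.csv', delimiter=',', skiprows=1)

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    cf = scf.cf
    traj_mem = cf.mem.get_mems(MemoryElement.TYPE_TRAJ)[0]
    for row in rows:
        duration = row[0]
        x, y, z, yaw = (Poly4D.Poly(row[1 + 8 * i:9 + 8 * i]) for i in range(4))
        traj_mem.poly4Ds.append(Poly4D(duration, x, y, z, yaw))
    traj_mem.write_data(lambda mem, addr: None)
    time.sleep(2.0)  # crude wait; the real examples wait for the write callback

    commander = cf.high_level_commander
    commander.define_trajectory(1, 0, len(traj_mem.poly4Ds))
    commander.takeoff(0.5, 2.0)
    time.sleep(3.0)
    commander.start_trajectory(1, time_scale=1.0, relative=True)
```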
Using trial and error, Barbara used the Flow deck, the motion commander, and a broken measuring tape to calibrate the Crazyflie’s path next to the tree.
Arnaud realized that, with all the autonomous work, we hardly ever fly the Crazyflie manually anymore. So he flew the Crazyflie manually. It required a bit more training than expected, but the Crazyflie is really a fun (and safe!) quad to fly.
Marcus used two Lighthouse V2 base stations together with the Lighthouse deck and LED-ring deck. For the flying, he used the high-level commander. The original plan was to fly around his gingerbread house, but unfortunately it was demolished before he got the chance (by some hungry elves, surely!).
Kristoffer made his own tree ornament with the drone, which turned out to be a nice addition to a Christmas tree!
It was a fun way to use our own product, and to show off our decorated houses.
I hope you enjoy watching this video as much as we enjoyed making it.
We are staying open during the holiday season, but at a limited capacity: we will still ship your orders and keep an eye on our emails and the forum, but things will get a bit slower here.
We wish you happy holidays and safe moments together with your loved ones.
Our team’s Crazyflie quadcopter was equipped with Bitcraze’s Optic Flow deck and Multi-ranger deck. A BH-1750 light intensity sensor was soldered to a Bitcraze Prototype deck to complete our hardware.
The Crazyflie 2.1 was the perfect robotics platform for an introduction to autonomous robotics at the University of Washington in the winter quarter of 2020. Our Bio-inspired Robotics graduate course completed a series of Crazyflie projects throughout the 10 weeks that built our skills in:
Python
Robot Operating System (ROS)
assembling custom sensors
writing new drivers
designing and testing control algorithms
troubleshooting and independent learning
The course was offered by UW Mechanical Engineering’s Autonomous Insect Robotics Laboratory, headed by Dr. Sawyer B. Fuller. The course was supported by PhD candidate Melanie Anderson, who has done fantastic research with her Crazyflie-based Smellicopter. The final project was an opportunity to turn a Crazyflie quadcopter into a bio-inspired autonomous robot. Our three-person team of UW robotics grad students included Nishant Elkunchwar, Krishna Balasubramanian, and Jessica Noe.
Light Seeking Run-and-Tumble Algorithm Inspired by Bacterial Chemotaxis
The goal for our team’s Crazyflie was to seek and identify a light source. We chose a run-and-tumble algorithm inspired by bacterial chemotaxis. For a quick explanation of bacterial chemotaxis, please see Andrea Schmidt’s explanation of chemotaxis on Dr. Mehran Kardar’s MIT teaching page. She provides a helpful animation here.
In both bacterial chemotaxis and our run-and-tumble algorithm, there is a body (the bacteria or the robot) that can:
move under its own power.
detect the magnitude of something in the environment (e.g., a chemical given off by a food source, or light intensity).
determine whether the magnitude is greater or less than it was a short time before.
This method works best if the environment contains a strong gradient from low concentration to high concentration that the bacteria or robot can follow towards a high concentration source.
The details of the run-and-tumble algorithm are shown in a finite state machine diagram below. The simple summary is that the Crazyflie takes off, begins moving forward, and if the light intensity is getting larger it continues to “Run” in the same direction. If the light intensity is getting smaller, it will “Tumble” to a random direction. Additional layers of decision making are included to determine if the Crazyflie must “Avoid Obstacle”, or if the source has been reached and the Crazyflie quadcopter should “Stop”.
The run-and-tumble algorithm represented as a finite state machine.
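In simplified Python (a hypothetical condensation of our control code, using our 0.5 m obstacle threshold and 800 lux stop threshold), one loop iteration of the state machine looks like this:

```python
import random

STOP_LUX = 800        # light intensity at which we consider the source found
OBSTACLE_M = 0.5      # Multi-ranger distance that triggers avoidance

def run_and_tumble_step(lux, prev_lux, nearest_obstacle_m):
    """Pick the action for one loop iteration of the finite state machine."""
    if lux >= STOP_LUX:
        return 'Stop'             # land: the source has been reached
    if nearest_obstacle_m < OBSTACLE_M:
        return 'Avoid Obstacle'
    if lux >= prev_lux:
        return 'Run'              # intensity rising: keep the current heading
    return 'Tumble'               # intensity falling: pick a random new heading

def tumble_heading():
    return random.uniform(0, 360)  # new heading in degrees
```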
Crazyflie Hardware
To implement the run-and-tumble algorithm autonomously on the Crazyflie, we needed a Crazyflie quadcopter and these additional sensors:
Bitcraze Optic Flow deck
Bitcraze Multi-ranger deck
Bitcraze Prototype deck with BH-1750 light intensity sensor
The Optic Flow deck was a key sensor in achieving autonomous flight. This sensor package determines the Crazyflie’s height above the surface and tracks its horizontal motion from the starting position along the x-direction and y-direction coordinates. With the Optic Flow installed, the Crazyflie is capable of autonomously maintaining a constant height above the surface. It can also move forward, back, left, and right a set distance or at a set speed. Several other pre-programmed movement behaviors can also be chosen. This Bitcraze blog post has more information on how the Flow deck works and this post by Chuan-en Lin on Nanonets.com provides more in-depth information if you would like to read more.
The Bitcraze Multi-ranger deck provided the sensor data for obstacle avoidance. The Multi-ranger detects the distance from the Crazyflie to the nearest object in five directions: forward, backward, right, left, and above. Our threshold to trigger the “Avoid Obstacle” behavior is detecting an obstacle within 0.5 meters of the Crazyflie quadcopter.
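As a hedged sketch (not our actual scripts, which go through ROS), the same “run or avoid” decision can be written directly against the cflib motion commander and Multi-ranger helpers:

```python
import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander
from cflib.utils.multiranger import Multiranger

URI = 'radio://0/80/2M'  # placeholder radio URI

def too_close(range_m, threshold=0.5):
    # The Multi-ranger reports None when nothing is within sensing range
    return range_m is not None and range_m < threshold

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with MotionCommander(scf, default_height=0.5) as mc:  # Flow deck holds height
        with Multiranger(scf) as ranger:
            for _ in range(100):            # ~10 s of flight
                if too_close(ranger.front):
                    mc.turn_left(90)        # avoid: turn away from the obstacle
                else:
                    mc.start_forward(0.2)   # run: cruise forward at 0.2 m/s
                time.sleep(0.1)
```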
The Prototype deck was a quick, simple way to connect the BH-1750 light intensity sensor to the pins of the Crazyflie to physically integrate the sensor with the quadcopter hardware. This diagram shows how the header positions connect to the rows of pads in the center of the deck. We soldered a header into the center of the deck, then soldered connections between the pads to form continuous connections from our header pin to the correct Crazyflie header pin on the left or right edges of the Prototype deck. The Bitcraze Wiki provides a pin map for the Crazyflie quadcopter and information about the power supply pins. A nice overview of the BH-1750 sensor is found on Components101.com, this shows the pin map and the 4.7 kOhm pull-up resistor that needs to be placed on the I2C line.
It was easy to connect the decks to the Crazyflie because Bitcraze clearly marks “Front”, “Up” and “Down” to help you orient each deck relative to the Crazyflie. See the Bitcraze documentation on expansion decks for more details. Once the decks are properly attached, the Crazyflie can automatically detect that the Flow and Multi-Ranger decks are installed, and all of the built-in functions related to these decks are immediately available for use without reflashing the Crazyflie with updated firmware. (We appreciated this awesome feature!)
Crazyflie Firmware and ROS Control Software
Bitcraze provides a downloadable virtual machine (VM) to help users quickly start developing their own code for the Crazyflie. Our team used a VM that was modified by UW graduate students Melanie Anderson and Joseph Sullivan to make it easier to write ROS control code in the Python coding language to control one or more Crazyflie quadcopters. This was helpful to our team because we were all familiar with Python from previous work. The standard Bitcraze VM is available on Bitcraze’s Github page. The Modified VM constructed by Joseph and Melanie is available through Melanie’s Github page. Available on Joseph’s Github page is the “rospy_crazyflie” code that can be combined with existing installs of ROS and Bitcraze’s Python API if users do not want to use the VM options.
“crazyflie-firmware” – a set of files written in C that can be uploaded to the Crazyflie quadcopter to overwrite the default firmware
In the Bitcraze VM, this folder is located at “/home/bitcraze/projects/crazyflie-firmware”
In the Modified VM, this folder is located at “Home/crazyflie-firmware”
“crazyflie-lib-python” (in the Bitcraze VM) or “rospy_crazyflie” (in the Modified VM) – a set of ROS files that allows high-level control of the quadcopter’s actions
In the Bitcraze VM, “crazyflie-lib-python” is located at “/home/bitcraze/projects/crazyflie-lib-python”
In the Modified VM, navigate to “Home/catkin_ws/src” which contains two main sets of files:
“Home/catkin_ws/src/crazyflie-lib-python” – a copy of the Bitcraze “crazyflie-lib-python”
“Home/catkin_ws/src/rospy_crazyflie” – the modified version of “crazyflie-lib-python” that includes additional ROS and Python functionality, and example scripts created by Joseph and Melanie
In the Modified VM, we edited the “crazyflie-firmware” files to include code for our light intensity sensor, and we edited “rospy-crazyflie” to add functions to the ROS software that runs on the Crazyflie. Having the VM environment saved our team a huge amount of time and frustration – we did not have to download a basic virtual machine, then update software versions, find libraries, and track down fixes for incompatible software. We could just start writing new code for the Crazyflie.
The Modified VM for the Crazyflie takes advantage of the Robot Operating System (ROS) architecture. The example script provided within the Modified VM helped us quickly become familiar with basic ROS concepts like nodes, topics, message types, publishing, and subscribing. We were able to understand and write our own nodes that published information to different topics and write nodes that subscribed to the topics to receive and use the information to control the Crazyflie.
A major challenge of our project was writing a new driver that could be added to the Crazyflie firmware to tell the Crazyflie system that we had connected an additional sensor to the Crazyflie’s I2C bus. Our team referenced open-source Arduino drivers to understand how the BH-1750 connects to an Arduino I2C bus. We also looked at the open-source drivers written by Bitcraze for the Multi-ranger deck to see how it connects to the Crazyflie I2C bus. By looking at all of these open-source examples and studying how to use I2C communication protocols, our team member Nishant Elkunchwar was able to write a driver that allowed the Crazyflie to recognize the BH-1750 signal and convert it to a sensor value that could be used by our ROS-based control system. That driver is available on Nishant’s GitHub. The driver needed to be placed into the appropriate folder: “…\crazyflie-firmware\src\deck\drivers\src”.
The second change to the crazyflie-firmware is to add a “config.mk” file in the folder “…\crazyflie-firmware\tools\make”. Information about the “config.mk” file is available in the Bitcraze documentation on configuring the build.
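As a hypothetical example (the actual define depends on the name the driver registers in its deck driver declaration), a “config.mk” that forces a custom deck driver to be compiled in can be as short as:

```
# config.mk (hypothetical): force-enable the custom BH-1750 deck driver
CFLAGS += -DDECK_FORCE=bcBH1750
```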
The final change to the crazyflie-firmware is to update the make file “MakeFile” in the location “…\crazyflie-firmware”. The “MakeFile” changes include adding one line to the section “# Deck API” and two lines to the section “# Decks”. Information about compiling the MakeFile is available in the Bitcraze documentation about flashing the quadcopter.
Making additions to the ROS control architecture
The ROS control architecture includes messages. We needed to define 3 new types of messages for our new ROS control files. In the folder “…\catkin_ws\src\rospy_crazyflie\msg\msg” we added one file for each new message type. We also updated “CMakeLists.txt” to add the name of our message files in the section “add_message_files( )”.
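As a hypothetical illustration (our real message names differ), a minimal message file holding one light reading, plus its registration in “CMakeLists.txt”, could look like:

```
# msg/LightIntensity.msg (hypothetical)
float32 lux
```

```
# CMakeLists.txt excerpt: register the new message
add_message_files(
  FILES
  LightIntensity.msg
)
```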
The second part of our ROS control was a set of scripts written in Python. These included our run-and-tumble algorithm control code, publisher scripts, and a plotter script. These are all available in the project’s Github.
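For a flavor of what these scripts look like, here is a hypothetical minimal rospy publisher (the “read_lux()” helper stands in for however the light value actually arrives from the Crazyflie log stream):

```python
import random
import rospy
from std_msgs.msg import Float32

def read_lux():
    return random.uniform(0.0, 1000.0)  # placeholder for the real sensor readout

def main():
    rospy.init_node('light_intensity_publisher')
    pub = rospy.Publisher('light_intensity', Float32, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(Float32(data=read_lux()))
        rate.sleep()

if __name__ == '__main__':
    main()
```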
Characterizing the Light Sensor
At this point, the light intensity sensor was successfully integrated into the Crazyflie quadcopter. The new code was written and the Crazyflie quadcopter was reflashed with new firmware. We had completed our initial troubleshooting, and the next step was to characterize the light intensity in our experimental setup.
Experimental setup for light intensity characterization.
This characterization was done by flying the Crazyflie at a fixed distance above the floor in tightly spaced rows along the x and y horizontal directions. The resulting plot (below) shows that the light intensity increases exponentially as the Crazyflie moves towards the light source.
The light characterization allowed us to determine an intensity threshold that will only happen near the light source. If this threshold is met, the algorithm’s “Stop” action is triggered, and the Crazyflie lands.
Light intensity (units of lux) was experimentally characterized by piloting the Crazyflie in a linear pattern at a constant height above the ground. The resulting plot shows that light intensity is characterized by an exponential roll off in both the x and y directions.
Testing the Run-and-Tumble Algorithm
With the light intensity characterization complete, we were able to test and revise our run-and-tumble algorithm. At each loop of the algorithm, one of the four actions is chosen: “Run”, “Tumble”, “Avoid Obstacle”, or “Stop”. The plot below shows a typical path with the action that was taken at each loop iteration.
Flight Tests of the Run-and-Tumble Algorithm
In final testing, we performed 4 trial runs with 100% success locating the light source. Our test area was approximately 100 square feet and included 1 light source and 2 obstacles. The average search time was 1:41 (minutes:seconds).
The “Avoid Obstacle” and “Run” behavior are demonstrated in the above video clip (1.5x actual speed).
The “Run” and “Tumble” actions are demonstrated in the above video clip (2x actual speed). At the end, the “Stop” action is demonstrated when the light intensity reaches the threshold value of 800 lux, indicating that the Crazyflie has found the light source and should land.
Lessons Learned
This was one of the best courses I’ve taken at the University of Washington. It was one of the first classes where a robot could be incorporated, and playing with the Crazyflie was pure fun. Another positive aspect was that the course had the feel of a boot camp for learning how to build, control, test, and improve autonomous robots. This was only possible because Bitcraze’s small, indoor quadcopter with optic flow capability made it possible to safely operate several quadcopters simultaneously in our small classroom as we learned.
This development project was really interesting (aka difficult…) and we went down a few rabbit holes as we tried to level up our knowledge and skills. Our prior experience with Python helped us read the custom example scripts provided in our course for the ROS control program, but we had quite a bit to learn about the ROS architecture before we could write our own control scripts.
Nishant made an extensive study of I2C protocols as he wrote the new driver for the BH-1750 sensor. One of the biggest lessons I learned in this project was that writing drivers to integrate a sensor to a microcontroller is hard. By contrast, using the Bitcraze decks was so easy it almost felt like cheating. (In the nicest way!)
On the hardware side, the one big problem we encountered during development was accidentally breaking the 1.27 mm (0.05 inch) headers on the Crazyflie quadcopter and the decks. The male headers were not long enough to extend from the Flow deck all the way up through the Prototype deck at the top, so we tried to solder extensions onto the pins. Unfortunately, I did not check the Bitcraze pin width, and I just soldered on the pins we all had in our tool kits: the 0.1 inch (2.54 mm) wide pins that we use with our Arduinos and BeagleBones. These too-large pins damaged the female headers on the decks, and we lost connectivity on those pins. Fortunately, we were able to repair our decks by soldering on replacement female headers from the Bitcraze store. I wish now that the long pin headers were available back then.
In summary, this course was an inspiring experience and helped our team learn a lot in a very short time. After ten weeks working with the Crazyflie, I can strongly recommend the Crazyflie for robotics classes and boot camps.