Blog

So we are now back in cold and dark Sweden after about a week's visit to warm and pleasant Shenzhen, China. Every time we go there, something major has happened. When we visited last time, about a year ago, cash was king. Now payments are apparently done with QR codes, even in small lunch restaurants. I was kind of proud of the BankID and Swish payments we have here in Sweden, until now… Another observation we made was that there are now a lot of colorful rental bikes, which can be found just about everywhere and rented for around 1 RMB/hour. A great way of sharing resources and pushing eco-friendly transportation. It has its downsides though, as piles of abandoned bikes are a common sight, something The Guardian has written about.

Aside from the above observations, the Maker Faire Shenzhen was one of the reasons we came to visit. As Shenzhen is called “the silicon valley of hardware” we had pretty high expectations for the Maker Faire. Even though it was a great faire it did not quite live up to those expectations, but it is growing fast and I'm pretty sure that in a couple of years it will be the Maker Faire to be at. A quick summary: robotics was one of the top product categories at the faire. 3D printers, which are popular at European and US faires, were not that common, which surprised us. Now let the pictures do the talking:

 

We exhibited at the faire, sharing a booth with Seeed Studio, where we showed an autonomous sequence on top of a table using the Flow deck. By pressing a button, the Crazyflie 2.0 would take off, fly in a circle, come back and land roughly in the same spot. It was a very engaging demo that caught many people's attention, especially the kids', who constantly wanted to press the button and interact with the Crazyflie.

All the interaction made us very happy and next time we will try to add the obstacle avoidance deck to make it even more engaging.

 

Unfortunately the Crazyradio PA is out of stock in our store and is estimated to be back around December 1. Until then, please check our distributors for availability.

 

Grasping objects is a hard task that usually requires adding a dedicated mechanism (e.g. an arm or gripper) to the robot. Instead of adding extra components, have you thought about embedding the grasping capability in the robot itself? And have you thought about doing it while flying?

In the GRASP Laboratory at the University of Pennsylvania, we are concerned with controlling robots to perform useful tasks. In this work, we present the Flying Gripper! It is a novel modular flying platform capable of grasping and transporting objects with the help of multiple quadrotors (Crazyflies) working together. This research project is coordinated by Professor Mark Yim and Professor Vijay Kumar, and led by Bruno Gabrich (PhD candidate) and David Saldaña (postdoctoral researcher).

 

 

Inspiration in Nature

In nature, cooperative work allows small insects to manipulate and transport objects that are often heavier than the individuals themselves. Unlike collaboration on the ground, collaboration in the air is more complex, especially considering flight stability. With this inspiration, we developed a platform composed of four identical cooperative modules, each based on a quadrotor (Crazyflie) enclosed in a cuboid frame with a docking mechanism. Pairs of modules are able to fly independently and physically connect by matching their vertical edges, forming a hinge. Four such one-degree-of-freedom (DOF) connections result in a one-DOF four-bar linkage that can be used to grasp external objects. With this platform we are able to change the shape of the flying vehicle during flight and use its own body to constrain and grasp an object.

Flying Gripper Design and Motion

In the proposed modular platform, we use the Crazyflie 2.0. Its battery lasts around seven minutes, though in our case flight time is reduced by the extra weight. The motor mounting was modified from the standard design: we tilted the rotors 15 degrees. This was necessary because more yaw authority was required to actuate the four-bar for grasping. However, adding this tilt reduces the lifting thrust by 3%. Axially aligned cylindrical Neodymium Iron Boron (NdFeB) magnets, 1/8″ in diameter and 1/4″ thick, are mounted on each corner, enabling edge-to-edge connections. Each cylindrical magnet has a mass of 0.377 g, and a tangential connection between two of the same magnets can generate a force of 0.4 kg. This forms a strong bond when two modules connect in flight. Note that the connections are not rigid – each forms a one-DOF hinge.

The four attached modules result in a one-DOF four-bar linkage, in addition to the combined position and attitude of the conglomerate. The internal angles of the four-bar are controlled through the yaw attitude of each module: for example, two modules rotate clockwise while the other two rotate counter-clockwise in a coordinated manner.

 

 

Grasping Objects

Collaborative manipulation in the air is an alternative that avoids the complexity of adding manipulator arms to flying vehicles. In the following video you can see the Flying Gripper changing its shape in midair to accomplish the complex task of grasping a desired coffee cup. Would you like some coffee delivered?

 

 

 

This work was developed by:

Bruno Gabrich, David Saldaña, Vijay Kumar and Mark Yim

Additional resources at:

http://kumarrobotics.org/

http://www.modlabupenn.org/

We have recently released a few products with optical flow sensors (the Flow deck and the Flow Breakout board) without really explaining the concept of optical flow, so we thought we would dedicate this week's post to it.

The most common example of optical flow is probably the computer mouse. Turning a mouse over you'll see a strong light that illuminates the surface so that a camera can clearly see it. When running, the camera identifies features in the surface below and tracks their motion between frames: as you move the mouse to the left, features move to the right.

In the example below you can see a feature being tracked over time.

optical flow

The feature is tracked from frame to frame and the output is the distance that the feature moved since the previous frame. 
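To make the tracking step more concrete, here is a toy sketch of how the shift between two frames can be found by brute-force block matching; real optical flow sensors use dedicated, far more efficient hardware, so this is only an illustration of the idea.

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame, max_shift=4):
    """Estimate (dx, dy) motion between two small grayscale frames by
    brute-force block matching (sum of absolute differences)."""
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Shift the current frame back by the candidate motion and compare
            shifted = np.roll(curr_frame, (-dy, -dx), axis=(0, 1))
            sad = np.abs(shifted.astype(int) - prev_frame.astype(int)).sum()
            if best is None or sad < best:
                best, best_shift = sad, (dx, dy)
    return best_shift

# Quick demo: move a random pattern 2 px to the right and 1 px down
rng = np.random.default_rng(0)
prev = rng.integers(0, 255, (30, 30), dtype=np.uint8)
curr = np.roll(prev, (1, 2), axis=(0, 1))
print(estimate_shift(prev, curr))  # -> (2, 1)
```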

The functionality of an optical flow sensor of course depends on being able to find features to track; a very uniform surface will be hard to track since all the frames look the same. If you've ever tried using a mouse on a glass table or a reflective surface you've probably seen that it doesn't work.

The same concept is used in our Flow products. It also happens that the manufacturer of the chip we use, PixArt, is a world leader in optical mouse sensors. They have applied the same concepts as in the mouse, but with a different lens that lets the camera track features further away (80 mm – infinity). Like the mouse, this depends on finding features to track, which might be problematic on poorly lit or very uniformly colored surfaces. On the other hand, if the scene is lit too strongly from above while you fly, you might start tracking your own shadow on the floor.

One of the issues with using optical tracking from a flying platform is that you need to know the distance to the features. In the case of the mouse you know that the features are right under it, but on a flying platform you can't tell this from the image alone. Think about sitting in a plane and watching the ground move: it looks really slow, but your movement along the ground is actually really fast. For our Flow products we've added the VL53L0x ToF distance sensor to measure the distance to the surface being tracked. This completes the equation, so being further away from the tracked features is taken into account. Note that the accuracy of the tracking decreases as the distance increases, since the differences between frames become smaller and harder to detect.
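To make the height dependence concrete, here is a minimal sketch (not the actual PixArt processing) of how per-frame pixel flow could be scaled to a metric velocity once the distance to the surface is known. The resolution and field-of-view numbers are placeholder assumptions, not the real sensor's specification.

```python
import math

def flow_to_velocity(pixels_moved, height_m, frame_rate_hz,
                     resolution_px=30, fov_deg=42.0):
    """Convert per-frame pixel flow into ground velocity (m/s).

    Assumes a downward-facing camera over a flat surface with no
    rotation. resolution_px and fov_deg are made-up example values.
    """
    # Focal length expressed in pixels, from the field of view (pinhole model)
    f_px = resolution_px / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    # Ground displacement per frame grows linearly with height
    displacement_m = pixels_moved * height_m / f_px
    return displacement_m * frame_rate_hz

# The same 2 px/frame of flow means twice the speed at twice the height
print(flow_to_velocity(2, height_m=0.5, frame_rate_hz=100))
print(flow_to_velocity(2, height_m=1.0, frame_rate_hz=100))
```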

An optical flow sensor can also be used to track the motion of other objects. If the sensor is fixed and pointing sideways, it will detect objects passing in front of it; it could for instance count people passing through a doorway, or be used as a touchless mouse.

 

We are going to China and Maker Faire Shenzhen Nov 11-12.

Come and meet us at the Seeed Studio stand, we love talking to makers and geeks!

We have started working on a demo and the plan is to show an autonomously flying Crazyflie using the Flow deck for positioning. If you are in the area, drop by at Liuxiandong Campus, Shenzhen Polytechnic and say hi!

See you there!

 

It has been a while since we made a blog post about the community, and quite a lot has happened and is about to happen, so we thought we would do an update for this Monday post.

Fred, the community maintainer of the Crazyflie Android client, was visiting us last week. He is making great progress on the Java Crazyflie library that is going to be used in the Android client as well as in PC clients. The library is still experimental, but when finished it will allow connecting to and using a Crazyflie from any Java program. There has already been some successful experimentation using it from Processing.

Thanks to Sean Kelly, the Crazyflie 2.0 is now officially supported by the Betaflight flight controller firmware. Betaflight is a flight controller firmware widely used in the FPV and drone racing community. This is the announcement by theseankelly on the forum:

Betaflight 3.2 was officially released this month. This is the first release that contains the Crazyflie 2.0 target by default, so you don’t need to clone and build from source anymore. It’s available as a target in the betaflight configurator from the google chrome store! I’ve tested it out and it works as expected. Haven’t tested the BigQuad variant, but that’s also available in the app by default.

Thanks to denis on the forum, there is also support for the Crazyflie 2.0 in the PX4 flight controller firmware. PX4 is a comprehensive flight controller firmware used in research and industry.

The Crazyswarm project, by Wolfgang Hoenig and James A. Preiss from the USC ACTlab, was presented at ICRA 2017. It is a framework for flying swarms of Crazyflie 2.0s using a motion capture system. Work is currently under way on merging the Crazyswarm project into the Crazyflie master branch, which will make it even easier to fly a swarm of Crazyflies. In the meantime the project is well documented and can be used by anyone who has a couple of Crazyflies and a motion capture system.

One of the pain points when setting up the Loco Positioning system is measuring the anchor positions and entering them into the system. I wanted to see if I could automate this task and let the system calculate the positions, and if so, understand what kind of precision to expect. I have spent a few Fun Fridays playing with this problem and this is what I have found so far.

The problem can be broken down into two parts:
1. How to calculate the anchor positions. What data is required?
2. How to define the coordinate system. To make it useful, the user must be able to define the coordinate system in a simple way.

Anchor and ruler

How to calculate the anchor positions

The general idea is to set up a system of equations describing the distances between the anchors and/or the Crazyflie and solve for the anchor positions. The equations are non-linear, and the (possibly naive) plan is to use the Gauss-Newton method to solve the system.

To understand how to calculate the anchor positions we must first take a look at the data that is available. The Loco Positioning system can be run in two different modes: Two Way ranging (default mode) and TDoA.

Two way ranging

In Two Way Ranging mode we measure the distance between each anchor and the Crazyflie, and to get enough data we must record ranging data at multiple positions. The anchor positions are unknown, and each new Crazyflie position adds yet another unknown position; on the other hand, we measure the ranges to the anchors, so those are known.

The equations simply compute the distance between the assumed position of each anchor and the Crazyflie and subtract it from the measured distance.
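As a sketch of what such a solver could look like, the snippet below jointly estimates anchor and Crazyflie positions by minimizing exactly those residuals. It uses scipy's least_squares as a stand-in for a hand-written Gauss-Newton implementation, and it glosses over good initial guesses, outliers and the fact that the solution is only defined up to a rigid transform (which is what the coordinate system definition below takes care of).

```python
import numpy as np
from scipy.optimize import least_squares

def estimate_positions(measured_ranges, n_anchors, n_samples):
    """Jointly estimate anchor and Crazyflie positions from TWR ranges.

    measured_ranges: (n_samples, n_anchors) array of measured distances.
    Returns (anchors, cf_positions). Bare-bones sketch only.
    """
    def residuals(x):
        anchors = x[:n_anchors * 3].reshape(n_anchors, 3)
        cf = x[n_anchors * 3:].reshape(n_samples, 3)
        # Distance from every sampled Crazyflie position to every anchor
        d = np.linalg.norm(cf[:, None, :] - anchors[None, :, :], axis=2)
        return (d - measured_ranges).ravel()

    # Random initial guess; a real implementation would use a smarter seed
    x0 = np.random.uniform(-1.0, 1.0, size=(n_anchors + n_samples) * 3)
    sol = least_squares(residuals, x0)
    anchors = sol.x[:n_anchors * 3].reshape(n_anchors, 3)
    cf = sol.x[n_anchors * 3:].reshape(n_samples, 3)
    return anchors, cf
```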

TDoA

In TDoA mode we measure the Time Difference of Arrival, that is, the difference in distance from the Crazyflie's position to two anchors. It is probably possible to use this information, but I was looking for a different solution here. In the new TDoA implementation that we have been playing with, we get the distances between all anchors (calculated in the anchors) as a side effect.

In this case the Crazyflie is not really needed, and the equations compare the distances between the assumed anchor positions with the measured distances.

How to define the coordinate system

To get a usable positioning system, the coordinate system must be well defined and oriented in a practical direction. For example, when writing a script you probably want (0, 0, 0) to be at some specific spot, the X-axis pointing in a certain direction, the Z-axis pointing up and so on. My initial idea was to use the anchors to define the coordinate system: use anchor 0 as (0, 0, 0), let the X-axis pass through anchor 1 and so on. Just by looking at our flight lab I realised that this would be too limiting, and decided that the coordinate system should be completely disconnected from the anchor positions, but still easy to define. I also realised that a really good way to tell the system about the desired coordinate system is to move the Crazyflie around in space to show what you want. The solution is to place the Crazyflie at certain positions and click a button to record data at those positions. The steps I have chosen are:

  1. Place the Crazyflie at (0, 0, 0)
  2. Place the Crazyflie on the X-axis, X > 0
  3. Place the Crazyflie in the XY-plane, Y > 0
  4. Move the Crazyflie around in the space with continuous recording of data

In this scheme the XY-plane is typically the floor.
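Once the solver has produced estimates for the recorded reference positions in its own arbitrary frame, those three points are enough to fix the translation and rotation of the user's coordinate system. Below is a small sketch of that computation (not the client implementation); a mirrored solution from the solver would show up here as a Z-axis pointing down.

```python
import numpy as np

def make_user_frame(p_origin, p_x, p_xy):
    """Build a transform from the solver's arbitrary frame to the user's
    coordinate system, defined by three recorded Crazyflie positions:
    the desired origin, a point on the positive X-axis and a point in
    the XY-plane with Y > 0. Returns a function that maps points."""
    x_axis = p_x - p_origin
    x_axis /= np.linalg.norm(x_axis)
    in_plane = p_xy - p_origin
    z_axis = np.cross(x_axis, in_plane)   # right-hand rule gives +Z "up"
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)
    rot = np.vstack((x_axis, y_axis, z_axis))  # rows are the new axes

    def to_user_frame(p):
        return rot @ (np.asarray(p) - p_origin)

    return to_user_frame

# Example with made-up positions in the solver's frame
to_user = make_user_frame(np.array([1.0, 2.0, 0.1]),
                          np.array([2.0, 2.0, 0.1]),
                          np.array([1.0, 3.0, 0.1]))
print(to_user([1.0, 2.0, 1.1]))  # -> approximately [0, 0, 1]
```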

Results

I have written basic implementations for both the Two Way Ranging and TDoA modes and they seem to work reasonably well in simulations. I have also tested the Two Way Ranging algorithm in our flight lab, with mixed results. The solution converged in most cases, but not always. When it converged, the estimated anchor positions ended up in the right region, but some were off by up to a meter. Finally, I ran the algorithm, fed the result into the system and managed to fly using the estimated positions, which I find encouraging.

I will continue to work on this as a Friday Fun project and maybe it will make its way into the client code base at some point in the future. There are probably better ways to estimate the anchor positions and more clever algorithms; feel free to share them in the comments.

 

 

A few weeks ago we wrote about a new prototype that we call “the obstacle avoidance deck”. Basically it's a deck fitted with multiple VL53L0x ToF distance sensors that measure the distance to the front/back, left/right and up of the Crazyflie 2.0. Combined with the Flow deck, this gives you an X/Y/Z robot that you can program to fly around avoiding obstacles, without needing any external positioning system.

After implementing firmware support for the deck (see #253 and #254) we've finally had a chance to do some initial testing, see the video below. In the current implementation we do the measurements in the firmware but use the logging framework to get all the distances into a Python script, which does the movement control. Since we have the Flow deck attached we can control the Crazyflie 2.0 in velocity mode, which means we can say things like “go forward at 0.5 m/s until the front sensor shows a distance lower than 50 cm” or “go forward at 1 m/s for 1 s, then rotate and measure the distance to all surrounding objects”. Since there are no real-time requirements we can move the complexity of the algorithm from the firmware into external scripting, which makes it a lot easier to develop. Now we're really eager to start setting up obstacle courses and timing how fast we can move through them :-)
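As an illustration, here is a rough sketch of what such an external script could look like with the cflib Python API: fly forward in velocity mode until the front sensor reports an obstacle. The radio URI and the log variable name ('range.front', in millimeters) are assumptions and may differ from what the prototype firmware actually exposes.

```python
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger
from cflib.positioning.motion_commander import MotionCommander

URI = 'radio://0/80/2M'  # assumed address of the Crazyflie

cflib.crtp.init_drivers()

log_conf = LogConfig(name='ranges', period_in_ms=50)
log_conf.add_variable('range.front', 'uint16_t')  # assumed log variable, mm

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with MotionCommander(scf, default_height=0.5) as mc:
        mc.start_linear_motion(0.5, 0.0, 0.0)  # forward at 0.5 m/s
        with SyncLogger(scf, log_conf) as logger:
            for _, data, _ in logger:
                if data['range.front'] < 500:  # closer than 50 cm
                    break
        mc.stop()  # hover; MotionCommander lands when the block exits
```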

The results from the testing show that our two main concerns aren't an issue: the sensors don't seem to interfere with each other, and we can sample them all at a high enough frequency without occupying the bus too heavily (currently we're doing 20 Hz). The next step is figuring out the requirements (i.e. how many VL53L0x sensors are needed – do we really need the back one?) and a mechanical solution for attaching the sensors in production. If you have any feedback, let us know now and we'll try to get it into the design. Also, we really need a new name for the board. Any suggestions?

A long time ago we got a request for a bright LED deck from a community member. When working with high-power LEDs, heat becomes a problem that needs to be taken into account. The community member suggested using one of the Luxeon Rebel LEDs, and so we did. We designed a prototype pretty quickly, but also realized that it is a bit harder than we first thought. With a simple control scheme such as PWM and a MOSFET, the circuit is simple but the brightness will be affected by the battery voltage (the rough calculation after the list below shows how much). With a dedicated LED driver the brightness would be stable, but the circuit more complicated and expensive. Trying to list the pros and cons:

MOSFET
+ Low complexity
+ Low cost
+ High efficiency
– Varying brightness depending on battery voltage
– Might stress the LED (could be solved with a low-ohm resistor)

LED driver
+ Stable brightness
– Not as high efficiency (~80%)
– Higher cost
– Higher complexity
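To put a rough number on the varying-brightness drawback, here is a back-of-the-envelope estimate assuming a forward voltage of about 2.9 V and a 1 Ω series resistor; the real figures depend on the actual LED bin, the MOSFET and the PWM duty cycle.

```python
def led_current(v_batt, v_forward=2.9, r_series=1.0):
    """Rough peak LED current with a MOSFET and a small series resistor,
    ignoring the MOSFET on-resistance and the LED's real I-V curve.
    v_forward and r_series are assumed example values."""
    return max(0.0, (v_batt - v_forward) / r_series)

for v_batt in (3.3, 3.7, 4.2):  # typical 1S LiPo range during flight
    print('%.1f V -> %.0f mA' % (v_batt, led_current(v_batt) * 1000))
# Roughly 400 mA at 3.3 V versus 1300 mA at 4.2 V: a 3x brightness swing
```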

We ended up trying both. The LED driver design failed because the battery voltage needed to be lower than the LED voltage plus the Schottky drop, and it sits right in the middle of that range. The PWM design half failed because the LED anode and cathode were swapped in the design, but it was possible to patch it afterwards, so at least we got something up and running.

The effect is very nice and it is what we used for the wedding show. The question now is, is this something we should finish and put in the store?

It was inevitable. At some point we needed to do a drone show of our own, and what better occasion than the wedding party of one of our coworkers! We have done demos before at Maker Faires and conferences, but never a one-time show. This time we also had the opportunity to collaborate with an acrobat, namely Arnaud's brother Adrien. A perfect match! The only problem was that we only had one day to pull something off and no possibility to rehearse anything with Adrien before the live show. Probably a piece of cake for a true artist such as Adrien, but for newbies like us it definitely raised the stakes. There was no time to worry about that though, better to get our hands dirty.

We wanted to create a choreography where a Crazyflie interacted with Adrien while he did some awesome acrobatics. The Crazyflie should be clearly visible with lights and we wanted it all to be synchronized to music. The final requirement was to minimize the amount of equipment, since the wedding was abroad and we did not want to carry too much.

We decided to use the Loco Positioning system as the base for scripted flight and chose Two Way Ranging to reduce the number of anchors needed. We used only four anchors arranged in a 3×3 m square, three of them on the floor and the fourth 1.5 meters up on a tripod. Two Way Ranging works very well outside the convex hull (which is really small in this case), but with the drawback that we could only position one Crazyflie – so no swarm show.
For visibility we used a prototype deck with a high-intensity LED pointing downwards.

Photo by Eric Cunha

Again, to minimize the equipment we decided to pre-program the sequence in the Crazyflie and put a start button on it to run the sequence. The piece of music (composed by the groom during the bachelor party) was about one and a half minutes long, and the idea was to start the music and the sequence manually at exactly the same time and let them run without any further synchronisation.

We based our purpose-built firmware on the demo code from ICRA earlier this year. It contained the sequencing, but we had to add functionality to control the LED. We wrote a Python script that generated data for the sequence that could be pasted into the firmware. A button was soldered on top of the Loco Positioning deck and connected to a GPIO pin to be used as a start button.
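As an example of the kind of generator script we mean, the snippet below turns a list of keyframes into a C array that could be pasted into firmware; the struct name, fields and the actual sequence format used in the show are hypothetical.

```python
# Keyframes: time [s], x, y, z [m], LED brightness 0-255 (made-up values)
keyframes = [
    (0.0,   0.0, 0.0, 0.0,   0),
    (5.0,   1.5, 0.0, 1.0, 255),
    (10.0,  0.0, 1.5, 1.5, 128),
    (90.0,  0.0, 0.0, 0.0,   0),
]

# Emit a C array literal that can be pasted into the firmware source
lines = ['static const sequence_point_t sequence[] = {']
for t, x, y, z, led in keyframes:
    lines.append(f'  {{ {t:6.2f}f, {x:5.2f}f, {y:5.2f}f, {z:5.2f}f, {led:3d} }},')
lines.append('};')
print('\n'.join(lines))
```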

We are engineers and not choreographers, but after some trial and error we managed to create a sequence with light that was synchronized to the music. The overall idea was to start the Crazyflie on the floor in the center of the space, fly out into the audience during the intro, turn on the light over Adrien (who was carefully positioned at the correct spot) and guide him onto the “stage”. While Adrien improvised acrobatics on the floor, the Crazyflie circled around and above him with flashing lights. At the end of the music the procedure was reversed: the Crazyflie guided him back into the crowd and landed quietly in the center of the stage.

Photo by Eric Cunha

After dinner, when it was time for the show, we set up the positioning system and the crowd gathered. The first two attempts failed, as it turned out that the ceiling of the venue was lower than expected and the Crazyflie hit a beam! What to do? We had a pre-programmed sequence that was going too high! We simply moved the anchors about 30 cm further apart and “stretched” the space to trick the positioning system a bit. We had not tried this before, and live hacking is a bit stressful, but we were possibly strengthened by one or two glasses of wine :-)

The third attempt was almost perfect, with an incredible performance by Adrien! Phew, what a relief that it worked on the third try. Heart failure was close but did not kill us this time! :-) Maybe this is why we usually stay away from live shows and rather sit down in a dark basement coding :-)

At any given time we have a bunch of deck ideas floating around. Some of them might not be doable (or would be very hard), but they are still fun to discuss. Others we just never get around to, since we're always pressed for time. The “obstacle avoidance” deck is one of the latter.

The idea with the “obstacle avoidance” deck (current working name, for lack of imagination) is to mount one of the VL53L0x ToF distance sensors, the same one we have on the Z-ranger and the Flow deck, in each direction. This would allow you to keep a distance to the ground, avoid the walls (or any other obstacles you might fly into) and also keep away from the ceiling. Basically you could build a “turtle bot” that just flies around randomly without crashing. Another fun idea we've been discussing is being able to SLAM the room you're flying in: if you can keep track of how you are moving around (with the Flow deck, the Loco Positioning system or by other means) while measuring the distance on all sides, you could build a map of the room.

After discussing this on and off for some time, mainly focusing on mechanical and production issues of the design, we decided to just try out the concept with a simple prototype. The prototype, named “OA”, has daughter boards with VL53L0x sensors mounted front/back and left/right, as well as one sensor facing up. It's designed to be mounted on top of the Crazyflie 2.0 and combined with the Flow deck, which provides relative movement and also the sixth direction: the distance to the floor.

One of the issues with the design is that all the VL53L0x sensors are on the same I2C bus with the same address. To work around this, the sensor has a nifty feature where you can re-program the I2C address. For this to work you need to release the reset of the sensors one by one: release the first reset, reprogram the address and then release the reset of the next sensor. The reset of the VL53L0x is not cabled on the Flow deck, so that sensor is the first to be re-programmed. The resets are then released one by one for the sensors on the OA deck. To control the reset pins on the deck there's an 8-bit I2C GPIO expander. The reason for the GPIO expander is to use as few GPIOs on the deck connector as possible, to keep compatibility with other decks high. For instance, the deck will work fine together with the Loco Positioning deck.
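The start-up order matters, so here it is sketched in Python-style pseudocode; the expander and sensor helpers are placeholders for the real C driver code, not an existing API.

```python
VL53L0X_DEFAULT_ADDR = 0x29  # every sensor boots with the same I2C address

def assign_addresses(expander, num_oa_sensors, set_sensor_address):
    """expander.hold_all_in_reset(), expander.release(i) and
    set_sensor_address(old, new) are assumed helpers, not real drivers."""
    # 1. Hold every OA-deck sensor in reset via the I2C GPIO expander
    expander.hold_all_in_reset()

    # 2. The Flow deck sensor has no reset line, so it is the only one
    #    answering at the default address: re-program it first
    addresses = [set_sensor_address(VL53L0X_DEFAULT_ADDR, 0x30)]

    # 3. Release the OA-deck sensors one by one, giving each a new address
    for i in range(num_oa_sensors):
        expander.release(i)
        addresses.append(set_sensor_address(VL53L0X_DEFAULT_ADDR, 0x31 + i))

    return addresses
```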

 

The goal with the prototype is to try out the concept of the deck and to see if it's feasible. A few of the things we need to sort out are:

  • Mechanical solution for the side sensors (front, back, left and right)
  • Interference between sensors
  • Update rate when we have 6 sensors on the same bus, which we might have to run one by one to avoid interference

The current status is that we've verified the electronics and written the I2C GPIO expander driver to test all the sensors. The next step is to work on a new VL53L0x driver that allows multiple sensors to run at the same time, which will force some refactoring of the firmware. Once we've made some more progress we'll do another post and report the results. If you have any feedback on the design/concept, or ideas about what the deck could be useful for, don't hesitate to drop a comment below.