Blog

There are a lot of things we need to keep track of at Bitcraze: customer support, production, bookkeeping, planning for the future, improving how we work, our servers, the forum, the community, source code and new hardware, just to mention a few areas. There has been an informal distribution of these areas between the people in the company, but it has not been very clear who is responsible. This has led to everyone worrying about everything, while we still fail to catch some issues. So we have two bad properties: worse performance than we would like, and negative feelings. So what to do?

We have decided to set up role contracts. Our version of role contracts is simply to appoint one person to each area that we feel needs continuous attention. The person responsible for an area is expected to keep track of what happens in that area, think about how we should improve or change, and notify the rest of the team when we need to take some sort of action. That person is not expected to execute or implement, but to make sure that we get it done as a team.

We will update the role contracts at regular intervals, every month as a start, to make sure we are working with stuff we are passionate about and so that we have a dynamic team.

Last Thursday we went to LTH (Lund University), to the Robotics department, to make some measurements on the ultra wide band (UWB) positioning system we are working on. The idea was to use one of their robots to move a Crazyflie around along a well-known path, and at the same time record as much sensor data as possible. This gives us data that we can crunch offline.

We placed four anchors around the robot and mounted our positioning expansion deck on the Crazyflie. The Crazyflie collected the data from the positioning deck as well as from its internal sensors and streamed it over USB to an external computer for storage (a rough logging sketch follows the list below). We collected the following data:

  • Distance to the anchors (raw measurements)
  • Air pressure at the anchors
  • Air pressure at the Crazyflie
  • Accelerometers
  • Gyros
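As an illustration, here is a minimal sketch of how this kind of recording could be set up with the Crazyflie Python API over USB. The accelerometer, gyro and barometer variables follow the standard firmware log groups; the UWB ranging variables are left out, since their log names depend on the deck firmware.

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig

def connected(uri):
    # 100 Hz logging of the internal sensors (standard firmware log groups)
    lc = LogConfig(name='sensors', period_in_ms=10)
    for var in ['acc.x', 'acc.y', 'acc.z',
                'gyro.x', 'gyro.y', 'gyro.z',
                'baro.pressure']:
        lc.add_variable(var, 'float')
    cf.log.add_config(lc)
    # In a real recording the callback would write to file instead of printing
    lc.data_received_cb.add_callback(
        lambda timestamp, data, logconf: print(timestamp, data))
    lc.start()

cflib.crtp.init_drivers()
cf = Crazyflie()
cf.connected.add_callback(connected)
cf.open_link('usb://0')  # stream over USB instead of the radio
time.sleep(60)           # record for a minute
cf.close_link()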

The logs from the robot will give us the true position of the Crazyflie as well as of the anchors, and from that we will be able to evaluate the performance of algorithms that use the sensors to figure out the position.

The dataset will be shared with the researchers at Lund University; they have some interesting ideas they want to try out.

Next step is to crunch the data…

For more information on our UWB positioning project, see Firmware and dwm 1000 nodes

For a background on the “How we work” category, see Self organizing

Next on the agenda was to talk about feedback. Feedback is the oil that keeps the team happy and enables individuals to gain insights about themselves. Important stuff!

We talked about the self and some techniques/models that can be used to understand ourselves and others (e.g. the Johari Window). We discussed how we react when we receive feedback, and what we can do to increase the chances that we really listen to and understand the feedback that is given to us. These insights also help us give feedback to other people.

The last part of this was a “recipe” for how to compose feedback for someone. Try to include the following:

  • Observation of a behaviour
  • How that makes you feel
  • Your needs that created the feeling
  • Your wish for an alternative behaviour

Some slides are available at Prezi.

Finally we practiced by giving everyone else feedback one-on-one. We will continue to give each other feedback over the coming weeks on a regular basis – practice makes perfect.

Marcus and Arnaud have packed up and left the Berlin Maker Faire after two hectic days, and last weekend Tobias and I were at Maker Faire New York. It’s a lot of fun showing the Crazyflie, meeting people and getting feedback from the community!

This time we had created a demo where we used a webcam to track the position of the Crazyflie (see an older blog post). I really like the demo; it’s pretty minimalistic and shows the awesome capabilities of the Crazyflie as a platform that lets the user explore and develop on top of it. Just add a webcam, an AR tag, some control algorithms, imagination and engineering, and you have an autonomously flying Crazyflie!

When Tobias and I arrived at the New York Maker Faire on Friday afternoon to set up the demo, we discovered that our booth was not indoors, but under a roof without walls. And it was windy!

There was no way the control algorithms could manage to keep the Crazyflie at its target position, so what to do? We tried various ways of tethering it with a string while still flying, but without any success. We had to resort to hanging the copter from a string, not really flying it. To at least get some use out of the positioning system, we used the position of the hanging copter to change the color and intensity of the LED-rings on four other copters that we put in the corners of the test rig. Not what we had hoped for, but what do you do?

My favourite gadget at the New York fair was the Pancake Bot – a must in every home!

The Berlin Maker Faire was organized for the first time last weekend. The venue was a bit smaller than other Maker Faires we’ve been to, but there were lots of interesting visitors and it was held in a very nice old railway-postal building. Fred from our community joined us during the fair and gave us some much needed support in the booth. The demo was working better this time, but we still had some issues with the lighting and the detection of the AR tag. Room for improvement.

The fair had a drone area, but unfortunately not a lot of drones. We were flying the Crazyflie 2.0 as well as a bigger quad using the BigQuad deck. Aside from that, Fred brought along his even bigger quad that uses the OpenPilot CC3D.

Big thanks to Fred for showing us Berlin and helping us out!

This year we decided to save some time and split up for the last two Maker Faires. So last weekend Tobias and Christopher went to the Maker Faire in New York. They had a great but hectic time; more about that in an upcoming post.

Meanwhile Marcus and I are going to Maker Faire Berlin next weekend, 2015-10-03 to 04. We will be standing in the Seeedstudio booth showing the Crazyflie. We’re also planning on having our first ‘meetup’, since we are sure to have at least Fred, the main Crazyflie Android client maintainer, there :-). We have created a forum thread to discuss it. If you are in Berlin next weekend and want to meet up and talk to us and other Crazyflie enthusiasts, drop a line in the forum. The meetup will be somewhere in Berlin, so do not hesitate to join even if you do not attend the Maker Faire. We will announce the exact time and place on the forum; it will certainly be on Saturday the 3rd.

On another subject, Wolfgang, from USC, is giving a talk at the IROS conference. He is going to present his research on Mixed Reality, where one of the platforms he’s using is the Crazyflie 2.0. We visited Wolfgang’s lab and university before the Bay Area Maker Faire and we were really impressed; they are doing great research. Here’s the info on the talk:

Event: International Conference on Intelligent Robots and Systems (IROS), Congress Center Hamburg, Germany
Talk: “Mixed Reality for Robotics,” October 1, 12:20 – 12:35
Room: Saal A4
Abstract: link

Last week and this week are busy with preparations for the New York and Berlin Maker Faires. Since we will be in the Seeedstudio booth we don’t have the same space as at the Bay Area Maker Faire, so we had to rebuild our “fly-cage”. The new dimensions are 1.7 x 0.7 x 0.7 meters. This is the area the Crazyflie 2.0 should be able to fly in for a full charge without touching the sides of the net.

We don’t have any special plans during the faire, except for flying during the day. So if you feel like meeting up, having a beer and getting lost in various technology discussions then leave a comment or drop us a mail.

The autonomous flying rig we used in the Bay Area was based on the Kinect 2 sensor. The new rig only uses a standard webcam, which is cheaper and easier to manage (i.e. we do not need a Windows computer anymore). We attach an augmented reality marker on top of the Crazyflie, and the image processing is mostly done by the ArUco library. ArUco detects the 3D position of the marker, and the position is sent via ZeroMQ to the controller. We used the same controller code as for the Kinect; we just had to tune it a bit better to keep the Crazyflie within the smaller space. The controller then sends pitch/roll/yaw to the Crazyflie client, which is set up to use ZMQ as an input device.
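For those curious, sending control commands to the client over ZMQ could look roughly like the sketch below. The port number and message layout are assumptions from memory; check the client documentation for the exact protocol.

import zmq

# Connect to the Crazyflie client's ZMQ input device
context = zmq.Context()
sender = context.socket(zmq.PUSH)
sender.connect('tcp://127.0.0.1:1212')

# One setpoint; the controller sends these continuously
sender.send_json({
    'version': 1,
    'ctrl': {
        'roll': 0.0,     # degrees
        'pitch': 0.0,    # degrees
        'yaw': 0.0,      # degrees/s
        'thrust': 50.0,  # percent
    }
})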

If you want to build the same cage then here’s a list of the parts:

  • Some kind of net (we used normal fishing net)
  • Fishing line (to tighten the cage)
  • Aluminium beam (for tents)
  • These 3D printed parts
  • Webcam with standard camera attachment (we use a Logitech C920)
  • Camera attachment screw

We are in the process of cleaning up the code for the webcam. It will be pushed to GitHub and we will document the build on the wiki.

We have decided to use Travis for continuous integration builds of our open source repositories. Travis automatically builds the code on all branches and pull requests, which gives all developers who want to contribute to the project the possibility to see that their code passes the build. The current status of the latest build on the main branch is visible through the icon in the README on GitHub, or on the Bitcraze page on Travis.

The projects we have added to Travis so far are written in C or Python. The C projects, for instance, must be compiled with special compilers for the processors used in the Crazyflie, which adds some extra complexity. We have created a Docker image (bitcraze/builder) with the tools needed, to make life easier for developers. If you use the image when developing, there is no need to install tools locally, and since the same image is used in the Travis builds, you know you will get the same results as the CI server. This also removes the problem of tools with different versions (and results) in the development and build environments.

To use the image you can, for instance, type:

docker run --rm -v ${PWD}:/module bitcraze/builder make

Even though it is awesome to be able to create a well known build environment through a Docker container, we feel that too much typing is needed to execute a simple make. To solve that problem we are looking at the possibility of creating a toolbelt that will handle it for you. More information on that later; for now developers will have to find their own solutions through scripting, aliasing or other means, for example the alias below.
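One possibility is a simple shell alias wrapping the command above (the name cfmake is just made up):

alias cfmake='docker run --rm -v ${PWD}:/module bitcraze/builder make'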

Obviously you need Docker to use this image. If you have not tried it out yet, take a look at www.docker.com.

We are aiming for automated testing of our code, and even though we have a lot of work to do, we have taken the first baby step. For the moment, firmware projects are simply compiled and linked to ensure that the code is coherent. Projects that support both Crazyflie 1 and 2 are built in both flavours to avoid problems for developers that might only use one of the platforms. The Python client project is only checked for PEP8 compliance so far, but we are looking into how to unit test it. Any input from the community is welcome!

Happy hacking!

During the last week we’ve taken a big step: moving to Python 3! The reason for the move is that Python 3 is becoming broadly adopted and it has more features that we want to make use of. Also, 3 > 2. This post will explain a bit of what we did, some of the problems we encountered and the current status. The numbers 2 and 3 will be thrown around a lot in the text, but to be precise we’re talking about versions 2.7+ and 3.4+ (even more precisely, it’s been tested on 2.7.9 and 3.4.3). The next release of the client will run on Python 3, but if you want to test it now, just clone the development branch on GitHub.

Status

If you have developed applications using the API and Python 2 then you might be getting a bit worried right about now. Compatibility with both Python 2 and 3 will be kept for most things, except for the client.

This will be compatible with both 2 and 3:

  • The Crazyflie Python API (everything in lib/cflib)
  • The examples for the Crazyflie Python API (everything in examples)
  • The ZMQ server using the Crazyflie Python API (bin/cfzmq)
  • The Crazyflie command-line bootloader (bin/cfloader)

But the main clients will only have Python 3 compatibility:

  • The Crazyflie Python client (bin/cfclient)
  • The Crazyflie Python headless client (bin/cfheadless)

API Examples

While doing the porting we’ve also added more examples to cover more of the Crazyflie Python API. In order to keep 2/3 compatibility for the API it’s important to be able to test it easily with the different versions. We have unit tests on the TODO list, but until then we’ve been using the API examples to test. All the examples should run with both Python 2 and 3. Having more examples that show how to use the API is also a good thing in itself…

Porting and compatibility

The approach we used was to first run the 2to3 utility to automatically do as much as possible of the porting. After that we had to fix the rest of the errors manually and also maintain the dual 2/3 compatibility of the API.
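As an example of what 2to3 does automatically, here are a couple of typical Python 2 constructs (made up for illustration) and what they look like after conversion:

# Python 2
print "Battery: %d%%" % level
for name, value in params.iteritems():
    print name, value

# After running 2to3
print("Battery: %d%%" % level)
for name, value in params.items():
    print(name, value)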

In our previous implementation we made use of strings to store the binary data that we were sending/receiving. Because of incompatibilities between Python 2 and 3 this didn’t fit very well anymore. To make things neat for the API we found a container for bytes that works with both Python 2 and 3: the bytearray. Even though we use the same type, there are still some subtle differences in usage between the versions. After doing some testing we found ways where the syntax is the same for Python 2/3.

First of all, bytearrays can be created from a string, tuple or list. When indexed with the [] operator they give you the value of each byte.

>>> d = bytearray([i for i in range(10)])
>>> d
bytearray(b'\x00\x01\x02\x03\x04\x05\x06\x07\x08\t')
>>> d[5]
5

The main point is getting something meaningful out of the bytearray when doing the communication; here are a few examples:

Unpacking a byte, an integer and a word from the first 7 bytes (little endian):

>>> import struct
>>> struct.unpack("<BIH", d[:7])
(0, 67305985, 1541)

Getting a string from a subset of the data can be done by using decode with the character set to use for the decoding. We use ISO-8859-1 since the Crazyflie does not support Unicode (yet?).

>>> d = bytearray([i for i in range(97,100)])
>>> d
bytearray(b'abc')
>>> d.decode("ISO-8859-1")
'abc'

You can also easily get a tuple or a list:

>>> list(d)
[97, 98, 99]
>>> tuple(d)
(97, 98, 99)

And you can also concatenate:

>>> d + d
bytearray(b'abcabc')

And find a byte:

>>> d.find(bytearray((98, )))
1

But there’s also a few things we couldn’t get to work in a good way and have to check which version we’re running and execute different code, like the queue import that has changed name.

if sys.version_info < (3,):
    import Queue as queue
else:
    import queue

Another problem we haven’t solved is creating a bytearray from a string, so it’s also

if sys.version_info < (3,):
    self._data = bytearray(data)
else:
    self._data = bytearray(data.encode('ISO-8859-1'))

As for the client code, which was ported to Python 3 without keeping backwards compatibility, there weren’t any big issues. The biggest change was in the PyQt4 API, where a few things have improved when placing custom Python data in GUI objects. Before, QVariant was used for this: you would create a QVariant object that wrapped the Python object, and to get the data out of the QVariant again you had to explicitly say what type it had by calling the correct function (like toInt()). Now this is a lot smoother: QVariant has been skipped and you just use the Python type directly.
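A small sketch of the difference (item here stands for any Qt model item that holds custom data):

# Old PyQt4 style: wrap and unwrap the Python object explicitly
item.setData(QtCore.QVariant(my_object))
my_object = item.data().toPyObject()

# New style: no QVariant, the Python object is used directly
item.setData(my_object)
my_object = item.data()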

For more information have a look here where we found a lot of useful tips. Don’t hesitate to leave a comment if you think we could have done things differently or if you have any tips!

What’s not working

There’s still a few things we’re not sure how to fix and we have to look into it a bit more. These are:

  • There doesn’t seem to be any Python 3 bindings for the Leap Motion. According to this it’s possible to build the bindings yourself.
  • The Python 3 bindings to Marble for the GPS tab haven’t been investigated yet

PEP-8

On a side note we’ve started using Travis CI (more on this next week) and will start creating unit-tests for the Crazyflie Python  API. As a first step we’re running PEP-8 on all the code. This will be checked automatically for all commits and pull-requests.

For a background on the “How we work” category, see Self organizing

We have done a session in the company on group development. We based our discussions on the IMGD model by Susan Wheelan (https://en.wikipedia.org/wiki/Group_development#Wheelan.E2.80.99s_Integrated_Model_of_Group_Development).

We wanted to understand more about the structures and forces in the team, what to expect and gain a better understanding of our everyday experiences.

According to the IMGD model, a team evolves through a number of phases as it achieves maturity while working together. The phases are:

  1. Dependency and Inclusion
  2. Counterdependency and Fight
  3. Trust / Structure
  4. Work / Productivity
  5. Final (if there is a distinct final point)

We had some good discussions that we think will help us in the future.

This weekend we went to the maker weekend at Hx in Helsingborg and showed off the Crazyflie 2.0 flying with the Kinect. It’s an awesome demo for fairs since it flies by itself and looks pretty good. Below are a few photos from the event.

But this time we ran into some issues with the set-up. When we first developed this we were running the image acquisition and processing on Linux using libfreenect2, but we later switched to Windows. The reason is these three lines. These lines map the depth measurements to the camera measurements and give a set of “world coordinates”. This is needed since the left/right distances wouldn’t be correct without taking the distance from the camera into consideration: without it, a left/right movement close to the camera would give a much larger response than one further away from the camera.
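Conceptually, those lines do a standard pinhole back-projection. A rough sketch of the math, where fx, fy, cx, cy are the camera intrinsics (focal lengths and optical center) and (u, v) is the pixel where the Crazyflie was detected:

# z is the depth reading at pixel (u, v), in meters
x = (u - cx) * z / fx
y = (v - cy) * z / fy
# (x, y, z) is now the position in the camera's coordinate frame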

So to solve the issue above we moved everything to Windows, which kind of solved it. But we started experiencing lag in the regulation, which originated from lag in the image processing. After doing some more digging we drew the conclusion that there is a USB issue when using the Crazyradio at the same time as the Kinect v2. Once every couple of minutes the FPS of the video drops really low (which results in CPU usage going down as well), but disconnecting the Crazyflie (i.e. not using the Crazyradio) seems to solve the issue. To work around this problem we currently have a set-up where we use two laptops. Since we’re running ZMQ to communicate between the applications, it’s a quick operation to split it up across multiple hosts. So the Windows laptop runs the image acquisition and processing, and the Linux laptop runs the control loop and the Crazyflie client for sending the commands to the Crazyflie.
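Since ZMQ sockets connect over TCP, splitting the system across two machines is mostly a matter of changing the socket address from localhost to the other laptop’s IP. A sketch of the receiving side (the address and port below are made up):

import zmq

# On the Linux laptop: subscribe to position data from the Windows laptop
context = zmq.Context()
receiver = context.socket(zmq.SUB)
receiver.setsockopt(zmq.SUBSCRIBE, b'')
receiver.connect('tcp://192.168.1.10:7777')  # was tcp://127.0.0.1:7777

position = receiver.recv_json()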

When we were first developing this there were lots of things happening in the libfreenect2 repo, so this might be implemented by now. Does anyone have any tips? We would love to be able to run the system fully on Linux with only one laptop :-)

During the upcoming months we will be attending both the New York Maker Faire and the Berlin Maker Faire, hanging around the Seeedstudio booth. So if you want to see the Crazyflie/Kinect demo live or just to hang out and talk to us then drop by!
