Author: Kimberly McGuire

The AI-decks are back in stock! Also, last week we had our quarterly meeting, where we plan our focus for the next quarter. As it is also the start of the fiscal year, we took the opportunity to update our 1-year and 3-year plans as well. We have big plans coming up, but one of our important focuses this year is to get the AI-deck out of early access!

But what would be necessary for such a task? The AI-deck is one of the most complicated boards we have worked with, so we do have to evaluate its readiness to leave early access against the same standards as any of our other products.

Mega AI-deck Tutorial

So one of our ideas is to get the AI-deck into a state where we can write a mega AI-deck tutorial series. This implies that we are able to show how somebody could go from a dataset all the way to a flying AI-deck/Crazyflie combo. Such a series could consist of the following topics:

  1. How to go from a dataset of images to a (Deep) Neural Network
  2. Testing the DNN on the computer with the Image WiFi examples
  3. Converting the neural network to TensorFlow Lite (with basics on edge AI and quantizing neural networks; see the sketch after this list)
  4. AI-deck basics (how to access the camera, how to add the network to the cluster, how to send commands)
  5. Building and flashing the AI-deck and testing it out in the hand
  6. Attaching the AI-deck to the Crazyflie and making an app-layer application to fly and react to the image input.
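
To make step 3 a bit more concrete, here is a minimal sketch of a post-training quantization flow with the standard TensorFlow Lite converter. The model file name, input shape and calibration images are placeholders, and the GAP8 toolchain has its own import steps on top of this, so treat it as an illustration rather than the tutorial itself:

import numpy as np
import tensorflow as tf

# Hypothetical trained Keras model from the dataset step (placeholder file name).
model = tf.keras.models.load_model('my_model.h5')

# Stand-in calibration data; a real run would use samples from the training dataset.
calibration_images = np.random.rand(100, 96, 96, 1).astype('float32')

def representative_data_gen():
    # The converter runs these samples through the model to pick quantization ranges.
    for image in calibration_images:
        yield [image[None, ...]]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open('model.tflite', 'wb') as f:
    f.write(converter.convert())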

At first glance this sounds like it should be easy to do, right? Actually, there is still much to be done in order to make this tutorial series possible.

Replumbing the Communication

One of the more challenging aspects of the AI-deck as it is now, is that users need to buy a JTAG-enabled programmer in order to flash the GAP8 and the NINA module of the AI-deck. That is the reason why the AI-deck currently has these 2 x 10 pin JTAG connectors attached, but ideally we would want to get rid of them completely. This means that we need over-the-air flashing of the GAP8's binary, and that the intercommunication between the NINA module and the GAP8 will become even more important.

Moreover, the communication protocol from the GAP8 to the STM32 of the Crazyflie is currently very basic: as of right now, it is only possible to send single characters. That might work in some situations, but what if you would like to send an array of values back to the Crazyflie, like the collision probability and steering angle in the PULP platform's implementation of Dronet? And would we like to keep on using two UART serial ports, or perhaps relay both NINA and GAP8 communication through just one? The latter would make it easier for us to maintain the Crazyflie/AI-deck communication, but could introduce communication delays.
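
To give a feel for what "more than single characters" could look like, here is a minimal sketch of framing several values into one message over a serial link. The message ID, length byte and little-endian float layout are illustrative assumptions, not the actual protocol we will pick:

import struct

MSG_DRONET = 0x01  # hypothetical message ID for a Dronet-style output

def pack_dronet_msg(collision_prob, steering_angle):
    # Frame: [message ID, payload length, payload] with two little-endian floats.
    payload = struct.pack('<ff', collision_prob, steering_angle)
    return bytes([MSG_DRONET, len(payload)]) + payload

def unpack_dronet_msg(frame):
    # Inverse of pack_dronet_msg, with basic sanity checks.
    msg_id, length = frame[0], frame[1]
    assert msg_id == MSG_DRONET and length == 8
    return struct.unpack('<ff', frame[2:2 + length])

print(unpack_dronet_msg(pack_dronet_msg(0.9, -0.12)))  # prints roughly (0.9, -0.12)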

These are just a slice of the re-plumbing work needed on the AI-deck before we can even start our dream tutorial series, but at least it gives you an idea of what we are dealing with :)

Rik the Intern

From this week we have the honor of hosting Rik Bouwmeester for a couple of months. He is currently doing his Master's thesis at the MAVlab of the faculty of Aerospace Engineering at TU Delft. Since he has experience working with the AI-deck, he will be able to provide us with a user's perspective and help us with the above-mentioned issues. You can expect a blogpost from him soon!

How to handle our documentation has always been a bit of a struggle. For almost 2 years (see this blogpost and this one) we have been working on improving the documentation structure, by transferring information from the wiki, putting information closer to the code and setting up automated documentation. A few months ago, we managed to get automated logging and parameter documentation (see this blogpost).

Even though we think there has been some improvement already, it can always be better! We have noticed that some of our users are a bit confused about how to go through our documentation. So in this blogpost we discuss some navigational strategies for maneuvering through the documentation as it is presented on bitcraze.io, which can also be found here.

Ecosystem-based navigation

So more than a year ago, we started with an Ecosystem overview page, which is meant to take first-timers by the hand through the Crazyflie ecosystem. This type of overview page starts from the three main pillars: the Crazyflie Platform, the Clients and Positioning Technology. This is the type of navigation we mostly advise if you are a beginner Crazyflie user who does not yet fully know the structure of the ecosystem.

The Crazyflie platform page consists of all the important elements of the Crazyflie itself. It points to the hardware components the Crazyflie has, mainly the STM32 and NRF51 processors. It also points to the existing expansion decks with their specifications and combination possibilities. Moreover, it refers to the family tree, which currently consists of the Bolt, the Roadrunner and, of course, the Crazyflie 2.X. The Crazyradio and Clients overview page splits the elements up into the Crazyflie Python client & library, documentation about the Crazyradio PA, and the mobile client development documentation for both Android and iOS. And finally, the positioning technologies overview page links to the information pages of the Lighthouse Positioning System, the Loco Positioning System and the Motion capture system (also check out this blogpost).

Ecosystem-based documentation navigation tree

Repository-based navigation

For those that already have experience with the Crazyflie and its ecosystem, the previous way of navigating through the docs might be a bit convoluted. With the ecosystem-based navigation, it takes about 3 scrolls and clicks to reach the STM32 development documentation, which is a bit too much of a roundabout way if you already know what you are looking for. We did not make the repository overview page for this purpose, but we actually started using it a lot ourselves within the company, as a direct pathway to the development repository of each element. So this is a page that would be useful to other advanced developers as well!

So the repository overview page is split up into 4 main categories: Python-based software, C-based firmware, other languages and bootloaders. See the navigation tree for which repositories those approximately point to. By the way, have you noticed that the repository documentation has a gray header (like this one) and all the overview pages on the web have a green header (like this one)? These are meant to make you aware of whether you are still on a fluffy overview website page or going into the nitty-gritty details of the development documentation.

Repository-based navigation tree

Feature-based navigation?

A remaining problem is that the repository documentation might not be enough to get a good overview. Where do you need to look if you are interested in 'controllers' or 'state estimators', or in how to make an app-layer application? Currently all of this is within the STM32 firmware documentation, as that is exactly where all of it is implemented. But how do we document features like CRTP that span several elements, where not only the STM32 chip but also the NRF51, the Crazyradio PA and the Crazyflie Python library are involved? Or how about the Loco Positioning System, where the Crazyflie communicates through the LPS deck with a separate LPS node?

So perhaps a good way to present all this information is feature-based, like 'controllers', 'positioning' or 'high level commander', where we present a structure that points to parts of the detailed documentation within the repo docs. With the ecosystem-based, or even the repository-based, navigation strategy, it takes for instance 4-7 clicks to get to the specific controller page, as you can verify by looking at the breadcrumb in the header. Perhaps splitting the documentation up based on features instead of ecosystem elements or programming languages might be a more logical structure for the current state of the Crazyflie documentation.

Feedback

One reason why it is so difficult to do this properly is that we have a lot of repositories, one per microprocessor of each of our products, which makes our open-source projects quite unique. It is therefore difficult to find another open-source project we can take inspiration from. So let us know what you would prefer for navigating through our documentation in this poll, but we are always open to other suggestions! If you know of an example of a similar open-source software project that is doing it the right way, or have any other tips, send us an email (contact_at_bitcraze.io), contact us on social media or post a comment on this blogpost!

Update 2021-12-21:

The poll is closed and this is the result! Thanks all for responding!

Poll results for the question "What type of documentation navigation would you prefer on bitcraze.io?" (9 responses).

It feels like only yesterday that Arnaud, Tobias and Marcus came together in a backyard shed and decided to go all in on developing the Crazyflie 1.0 and starting a business. The result was Bitcraze, and after 10 years we are still here! Over the course of the next months, we will be releasing more information about the history of Bitcraze on the blog and social media.

But more importantly, we would like you to keep the 19th-21st of October open in your agenda for the grand party, because we will be organizing our own multi-day convention: BAM days, a.k.a. Bitcraze Awesome Meetup days.

Of course we did everything we could to arrive at the abbreviation BAM, as you can see :)

This event will be fully online and will be filled with (guest) talks and workshops, but most importantly: fika time, which is Swedish for coffee breaks. We really want to put the emphasis on networking and the coffee breaks, as we consider them the most important part of any conference, seminar or convention. So in between the talks, which will be in a video chat format like Zoom or Google Meet, there will be equally long coffee breaks in a spatial chat format like Gathertown, Mozilla Hubs or MiBo. We are currently browsing several online event portals to accommodate all of this and to make sure that everything goes smoothly!

We are currently still building up the program and inviting speakers to give talks and workshops. Moreover, we will probably prepare workshops and demos ourselves as well! So please fill in this interest form if you want to receive more information about the event, and give us some pointers on the content. Also check out the event page for the latest information.

And in the meantime, make sure to keep the 19th to the 21st of October free in your agenda so that you can come and celebrate our 10-year anniversary with us!

As you may have noticed, we have talked about the lighthouse positioning system a lot these last couple of months, ever since we got it out of early access. However, it is good to realize that it is not the only option out there for positioning your Crazyflie! That is why in this blogpost we will lay out the possible options and explain how they are different from and similar to one another.

The four possible ways to position the Crazyflie

Absolute Positioning / Off-board Pose Estimation

Absolute Positioning and External Pose Estimation with the MoCap System

The first one we will handle is the use of motion capture (MoCap) systems, which revolve around the use of infrared (IR) cameras and markers. We use the Qualisys cameras ourselves, but there are also labs out there that use Vicon or OptiTrack. The general idea is that the cameras have an IR-light-emitting LED ring, whose light is bounced back by reflective markers placed on the Crazyflie. These markers can therefore be detected by the same cameras, which pass the marker positions through to an external computer. This computer runs a MoCap program that turns these detected markers into a pose estimate, which is in turn communicated to the Crazyflie by a Crazyradio PA.

Since the position is estimated by an external computer instead of on board the Crazyflie, a MoCap positioning system is categorized as off-board pose estimation using an absolute positioning system. For more information, please check the Motion Capture positioning documentation.
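
To make that last step concrete, here is a minimal sketch of how the external computer could forward such a pose estimate to the Crazyflie with the Crazyflie Python library. The radio URI and the hard-coded position are placeholders; a real setup would stream the estimate from the MoCap program instead:

import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder address

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    for _ in range(1000):
        # In a real setup, x, y, z come from the MoCap software.
        x, y, z = 0.0, 0.0, 1.0
        scf.cf.extpos.send_extpos(x, y, z)  # feeds the on-board estimator
        time.sleep(0.01)  # stream at roughly 100 Hz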

Absolute Positioning / On-board Pose Estimation

Absolute Positioning and Internal Pose Estimation with the Lighthouse and Loco Positioning System

The next category is a bit different, and it consists of both the Loco Positioning System and the Lighthouse positioning system. Even though both use beacons/sensors that are placed external to the Crazyflie, the pose estimation is done entirely on board in the firmware of the Crazyflie. So no computer is necessary to communicate the position back to the Crazyflie. Remember that you do need to communicate the reference setpoints or high-level commands if you are not using the app layer.

Of course there are clear differences in the measurement types. A Crazyflie with the Loco deck attached measures the distances to the externally placed nodes using ultra-wideband (UWB), while the Lighthouse deck detects the light-plane angles emitted by the Lighthouse base stations. However, the principle is the same: those raw measurements are used as input to the extended Kalman filter on board the Crazyflie, which outputs the estimated pose after fusing them with the IMU measurements.
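
For reference, this fusion follows the standard extended Kalman filter measurement update; a simplified sketch (leaving out the firmware's specific state layout and measurement models) looks like:

K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1}
\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - h(\hat{x}_{k|k-1}))
P_{k|k} = (I - K_k H_k) P_{k|k-1}

Here z_k is a raw measurement (a UWB distance or a sweep angle), h(.) is the measurement model that predicts it from the state, H_k its Jacobian, P the state covariance and R_k the measurement noise.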

Therefore these systems can be classified as absolute positioning systems with on-board pose estimation. To learn more please read the Loco and Lighthouse positioning system documentation!

Relative Positioning / On-board Pose Estimation

Relative Positioning and Internal Pose Estimation with the Flowdeck V2.

It is not necessary to set up an external positioning system in your room in order to achieve a form of positioning on the Crazyflie. With the Flowdeck attached, the Crazyflie can measure the optical flow per frame with an optical flow sensor and the height in millimeters with a time-of-flight sensor. These measurements are then fused together with the IMU within the extended Kalman filter (see the Flow deck measurement model), which results in an on-board pose estimate.
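
As a rough sketch of what such a measurement model looks like (simplified here; the firmware version also compensates for the tilt of the Crazyflie), the predicted pixel flow along one axis relates body velocity, height and rotation rate:

\delta_x \approx \frac{\Delta t \, N_{pix}}{\theta_{pix}} \left( \frac{v_x}{z} - \omega_y \right)

where v_x is the velocity, z the height above the ground, \omega_y the pitch rate, N_{pix} the sensor resolution and \theta_{pix} its field of view. The filter compares this prediction with the measured flow to correct the velocity estimate.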

The most important difference to note here is that positioning estimated by only the Flowdeck will not result in an absolute position estimate but in a relative one. Instead of an externally placed system (like MoCap, Lighthouse or Loco) dictating where the zero position in XYZ is, the start-up position of the Crazyflie determines the origin of the coordinate system. That is why the Flowdeck is classified as a relative positioning system with on-board pose estimation.

IMU-only On-board Pose Estimation?

Oh boy… that is a different story. Theoretically it could be possible by using the onboard accelerometers of the Crazyflie and fusing those in some sort of estimator; however, practice has shown that the Crazyflie's accelerometers are too noisy to result in any good pose estimate… We haven't seen any work that has successfully achieved a stable hover on only the IMU of the Crazyflie, but if you have done or seen research that has, please let us know!

And if you would like to give it a go yourself and build an estimator that is able to do this, please check out the new out-of-tree build functionality for estimators. This is still work in progress so it might have some bugs, but it should enable you to plug in your own estimator separate from the Crazyflie firmware ;)

Documentation

We try to keep all the information about our positioning systems on our website. So check out the positioning system overview page to be referred to more details about any particular system that fits your requirements!

Now that the Lighthouse deck is out of early access and we have made it easier to set up a lighthouse positioning system, we are at the next stage: showing how awesome it is! We feel that there are not enough people out there who know about the Lighthouse positioning system, and sometimes it is even confused with the Loco Positioning System (to be honest, the abbreviation LPS makes it challenging). But we are confident that the Lighthouse system is a good alternative for those who want to do drone research on a tight budget.

The area of the data collection, from the paper

Lighthouse Dataset

During Wolfgang Hönig's time here at Bitcraze, one of the bigger projects we worked on together was to generate a dataset comparing the positioning quality of the Lighthouse system with a motion capture (MoCap) system. You can imagine that this would be a difficult task, since the lighthouse base stations transmit infrared light sweeps and MoCap cameras by default also emit IR light, which is reflected back by markers. However, with the Active Marker Deck for the Qualisys system, we were able to use the MoCap and Lighthouse positioning together without too much interference.

Moreover, Wolfgang also helped out with improving the logging quality on the Micro-SD-card deck, which enabled us to record as much data in real time as possible. He wrote a blogpost about event-based logging a few weeks ago, which is a new approach to recording data on the Crazyflie at a fast pace. With the Active Marker Deck, the Micro-SD-card deck and of course the Lighthouse deck, the Crazyflie turns into a full-blown positioning data-collection machine!

The configuration of the Crazyflie with the Micro-SD-card deck and the Lighthouse deck, from the lighthouse dataset paper

Paper

About this whole process, we wrote the following paper:
Lighthouse Positioning System: Dataset, Accuracy, and Precision for UAV Research,
A. Taffanel, B. Rousselot, J. Danielsson, K. McGuire, K. Richardsson, M. Eliasson, T. Antonsson, W. Hönig, ICRA Workshop on Robot Swarms in the Real World, arXiv 2021

This paper contains a short explanation of the lighthouse system, how we set up the data collection, and an analysis of the results, in which we compared both Lighthouse V1 and V2 using the crossing-beam (C.B.) method and the extended Kalman filter. In all cases, the mean and median Euclidean errors of the Lighthouse positioning system are about 2-4 centimeters compared to our MoCap system as ground truth.
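
For intuition: the crossing-beam method triangulates the position from the rays of two base stations. A common formulation (sketched here in general terms, not necessarily the paper's exact implementation) picks the point closest to both rays. With base station origins o_i and ray directions d_i reconstructed from the sweep angles:

(t_1^*, t_2^*) = \arg\min_{t_1, t_2} \| (o_1 + t_1 d_1) - (o_2 + t_2 d_2) \|
\hat{p} = \frac{1}{2} \left( (o_1 + t_1^* d_1) + (o_2 + t_2^* d_2) \right)

That is, the estimate is the midpoint of the shortest segment between the two rays, since with measurement noise the rays never intersect exactly.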

Check out the lighthouse dataset paper to read all the details of the experiments!

The Euclidean error of both LH1 and LH2 with MoCap as ground truth, taken from the dataset paper.

ICRA Swarm Workshop

Our paper has been selected for a poster presentation at the ICRA 2021 Workshop: Robot Swarms in the Real World. So if you have any questions about the paper, please join and ask us in person! The workshop will be held on the 4th of June.

Moreover, we are also sponsoring the event by giving away a Lighthouse Swarm Bundle to whoever wins the best video-demonstration award! So to all the participants: the best of luck! We are super curious about what you'll have to show us.

Ever since the AI-deck 1.x was released in early access, we've been excited to see so many users diving in and experimenting with it. The product is still in early access, where the hardware is deemed ready but the software and documentation still need work. Even so, we try to do our best to make the product as usable as possible. We're happy to see some of our users doing great stuff, like the PULP platform's latest paper "Fully Onboard AI-powered Human-Drone Pose Estimation on Ultra-low Power Autonomous Flying Nano-UAVs".

Firmware and Examples

The AI-deck is built around the GAP8 chip developed by Greenwaves Technologies. On their website there's an explanation of the development tools, which gives a general understanding of what you can use. Their GAP SDK documentation also explains how to install the SDK and try out some of their examples, both on a GAP8 simulator on the computer and on the GAP8 chip on the AI-deck itself.

We also host a separate repository with some AI-deck related examples that run with the GAP SDK.

Documentation and Support

Recently we also added the AI-deck documentation to the Bitcraze website, generated from the documentation already available in the GitHub repository. There are still improvements to make, so if you find any issues or anything missing, don't hesitate to help us improve it. At the bottom of each page there is an 'improve this page' link where you can propose a change, or you can notify us by posting on the issue list of the AI-deck repository.

Also note that we have a separate AI-deck category on our forum where you can search for or add any AI-deck related questions. Remember that posting the issues you are having will also help us improve the platform and hopefully soon get it out of early access.

Workshop by PULP-platform

On the 16th of April we hosted a workshop given by the PULP platform, featuring Greenwaves Technologies. In the workshop, an overview of the AI-deck and the GAP8 was given, followed by some basic hands-on exercises. About 70 people joined the workshop and we were happy it was so well received.

The workshop is a great source of information for anybody who is just getting started with the AI-deck, so have a look at the recordings on YouTube and the slides on the event page. Also make sure to check out their PULP training page for more tools that can also be used on the AI-deck. A big thanks to the PULP platform and Greenwaves Technologies for taking part in the workshop!


Also, we would like to ask anybody who joined the workshop to fill in this small questionnaire, so that we can get some more feedback on how it went and how we can improve for the next one.

A few weeks ago we released version 2021-03, including the Python library, the Cfclient and the firmware. The biggest feature of that release was that we (finally) got the lighthouse positioning system out of early access and added it as an official product to the Crazyflie ecosystem! Of course we are very excited about that milestone, but the work does not end there… We also need to communicate how to use it, its features and where to find all this new information to you, our favourite users!

New Landing Page

First of all, we made a new landing page just for the lighthouse system: similar to bitcraze.io/start, we now also have bitcraze.io/lighthouse. This landing page is what will be printed on the Lighthouse base station box that will soon be available in our store, but it is also directly accessible from the front page under 'Product News'.

This landing page has all kinds of handy links which direct the user to the getting started tutorial, the shop page, or to its place within the different positioning systems we offer and support. It is meant to give a very generic first overview of the system without overloading the reader right off the bat, and we hope that the information funnel will be smoother with this landing page.

New tutorial and product pages

For getting started with the lighthouse positioning system, we heavily advise everybody to follow the new getting started tutorial page, even if you have used the lighthouse system since its early access days. The thing is that the procedure for setting up the system has changed drastically. The calibration data and geometry are now stored in persistent memory on board the Crazyflie, and the lighthouse deck itself is now properly flashed. So if you are still using a custom config.mk, hardcoding geometry in the app layer or using get_bs_geometry.py to get the geometry… stop what you are doing, update the Crazyflie firmware, install the newest Cfclient and follow the tutorial!

We have also already made a product page for the Lighthouse Swarm bundle. Currently it is still marked as coming soon, but you can already sign up to get a notification when it is out, which we hope will be in about 1-2 months. The lighthouse deck is of course already available for those who cannot wait and want to buy a SteamVR base station somewhere else. Just keep in mind that, even though v1 is supported, in the future we will mostly focus on version 2 of the base stations.

Video tutorial

Once again we have ventured into the land of videos and recorded a “Getting started with the Lighthouse positioning system” tutorial for those who prefer video over text.

Feedback

We love feedback and want to improve! Please don’t hesitate to contact us on contact@bitcraze.io if you have comments or suggestions!

When I started my Robotics Master's back in 2012, I remember how frustrated I was at the time trying to set up my development environment in Windows for the C++ beginners course. My memory is a bit fuzzy of course, but I remember it took me days to get all the right drivers and g++ libraries in place, and to set up all the path environment variables in Eclipse. Once I started working on Ubuntu for my Master's thesis, forced to because of ROS, I was hooked and swore I would never go back to Windows for robotics development again… until now!

I have always used Windows on my personal machine on the side for gaming, and I have a dual boot on the work computer for some occasional video editing, but especially since I began to learn game development for Fun Fridays, I was drawn to the Windows side of the dual boot more and more. But if I needed to try something out on the Crazyflie or debug a problem on the forum, I had to restart the computer to switch operating systems, and that was starting to become a pain! Slowly but steadily I tried out several aspects of the Crazyflie ecosystem for development on Windows 10 and actually… it is not as traumatic as it was almost 10 years ago.

Python Library and Client

It went quite smoothly when I first tried to install Python on Windows again. Adding it to the PATH environment variable is still very important, but luckily the new install manager provides that as an option. Moreover, Visual Studio Code also provides the possibility to switch between Python environments so that you can try different versions of Python, but for now version 3.8 was plenty for me.

With the newest versions of the Windows install of Python, pip is already installed by default, but I found that it was still necessary to upgrade pip by typing the following in either a Command Prompt or (my favorite) PowerShell:

python -m pip install --upgrade pip

After this, installing the cflib from the release was quite easy ('pip install cflib'), but even installing it from source, with Git configured on Windows, was no problem at all and very similar to a native Ubuntu install.

Until recently, the cfclient was a bit more challenging to install through pip, since the SDL2 Windows library had to be downloaded separately, so the only options were installing from source or the .exe application release. The latter had not been updated since the 2020.09 release due to building errors. Luckily, with the latest release this has now been fixed, as an SDL2 Python library was found. Now the cfclient can be installed with a simple 'pip install cfclient'.
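
By the way, once the cflib is installed and the Crazyradio driver is set up with Zadig, a quick way to check that everything works is a small scan-and-connect script. A minimal sketch (which URI shows up depends on your Crazyflie's radio settings):

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

# Initialize the low-level drivers (Crazyradio, USB)
cflib.crtp.init_drivers()

# List the Crazyflies the radio can see
available = cflib.crtp.scan_interfaces()
for uri, _ in available:
    print('Found Crazyflie on', uri)

# Connect to the first one found, just to verify the link
if available:
    with SyncCrazyflie(available[0][0], cf=Crazyflie(rw_cache='./cache')) as scf:
        print('Connected to', scf.cf.link_uri)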

Firmware Building with WSL

Firmware development was the next thing I tried to get up and running, which turned out to be slightly trickier. About a year ago I tried to get Cygwin to work on Windows, but my bad memories of the past came back due to the clunkiness of it all, and I abandoned it again. There are also some reported issues with the out-of-tree build (a.k.a. the app layer). Some colleagues at Bitcraze had already mentioned the Windows Subsystem for Linux (WSL), but I never really looked at it until the need came to move back to Windows for development. And I must say, I wish I had tried it out a while ago.

With some repositories already downloaded on my Windows system with Git, I installed Ubuntu 20.04 under WSL, got the appropriate gcc libraries and accessed the crazyflie-firmware with 'cd /mnt/c/my/repos'. Building the firmware with 'make all' went pretty okay… although it took about a minute, which is a little long compared to the 20 seconds on a native Ubuntu install. The big problem was that I could not use Docker and the handy Bitcraze toolbelt, because the WSL version was still 1. These functionalities are only available for version 2, so I went ahead and upgraded the WSL and linked it to Docker Desktop. But after upgrading, building the firmware from that same repository on the C:/ drive took insanely long (almost 10 minutes). So I switched the WSL Ubuntu 20.04 back to version 1, installed a second WSL distribution (this time Ubuntu 18.04), updated that one to WSL 2 and used it solely for Docker and toolbelt purposes. Not quite ideal yet… but luckily with Visual Studio Code it is very easy to switch between WSL distributions.

But there is more! Recently I timed how long it takes to build the crazyflie-firmware with 'make all -j8' from both WSL versions, in a repository on the C:/ drive on Windows (accessible via /mnt/c from WSL) and in a repository on the local WSL file system:

  • WSL 2 (Ubuntu 18.04)
    • C:/ = 11m06s
    • WSL local = 00m19s
  • WSL 1 (Ubuntu 20.04)
    • C:/ = 01m04s
    • WSL local = 00m59s

This was done on a Windows laptop with an i7-6700HQ and 32 GB RAM. The difference in WSL 2 between building the firmware on the Windows file system and on the local WSL file system is huge! So the right way is to use WSL 2 with the repo on the WSL file system, which gives a build time similar to a native install of Ubuntu.

Flashing the Crazyflie

The main issue remaining with WSL is that it does not allow USB access… So even if the Crazyradio driver is installed on Windows with Zadig, you will not see the radio if you type 'lsusb' in WSL, for both version 1 and 2. So when I still had the repository on the C:/ drive and built the crazyflie-firmware from there, I could flash the Crazyflie through the Cfclient or the Cflib (with cfloader) from PowerShell; but building it on the local subsystem, which is way faster for WSL 2, requires first copying the cf2.bin file to the C:/ drive before doing that.

Another option, although still in an alpha phase, is to use the experimental Crazyradio server for WSL made by Arnaud, for which the user instructions can for now only be found in an issue thread. The important thing is that the Zadig-installed driver has to be switched to WinUSB, and switched back again to libusb if you want to use the Cfclient on Windows. It still needs some work to improve the user experience, but it holds promise for better integration of WSL development for the Crazyflie.

To Conclude

Soon I’m planning to soon reinstall the Windows part on the dual boot laptop but there are already some things that I will integrate on my freshly installed Windows based on what I experienced so far:

  • Keep using Python on Windows and install the Cfclient and Cflib by pip only.
  • Only use Ubuntu 20.04 as WSL 2 and keep the crazyflie-firmware repository on the local WSL file system.
  • Use Visual Studio Code as the editor for both C:/-based and WSL-based repos.
  • Use the Crazyradio server or copy the bin file to C:/ whenever I need to flash the Crazyflie with development firmware.

For any AI-deck development, I would still need to use the native Ubuntu partition or the Bitcraze VM, since there is no USB access or server yet for the programmer. However, if Windows were to support USB devices and a graphical interface for WSL, that would make all our Windows-based Crazyflie development dreams come true!

About a month ago we released the AI-deck 1.1, which has some slight upgrades and changes compared to the 1.0. Even though the AI-deck 1.1 is still in early access, we do see the number of support questions increasing on our forum and in the issue list of the AI-deck example repository. Therefore we are planning to host an AI-deck getting-started workshop, given by the PULP lab, on the 16th of April at 14:00 (Central European Summer Time)!

PULP

The PULP lab has done a lot of amazing research in the field of edge ML and was one of the collaborators in the development of the AI-deck, for which their work on the PULP Shield was the main inspiration. For more information check out their guest blogpost, and be sure to read their latest work on the AI-deck!

Moreover, they have been working on open-source tools that also work on the GAP8 chip, which are must-tries for any AI-deck user.

All in all, since they clearly know what they are talking about, they are more than qualified to teach the rest of us how to work with all this! Also check out Luca Benini's keynote at RISC-V, or this week at the TinyML summit, if you would like to learn more about PULP!

Workshop date and content

The workshop will be tailored to those who have just started to work with the AI-deck; however, we think it will be interesting for regular users as well. Note that the tools mentioned above will not be covered this time.

These are the topics that will be discussed:

  • Hardware explanation (GAP8 specifics and the AI-deck)
  • Software preliminaries (GAP SDK, VM)
  • Hands-on examples

The workshop will take approximately 2 hours and will be held on the 16th of April in the afternoon; the exact specifics will be given at a later date. So make sure to already block it in your calendar and to sign up for more information!

Sign up for more information

You can sign up to receive more information by entering your email address in this Google form. We will also keep you up to date on our Discord channel and the event page.

It has been a while since we last gave an update on the AI-deck! However, as announced in the New Year post, we do have some important changes coming up for the next batch of production. These changes include an updated version of the GAP8 chip and a switch to a gray-scale camera. We will call this version the AI-deck 1.1, but its firmware should be compatible with the previous version. Currently we don't have a clear timeline of when we will receive the next batch, but it will be soon.

Newer version of GAP8

Together with our collaborators at Greenwaves Technologies, we decided to update the GAP8 chip to the latest version available. Since this required a microchip change on the product, and to prevent any confusion, we decided to call this version the AI-deck 1.1.

The new version of the GAP8 chip includes several improvements, which resolve (1) a conflict problem between the UART and the HyperBus, and (2) a cluster DMA issue when doing data transfers between the L2 and L1 memory. The second issue is considered a corner case that will only occur if the AI-deck 1.0 is pushed to its limits with a large deep neural network. Although we expect that both issues will not affect current users, if you are experiencing any related problems with the previous version (AI-deck 1.0), please send us an email at contact@bitcraze.io.

Gray-scale Camera

In July we gave an update in which we discussed the color camera and how to process its images on the GAP8 chip. Even though at first we were pretty optimistic about the possibilities with the camera, after discussing with the community, ETH Zürich and Greenwaves Technologies, we have decided to switch back to a gray-scale camera.

This is because most examples in the gap_sdk repository assume a gray-scale camera. And even though the color camera would be good for image processing of anything with color, it is less ideal for neural networks on edge machine learning devices.

For those who committed themselves to the color camera module, do not worry! We are planning to still offer the color camera module as a separate product in our store! Also, we will have a limited number of gray-scale cameras available for those who are unhappy with their AI-deck 1.0's color camera.

Firmware, Docs and Examples

Since the AI-deck is still in early access, we have all the code and documentation in this GitHub repository. This contains all the start-up guides and keeps track of all the bugs and fixes. Some months ago, we managed to update all of this to the latest GAP SDK (3.8.1), which fixed the streamer issues we were having.

However, the quickest way to improve this early access product is to get feedback from its users. So if you are having any problems at all, do not hesitate to ask a question on the forum or to open up an issue in the repository!