
It has been about a month since the AI-deck became available in Early Access, and quite a few of you now own an AI-deck yourselves. There is a new development we would like to share: we previously thought we had selected a gray-scale image sensor. However, it came to our attention that the camera actually contains a color image sensor, which, on second viewing of the video presented in this blogpost, is pretty obvious in hindsight (thanks PULP project ETH Zurich for letting us know!).

A color image from the AI-deck

This came as a bit of a surprise, but a color camera also adds new possibilities, like making the Crazyflie follow an orange ball, or training the CNNs to incorporate color in their classification as well. The only catch is that it requires an extra preprocessing step to retrieve the color image, which is explained in the next section.

Demosaicing

Essentially all CMOS image sensors are gray-scale by nature. In order to retrieve color from a scene, manufacturers add a Bayer filter on top of the image sensor, so that each pixel only receives red, green or blue light. Such a color filter array does not have to be RGB, all kinds of color combinations exist, but here we will only talk about the Bayer filter. If the pattern of the filter is known, the pixels related to a certain color can be interpolated with each other in order to fill in the gaps in between. This process is called demosaicing, and it creates the RGB channels that are combined into a color image.

Process of demosaicing with a Bayer filter

Currently we have only implemented a simple nearest-neighbor interpolation scheme for demosaicing, which is fine for demonstration purposes but is not the best technique out there. Such a simple interpolation is not very ‘edge and detail’ aware and can therefore cause artifacts, like the Moiré effects seen below. We are still experimenting with how to get a better image and how to translate that to all the examples of the AI-deck example repository (see this issue if you would like to follow or take part in the discussion).

Moiré effect
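To give an idea of what the nearest-neighbor scheme boils down to, here is a rough Python/NumPy sketch (this is not the exact code running in the examples; the BGGR pattern, the even 320×320 resolution and the luma weights are assumptions you should check against your own sensor):

import numpy as np

def demosaic_nearest_neighbor(raw, pattern='BGGR'):
    # Naive nearest-neighbor demosaicing of a raw Bayer image.
    # Assumes even image dimensions (e.g. 320x320) and a known pattern.
    offsets = {
        'BGGR': {'r': (1, 1), 'g': (0, 1), 'b': (0, 0)},
        'RGGB': {'r': (0, 0), 'g': (0, 1), 'b': (1, 1)},
    }[pattern]
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    for ch, (dy, dx) in enumerate([offsets['r'], offsets['g'], offsets['b']]):
        # Pick the sub-grid of pixels that actually saw this color and
        # repeat each value over its 2x2 Bayer cell (nearest neighbor).
        # Note that one of the two green pixels per cell is simply ignored.
        sub = raw[dy::2, dx::2]
        rgb[..., ch] = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)
    return rgb

def to_grayscale(rgb):
    # Standard luma weighting, relevant for the gray-scale discussion below.
    return (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]).astype(np.uint8)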

So technically, once we have the color image, it can be converted to a gray-scale image which can be used in the examples as-is. However, there is a reduction in quality, since the full pixel resolution is not used to obtain the gray-scale image. We are currently discussing whether it would be useful to get the gray-scale version of this camera and make it available as well, so let us know if you would be interested!

Feedback and Early Access

Like we said before, there are now quite a few of you out there that have an AI-deck in your possession. As it is in Early Access, the software part is still in full development. However, since we have not received any negative feedback from you, we believe that everything is fine and peachy!

Just kidding ;) we know that the AI-deck is quite a challenging deck to work with, and we know for sure that many of you have questions or something to say about working with it. Buying an Early Access product also comes with a little bit of responsibility: the more feedback we get from you, the more we can tailor the software and support to help you and others, thereby advancing the product and getting it out of the Early Access phase.

So please let us know if you are having any trouble starting up, or if there are any issues with the examples or the documentation of the AI-deck repo, by posting a thread on the forum (we have a special AI-deck group!). We, and also our collaborators at Greenwaves Technologies (makers of the GAP8 chip), are more than happy to help out. That is what we are here for :)

The AI-deck is now available in our online store! Super-edge-computing is now possible on your Crazyflie thanks to the GAP8 IoT application processor from GreenWaves Technologies. GAP8 delivers over 10 GOPS of compute power at exceptionally low power consumption, enabling complex tasks such as path-finding and target following on the Crazyflie while consuming less than 0.1% of the total energy.

The AI-deck can host artificial intelligence-based workloads like Convolutional Neural Networks onboard. This will open up many research areas focusing on fully onboard autonomous navigation of tiny MAVs, like ETH Zurich’s PULP-Dronet. Moreover, there is also ESP32 WiFi connectivity with the possibility to stream the images to your personal computer.

We are happy that we managed to get everything ready so soon after our last update. The Crazyflie AI-deck is in early access, which means the hardware design is now finalized and full support for building, running and debugging applications (including GreenWaves’ GAPflow tools for porting neural networks from TensorFlow to GAP) is available; however, only a limited set of AI-deck-specific examples has been developed so far. Read more about early access here. Even though there is still some work to be done, there are already some examples you can try out, which we explain in this blog post. We also aim to have all AI-decks pre-flashed with the WiFi streamer example so that you can check right away that your AI-deck is working.

Beware that you need a JTAG-enabled programmer/debugger in order to develop for the AI-deck!

Technical specifications

The AI-deck comes with two elongated male pin-headers, which enable the user to connect it to the Crazyflie together with an additional deck. Two 10-pin JTAG connectors are soldered on, which enable connection with a JTAG-enabled programmer. This will be the main way to program the GAP8 chip and the ESP-based NINA module while the deck is still in early access.

Getting started

When you first receive your AI-deck, it should be flashed with a WiFi streamer example for the camera image stream. Once the AI-deck is powered up by the Crazyflie, it will automatically create a hotspot called ‘Bitcraze AI-deck Example’. In this repo, in the folder named ‘NINA’, you will find a file called viewer.py. If you run this with Python (preferably version 3), you will be able to see the camera image stream on your computer. This confirms that your AI-deck is working.

The next step is to go to the docs folder of the AI-deck examples repository. Try out the WiFi demo and set up your development environment with the getting-started guide. This guide contains links to the GAP-SDK documentation from GreenWaves Technologies. You can read more about the face detector example that we demonstrated in this blog post.

It has been a while since we have updated you all on the AI deck. The last full blogpost was in October, with some small updates here and there. It is not that we have not focused on it at all; on the contrary, this has been a high-priority project for a while now. It is just quite a complex board with a lot of bells and whistles, which can be challenging to work with so early in development, something our previous intern can definitely agree on. We therefore wanted to wait until we were able to make sufficient progress before giving you an update… and so we have!

A Crazyflie 2.1 with the AI deck

Together with Greenwaves Technologies we have been trying to get the SDK of the GAP8 chip on the AI deck stable enough for an early release. The latest release of the SDK (version 3.4) has proven itself to work with relative ease on the AI deck after extensive testing. Currently it is possible to use OpenOCD for flashing and debugging, and it supports most commonly available debuggers with a JTAG connector. In the upcoming weeks both Bitcraze and Greenwaves will test all examples of the SDK on the AI deck to make sure that everything is still compatible. The documentation will be extended as well. As there is so much to document, it might be difficult to catch all of it, so if you notify us and Greenwaves about anything that is missing once the AI deck is out, that will help us catch the knowledge gaps.

The AI deck also contains the ESP-based NINA module for establishing a WiFi connection. This enables users to stream the video feed of the AI deck to their computers, which will be quite an essential tool if they would like to generate their own image database for training CNNs for the GAP8 (and it happens to also be quite practical for debugging, by the way!). Currently you are required to set the credentials of your local WiFi network and reflash the AI deck to be able to connect and stream images, but we are working on turning the NINA into an access point instead, so no reflashing would be required. We hope to be able to implement this before the release.

Top view of the AI deck

We are also adjusting applications to make them suitable for the AI deck. For instance, we have adapted Greenwaves’ face-detector example to use the image streamer instead of the display available on the GAPuino boards. You can see a video of the result below. Beware that this face-detector is not based on a CNN but on HOG descriptors, so it only works in good conditions where the face is well lit. However, it is possible to train a CNN to detect faces in TensorFlow and flash it onto the AI deck with the GAPflow framework developed by Greenwaves. At Bitcraze we haven’t managed to try that out ourselves (we are close though!), but at least this example is a nice demonstration of the AI deck’s abilities together with the WiFi streamer. This example and more testing code can be found in our experimental repo here. For examples of GAPflow, please check out the examples/NNtool section of the GAP8 SDK.

For some reason WordPress has difficulty embedding the video that was supposed to be here, so please check https://youtu.be/0sHh2V6Cq-Q

Seeing how the development has been progressing, we are comfortable saying that the AI deck could be ready for early release sometime in the next month, so please keep an eye on our website! We will continue to test the GAP SDK’s stability, and we are very thankful to Greenwaves Technologies for their help so far. We will also work on getting-started guides to help you get acquainted with the AI deck, supplementing the already existing documentation about the GAP8 chip.

Even though the AI deck will soon be ready for early release, this piece of hardware is not for the faint-hearted, and embedded programming experience is a must. But keep in mind that the possibilities with the AI deck are huge, as it means that super-edge-computing on a 30-gram flying platform will be available to anyone. It will all be worth it when you have your Crazyflie flying autonomously while recognizing its surroundings :)

In this blog post we wanted to give you an overview of our running projects and a general status update! We have settled into our home labs and are working on many projects in parallel. There is a lot of development happening at the moment, but the general feeling is that we do miss working with each other at the office! With our daily Slack Bitcraze sync meetings and virtual fikapause (Swedish for coffee breaks), we try to substitute what we can. In the meantime, we are on a roll finishing the goals we set at our latest quarterly meeting, so here you can read about those developments.

AI-deck

Crazyflie with AI-deck

The last time we gave an update about the AI-deck was in this blog post and in the final post of our intern Zhouxin. Building on his work, we are now refocusing on getting the AI-deck ready for early release. The last hurdle is mostly software-related, for which we are considering several approaches together with Greenwaves Technologies, the manufacturer of the GAP8 chip. Currently we are preparing small test functions as examples of the different elements of the AI-deck in our repo, which are all still in a very preliminary phase.

Even though we still need some time to finalize the AI-deck’s early release, we will consider sending an early version of the AI-deck if you are willing to provide feedback while working with it. Please fill in the form and we will get back to you.

Lighthouse

We have made quite some progress on the development of the Lighthouse V2. Kristoffer has been working hard from his home lab to get a seamless integration of both V1 and V2 in our firmware (check out this GitHub issue for updates). Currently it is still very much untested and in progress, but we do have a little preview for you to enjoy.

Crazyflie with LH basestation v2

Documentation

Right now, we are also doing a lot of revamping of our large web of documentation. Unfortunately this is a lot of work! As you may have noticed by now, we have added overview pages to guide the reader to the right information. We have also moved the tutorials to another part of the menu to avoid clutter on our website. In general we try to go through the repository docs to see if there is any information missing or outdated, but please let us know if you have encountered an error in any description or are missing crucial elements.

Our latest task is revamping the product pages as well, by putting all the necessary information about the hardware in just one place. Also, we are planning to make (video) tutorials soon about many elements of the Crazyflie and how to work with it. More about that later!

Production and Shipment

Production at our manufacturers in China is slowly starting up again. Although it is not yet back at full force, it does enable us to start ordering to replenish our stock and to get started with finishing our test rigs. Moreover, we are also negotiating to resolve the propeller issue we mentioned earlier, but there is no update on that so far.

As mentioned in this blogpost, we are still shipping orders about twice a week. Both DHL and FedEx are functioning as normal, but we do notice a delay of a few extra days on some deliveries. Please keep that in mind when ordering from our webshop.

We have mentioned the Active Marker deck in an earlier blog post, and are now happy to announce that it has been released and is available in our store.

Crazyflie with Active marker deck

By changing the passive, reflective markers to active IR LEDs, it is possible to improve the detection of the markers in the cameras. There are two main reasons: the area of the marker is smaller and easier to separate from other markers close by, and the LEDs emit light and can therefore be detected further away.

The deck has been developed in collaboration with Qualisys and, together with the QTM system, it utilizes their Active Marker technology. An ID is assigned to each marker, and since the identity of each marker can be detected by the MoCap system, it is possible to estimate the full body pose of the Crazyflie without unique marker positions or known starting positions. IDs are easily assigned using the parameter subsystem of the Crazyflie.
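As a rough illustration of what that can look like from the Crazyflie python lib (the parameter group and names below, e.g. ‘activeMarker.front’, are assumptions for illustration; check the deck’s documentation for the real ones):

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'

# Marker IDs to assign to each of the four LEDs (hypothetical parameter names).
marker_ids = {
    'activeMarker.front': 1,
    'activeMarker.right': 2,
    'activeMarker.back': 3,
    'activeMarker.left': 4,
}

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    for name, marker_id in marker_ids.items():
        scf.cf.param.set_value(name, marker_id)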

Even though the deck is mainly intended to be used with Qualisys MoCap systems, the LED markers can also be configured to be on or off, which we hope might be useful in other applications as well.

This is it. The end of my internship. It feels strange to leave this unique office in a place called Malmö. My time spent here was more than just doing an assignment as part of an MSc degree, with the objective of gaining working experience and contributing to a company.

My last day at the office of Bitcraze, Arnaud was already on parental leave

My time here gave me so much more. Here I have learned a healthy way of thinking and problem solving, which is part of the unique Bitcraze company culture. Next to that, it felt more like working with friends than just working with colleagues. Going to the office is a delight, as there is always humor, openness and honesty. I got to know everyone and enjoyed the French, Swedish and Dutch-American hospitality and culture.

At this point you might think that I have only been drinking coffee and making sure the coffee in the office never ran out. Luckily that was not the case. I had the privilege of being the first user of a new deck. This deck has been in development for quite some time now and has been touched upon in some earlier blog posts. It is the yet-to-be-released AI-deck! At the moment the early-access AI-decks are delayed due to the COVID-19 virus. Bitcraze will update you on the blog when they know more.

My task within Bitcraze, in more detail, was to improve the user friendliness of the AI-deck by providing a framework for future users, and at the same time to explore the user friendliness of the whole ecosystem around the AI-deck from the perspective of an engineering student with beginner experience in embedded programming (i.e. me).

On the verge of giving the Crazyflie some AI capabilities, while being micromanaged.

So my mission began. A logical first step was to see if the convolutional neural network from the PULP-DroNet project would run on the AI-deck and fly with the Crazyflie, as the AI-deck is an evolution of the PULP-Shield developed for that project. More information about this can be found here.

Unfortunately, this was not an easy feat, as the PULP-DroNet project uses the pure version of the PULP SDK and an outdated autotiler, while the development partner for the AI-deck, Greenwaves Technologies, uses the PULP SDK as a base with added functionality in their own SDK, which makes it diverge from the SDK used in the PULP-DroNet project.

Still, I was able to run the convolutional neural network in a simulated environment and compare it to the original DroNet, which was implemented in Python and flown on a Bebop. It was interesting to find out that the convolutional neural network of PULP-DroNet behaves differently than the original DroNet in Python. There can be many explanations for this, but the main hypothesis is that this is caused by quantizing the network of PULP-DroNet from 32-bit floating point to 16-bit fixed point. In addition, the aforementioned network was trained on a larger dataset which included data captured by a Himax camera.
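To give a feel for what going from 32-bit float to 16-bit fixed point does, here is a toy illustration (this is not PULP-DroNet code, and the Q3.12 format is just an assumed example):

import numpy as np

FRAC_BITS = 12  # assumed Q3.12 format, purely for illustration

def to_fixed(x):
    # Quantize 32-bit floats to 16-bit fixed point.
    return np.round(np.asarray(x, dtype=np.float32) * (1 << FRAC_BITS)).astype(np.int16)

def to_float(q):
    # Back to float for comparison.
    return q.astype(np.float32) / (1 << FRAC_BITS)

weights = (np.random.randn(1000) * 0.5).astype(np.float32)
error = np.abs(weights - to_float(to_fixed(weights)))
print('max quantization error:', error.max())  # on the order of 2**-13

The per-value error is small, but it accumulates through the layers, which can be enough to shift the network’s steering and collision-probability outputs.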

A single Crazyflie obtained self-awareness and spun up a swarm of Crazyflies to gain world domination

While porting PULP-DroNet to the AI-Deck should be possible, the obstacles found along the way made it too troublesome and out of scope for my internship. So I moved on to the main objective: making a framework/example for the AI-Deck using the SDK provided by Greenwaves Technologies, the GAP8 SDK. It contains a set of tools that should make using the AI-Deck easier, namely the NNTool and the Autotiler. These tools automate the conversion of a neural network that is designed and trained in Python (TensorFlow and Keras) into code that can utilize the GAP8’s functionality.
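As a sketch of where that toolchain picks up (the tiny network below is a placeholder, the 200×200 input size is an arbitrary choice, and the NNTool/Autotiler invocations themselves are documented in the GAP8 SDK rather than shown here):

import tensorflow as tf

# Placeholder CNN designed and trained in Keras.
model = tf.keras.Sequential([
    tf.keras.layers.Input((200, 200, 1)),
    tf.keras.layers.Conv2D(8, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')
# ... model.fit(...) on your own dataset ...

# NNTool can work from a TensorFlow Lite graph, so export the trained model:
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open('model.tflite', 'wb') as f:
    f.write(converter.convert())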

My internship came to an end before I could overcome the last hurdle for a working example. To still bring this example to you, I have committed the docs/code I wrote and handed over the knowledge I accumulated throughout my internship working with the AI-Deck and its environment to the capable minds of Kimberly and Tobias.

Along the way I have learned a lot about embedded programming and being a first product user. Embedded programming, and programming in general, also comes with a different mindset than the conventional planning- and deadline-driven mindset you get from university. With these valuable lessons in mind, I will be heading back to TU Delft to start my master thesis on either reinforcement learning for aircraft or dense optical-flow networks for quadcopters. Thank you Bitcraze for your time, experience and hospitality!

 

I started working with the Crazyflie 2.0 in 2015. I was interested in learning how to program a quadcopter, and the open-source nature of the Crazyflie’s hardware and software was the perfect starting point.

Shortly after, I discovered the world of FPV and the thrill of flying with a bird’s eye view. My journey progressed from rubber-banding an all-in-one camera/VTX to my Crazyflie, to building a 250mm racing quad (via the BigQuad deck), and into the world of Betaflight (including bringing Betaflight support to the Crazyflie hardware).

 

Naturally, the announcement of the Bolt (then known as the RZR) piqued my interest, and the folks at Bitcraze graciously allowed me early hands-on with the product.

This post details my progress towards building an FPV-style drone on top of the Crazyflie Bolt.

Component List

The FPV community has come a long way since 2015. What was once a very complicated process is now well documented and similar to building a PC (well, with some soldering). For the latest details on the specifics of building FPV drones, I recommend resources such as Joshua Bardwell or the r/Multicopter subreddit.

Turns out I had enough components lying around for a 4-inch (propeller diameter) build based on 3S (3 cell) LiPo batteries. Again, there’s nothing special about these parts (in fact they’re all out of date). Take this list as a guide, and do your own research.

  • PDB (Power Distribution Board): This is a circuit board that produces regulated voltages from an unregulated LiPo battery. The Bolt has built-in regulators but is only rated for up to 8A of current draw per motor. My 4-inch propellers will certainly draw more than 8A, so an external PDB is required (plus having dedicated 12V and 5V supplies is nice for peripherals).
  • 4x DYS 1806 Brushless Motors: Brushless motors use magnetic pulses to rotate a motor bell (distinct from brushed motors found on the regular Crazyflie).
  • 4x DYS 20A BLHeli_S ESCs (Electronic Speed Controller): This is a piece of circuitry that accepts a logic-level control signal and applies direct battery power to motor coils to make a brushless motor spin. They have to be rated for the current draw expected by the battery+propeller combination.
  • Tweaker (by Shendrones) Frame: I’ve been wanting to build a quad around this frame, and the large square hole is interesting for the Bolt (more on that later). One thing to keep in mind is this is an ‘H’ style frame. That is, it’s longer than it is wide, so flight will not be perfectly symmetrical. If you’re interested in building a larger Crazyflie and not so interested in FPV, you’ll definitely want a symmetrical ‘X’ style frame.
  • WS2812B addressable LEDs: LEDs are proven to make things better. It’s science.
  • Camera + VTX: For a full FPV setup, you’ll need a camera and a video transmitter. For the most part these run completely independently of the flight controller and so I’ll omit them from this article — what I’ve shown in the picture above is horribly out of date anyway.
  • RX: Radio receiver. For longer range flights and reduced latency it may be a good idea to use an external radio and UART-based receiver with diversity antennas. However, some specific work went in to the Bolt’s antenna design, so I’ll be sticking with the on-board NRF51 and external antenna.
  • Flight Controller: The Crazyflie Bolt!

The Build

Again, there are hundreds of fantastic guides on the web that detail how to build an FPV quadcopter. Instead of trying to create another, here are some notes specific to my Bolt build.

Expansion Decks

Since the Bolt is pin compatible with the Crazyflie, I thought it would be interesting to try and take advantage of a couple existing Crazyflie expansion decks in my build: The LED Ring Deck, the Flow Deck v2, and the Micro SD Card Deck.

The LED Ring Deck

The LEDs were the most hands-on feature to enable. Rather than simply attaching the LED ring inside the frame, I mounted a series of WS2812B lights to the underside of my frame’s arms. The LED Ring Deck consists of 12 LEDs connected in series — so I put three LEDs on each arm of the frame and wired them up in a daisy-chain.

Finally, I soldered the lead to IO_2 (the same that’s used by the LED Ring Deck) on a Breakout Deck.

Since this isn’t the official LED Ring Deck, there’s no OW memory ID. The deck must be force-enabled by specifying a compile flag in your tools/build/make/config.mk file:

CFLAGS += -DDECK_FORCE=bcLedRing

With the custom firmware, the under-arm LEDs work just like the LED Ring Deck (other than the lack of front-facing LEDs).
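With the driver force-enabled, the LEDs can also be driven from the Crazyflie python lib just like a real LED Ring Deck, e.g. by selecting the solid-color effect through the parameter framework. A minimal sketch, assuming the stock LED-ring parameters (‘ring.effect’ and ‘ring.solid*’) and effect number are unchanged in your firmware version:

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

cflib.crtp.init_drivers()
with SyncCrazyflie('radio://0/80/2M/E7E7E7E7E7', cf=Crazyflie()) as scf:
    scf.cf.param.set_value('ring.effect', 7)       # solid-color effect
    scf.cf.param.set_value('ring.solidRed', 0)
    scf.cf.param.set_value('ring.solidGreen', 255)
    scf.cf.param.set_value('ring.solidBlue', 0)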

Micro SD Card Deck

Most popular flight controllers feature flash storage or SD card slots for data logging. The FPV community uses storage to log sensor data for PID tuning and debugging. Naturally, this deck is a good fit on my Bolt build, and requires no additional modification.

Flow Deck (v2)

Remember my interest in the square cutout in my frame of choice? That, and my unorthodox choice to mount the Bolt board below my PDB, mean I can theoretically use the bottom-attached Flow Deck to achieve some lateral stabilization while close to the ground. In theory, the VL53L1x ranger should work outdoors thanks to its use of 940nm light as opposed to 850nm.

Note: This photo also shows the daisy chain wire connecting banks of LEDs in series

Other Build Tips

  • It’s good practice to soft mount flight controllers to minimize transferring motor/prop vibrations into the IMU. I used these to isolate the flight controller from the frame — not perfect, but better than a rigid mount.
  • The receiver antenna must be mounted clear of the carbon fiber frame and electronics. I like to use a heavy duty zip tie and attach the antenna with heat shrink.
  • The Bolt can be powered from a 5V regulator on your PDB, but if you want to take advantage of the VBat sensor it should be powered from the raw battery leads instead. However, most ESCs support active braking (the ability to slow/stop the propellers on demand). Active braking is known to produce a lot of back-voltage, which can damage some circuits. To be safe, since I’m using a 3S battery (12.6V when fully charged, 11.1V when depleted), I chose to power the Bolt off a regulated 12V supply from my PDB. This way, the PDB’s regulator will filter out voltage spikes and help protect the Bolt. Readings won’t be accurate at the higher range, but what really matters for a voltage sensor is knowing when to land (a sketch of a simple battery watcher follows after this list).
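For reference, a minimal battery watcher along those lines could look like this (‘pm.vbat’ is the stock firmware’s battery-voltage log variable; the 3S landing threshold is my own choice, not an official recommendation):

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.log import LogConfig

LAND_BELOW_V = 10.5  # conservative 3S threshold, pick your own

def on_battery_data(timestamp, data, logconf):
    vbat = data['pm.vbat']
    warning = '  <-- time to land!' if vbat < LAND_BELOW_V else ''
    print('vbat: %.2f V%s' % (vbat, warning))

cflib.crtp.init_drivers()
with SyncCrazyflie('radio://0/80/2M/E7E7E7E7E7', cf=Crazyflie()) as scf:
    conf = LogConfig(name='battery', period_in_ms=500)
    conf.add_variable('pm.vbat', 'float')
    scf.cf.log.add_config(conf)
    conf.data_received_cb.add_callback(on_battery_data)
    conf.start()
    time.sleep(30)   # watch for half a minute, then stop
    conf.stop()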

Results

It works! There is still some work needed to improve flight, though:

  • Control tuning is required. The powerful brushless motors respond much more quickly than brushed motors, so many of the PID and/or Kalman parameters are too aggressive or simply non-optimal.
  • Stabilization with the Flow deck does not work. I haven’t spent much time debugging, but my guess is it’s either due to the Kalman tuning, or to problems with the VL53L1x ranging outdoors (which also impacts the flow measurements).
  • Betaflight Support: Betaflight has no driver for the BMI088 IMU used on the Crazyflie Bolt or the Crazyflie 2.1.
  • Safety Features: Brushless quads are very dangerous and can cause serious injuries. It’d be good to implement a kill-switch and a more aggressive failsafe in the firmware to prevent flyaways.

All in all, this was an enjoyable project and I’m excited to see some autonomous brushless quads coming out of the Crazyflie community!

We are currently finishing the production test design for a couple of expansion decks and realized we have never written about it, or about the more general board production process. In this blog post we want to talk a bit about how we test boards in the production phase, taking the forthcoming Active Marker deck as an example.

The active marker deck

When finalizing an electronic board, we send the manufacturer documentation that allows them to manufacture and assemble a, hopefully, functional board. Although we assume that the individual components are in working order, the assembly is not always perfect, so we need to check that everything we have done is actually working. This is what the production test solves.

The first thing is to find out what to test, and for that we need a strategy. The strategy we have been using is to test everything we have modified or worked on: for example, we will test all the connections we have soldered in the manufacturing process. We will normally not test all the functionality of a ready-made module. Following this strategy, we will usually test every communication interface we have cabled, but we will not test all the functionality of a microcontroller we solder onto the board, since that is deemed to be already tested and working by the microcontroller manufacturer. This step usually ends up with an annotated schematic:

Annotated schematic of the ActiveMarker Deck

Once we know what to test and roughly how to test it, we document a test rig that will be able to run the tests automatically. Some tests are generic and applicable to all our boards; for example, we test voltages with a multimeter on every board that has a regulator. Some tests are very board-specific. For the Active Marker deck, for example, we want to test the IR LEDs and the IR detector, so we define a test rig that has a reflector to reflect the LED light back to the detector, and we use the onboard detector to test the LEDs:

Simple block diagram of the test rig for the ActiveMarker Deck

We normally use a Crazyflie in all our test rigs, since it is usually possible to test all functionality from the deck port. We also try, as much as possible, to integrate the test software into the real software. For the Active Marker deck this meant adding a 38 kHz modulated output mode for the LEDs, in order to emit a signal detectable by the detector, which will make it into the final firmware. Finally, we have test software, running on the test computer, that uses the Crazyflie python lib to talk to the Crazyflie and run the tests. The last step of all the tests is to write the deck’s One Wire identification memory so that it can be detected by a Crazyflie.

Screenshot of the test program for the test engineer
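To give an idea of the shape of such a test script (the parameter and log names below, ‘activeMarker.mode’ and ‘activeMarker.irDetected’, are made up for illustration; the real test uses whatever the deck driver actually exposes):

import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger
from cflib.crazyflie.log import LogConfig

def test_ir_leds(uri='radio://0/80/2M/E7E7E7E7E7'):
    cflib.crtp.init_drivers()
    with SyncCrazyflie(uri, cf=Crazyflie()) as scf:
        # Switch the LEDs to the 38 kHz modulated test mode (hypothetical value).
        scf.cf.param.set_value('activeMarker.mode', 2)
        time.sleep(0.5)
        # Check that the onboard detector sees the reflected signal.
        conf = LogConfig(name='irtest', period_in_ms=100)
        conf.add_variable('activeMarker.irDetected', 'uint8_t')
        with SyncLogger(scf, conf) as logger:
            for _, data, _ in logger:
                return data['activeMarker.irDetected'] == 1

print('PASS' if test_ir_leds() else 'FAIL')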

From these specifications, the manufacturer can then build a test rig and start testing boards; non-passing boards will be re-worked until they pass, or discarded.

Test rig for the Multi-ranger expansion deck

What we have learned over our years at Bitcraze is that the testing phase is the most important part of the development process of a PCB. Therefore, the earlier we start thinking about the production tests in the board design, the smoother the final phase of production of our new products will be.

After a couple of delays we are happy to announce that the Crazyflie Bolt is now stocked and ready to ship. For those of you that are new to the Bolt, it is basically a Crazyflie 2.1 control board, but built to fit a bigger package. We have blogged about it a couple of times before, so if you would like to catch up you can start from the first idea, to maturing, and finally the name change from RZR to Bolt. Another way to describe the Bolt is: a Crazyflie 2.1 + BigQuad deck in one, which doesn’t hog any deck expansion pins. Thus combinations such as Bolt + LED-ring + Lighthouse-4, or e.g. Bolt + Flow v2 + LPS, are now possible.

Keep in mind that the Bolt is an early access product, so you will most likely have to dig into the code to hard-code PID tuning parameters etc. Also, a warning finger raised: heavier drones can be very dangerous, so be sure to stay safe!

The Crazyflie Bolt is delivered as a stand-alone control board. Frame, motors, propellers and battery need to be added; for details check out the wiki. Unfortunately we don’t have a good reference kit to recommend at the moment. If you happen to have built a good one, please share.

As pointed out in Daniele’s blog post about PULP-DroNet, we are collaborating on an AI-deck built around the new GAP8 RISC-V multi-core MCU. In that blog post you can find all the details around DroNet, while here we will talk a bit about the AI-deck hardware. The AI-deck is similar to the PULP-Shield but with some optimizations: one of the HyperFlash memory spots has been removed, the communication interface has been slimmed down, and an ESP32 (NINA module) has been added for WiFi connectivity.

Latest AI-deck prototype

So all together this is a pretty good platform for developing low-power AI on the edge for a drone.

Features:

  • GAP8 – Ultra low power 9 core RISC-V MCU
  • Himax HM01B0 – Ultra low power 320×320 greyscale camera.
  • 512 Mbit HyperFlash and 64 Mbit HyperRAM
  • ESP32 for WiFi and more (NINA-W102)
  • 2 x JTAG for GAP8 and ESP32

Currently we are doing the final testing of the hardware, and hopefully we will launch production at the end of October. If production goes according to plan, we hope to offer it as an early access product just before X-mas. Make sure to come back and check the blog for more information about the progress as well as pricing.