Classification Demo

Image classification on the AI-deck

In this example, a binary classification CNN (object or background) is trained and executed on the AI-deck. The models included in the example are trained on data from a very particular domain (a table in the Bitcraze arena, with a particular Christmas package) with limited data augmentation, which results in poor generalization and low robustness. For good results on your own domain, the example can be retrained on a custom dataset captured with the AI-deck camera and fitted to detect multiple custom classes of your choosing. Training can currently only be done on a native installation; executing on and flashing the AI-deck can be done either with a native installation or with the GAP8 docker.

(Figure: classification demo)

The CNN used is a pre-trained MobileNetV2 (alpha = 0.35, image input size 96x96x3) with a custom classification head and prepended convolutional layers that accept the grayscale/Bayer camera stream of the AI-deck and resize it to MobileNet's expected input shape. The example was tested on an AI-deck v1.1 equipped with a Bayer color camera.
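As a concrete illustration of this structure, the sketch below builds a comparable model in Keras. It is not the exact code from train_classifier.py; the assumed camera resolution, the specific front-end layers, and the two-unit output head are assumptions chosen to match the description above.

# Illustrative Keras sketch of the described architecture (not the exact
# training code). Input resolution and front-end layers are assumptions.
import tensorflow as tf

inputs = tf.keras.Input(shape=(244, 324, 1))                  # assumed camera resolution (height, width), 1 channel
x = tf.keras.layers.Conv2D(3, 3, padding="same")(inputs)      # map the single channel to 3 channels
x = tf.keras.layers.Resizing(96, 96)(x)                       # resize to MobileNetV2's expected input size
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), alpha=0.35,
    include_top=False, weights="imagenet")
base.trainable = False                                         # freeze the pre-trained backbone for fine-tuning
x = base(x)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(2, activation="softmax")(x)    # one unit per class (object, background)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])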

This CNN fine-tuning demo is based on the TensorFlow “retrain classification” demo.

For more information on good deep learning practices, we recommend reviewing Deep Learning by Goodfellow, Bengio and Courville, available online for free.

File structure

root (classification/) contains everything required to (re)train a custom MobileNetV2 and export it to a TensorFlow Lite model. It also contains the project to deploy the generated TensorFlow Lite model on the AI-deck.

model/ contains the generated TensorFlow Lite models, and nntool scripts.

training_data/ should contain all training/validation images. The TensorFlow training pipeline expects the file structure as included, with separate folders for each class in both a train and a validation directory (a loading sketch follows this list).

samples/ should contain sample images used by nntool to quantize the imported models (optional, only required if MODEL_PREQUANTIZED = false in the Makefile). These can be taken from the dataset.
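As a rough sketch of how such a class-per-folder layout maps onto labelled TensorFlow datasets (the actual pipeline in train_classifier.py may differ), the directories can be loaded with image_dataset_from_directory:

# Sketch: load the class-per-folder layout into labelled datasets.
# The image size and batch size below are assumptions.
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "training_data/train", image_size=(244, 324), color_mode="grayscale", batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "training_data/validation", image_size=(244, 324), color_mode="grayscale", batch_size=32)
print(train_ds.class_names)   # one label per sub-folder, i.e. one per class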


Prerequisites

Build Environments

TensorFlow

  • A separate Python environment with the Python requirements installed. We recommend using a virtual environment manager such as Miniconda or Pyenv. Use Python 3.10 and install the requirements with pip: pip install -r requirements.txt

GAP8

There are two approaches for this:

  • Install gap_sdk natively on your machine (>= 4.8.0)
  • Build inside Docker container with gap_sdk >= 4.8.0

We will show the instructions for building the example in the docker container later.


Generate a custom dataset

Collect images from the AI-deck using the WiFi streamer example with the opencv-viewer script (use the --save flag). Place them in the training_data folder according to the instructions inside. The captured data must be split into a train and a validation set by hand (a good starting point is a 75% train / 25% validation split; a small helper sketch follows the folder layout below). The existing classes can be renamed as desired. For more than two classes, increase the number of units in the final (dense) layer of the model.

Put the training and validation images in the following folder structure:

/train/class_1/*.jpeg
/train/class_2/*.jpeg
/validation/class_1/*.jpeg
/validation/class_2/*.jpeg
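The split itself can be scripted; the helper below is a hypothetical sketch (not part of the example), with the source folder name and class layout as assumptions.

# Hypothetical helper to split captured images into the train/validation
# layout above with a 75/25 split. Adjust paths and extensions to your data.
import pathlib, random, shutil

def split_dataset(src_dir, dst_dir, val_fraction=0.25, seed=0):
    random.seed(seed)
    src, dst = pathlib.Path(src_dir), pathlib.Path(dst_dir)
    for class_dir in [d for d in src.iterdir() if d.is_dir()]:
        images = sorted(class_dir.glob("*.jpeg"))
        random.shuffle(images)
        n_val = int(len(images) * val_fraction)
        for split, split_images in (("validation", images[:n_val]),
                                    ("train", images[n_val:])):
            out = dst / split / class_dir.name
            out.mkdir(parents=True, exist_ok=True)
            for img in split_images:
                shutil.copy(img, out / img.name)

split_dataset("captured_images", "training_data", val_fraction=0.25)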

Fine-tune a pre-trained image classification CNN

From GAP8/ai_examples/classification/ run python train_classifier.py [--args].

For possible arguments, review the parse_args() function in main.py.

This automatically generates quantized and non-quantized TensorFlow Lite models.
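For reference, the conversion to the two model variants could look roughly like the sketch below. The actual export code and the output file names in the model/ folder may differ; representative_images stands for a callable yielding lists of sample input arrays (cf. the samples/ folder).

# Sketch of exporting float and int8-quantized TensorFlow Lite models from a
# trained Keras model. File names and representative_images are assumptions.
import tensorflow as tf

def export_tflite(model, representative_images):
    # Non-quantized (float32) model
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    open("model/classification.tflite", "wb").write(converter.convert())

    # Full-integer quantized model, calibrated on sample images
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_images
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.uint8
    converter.inference_output_type = tf.uint8
    open("model/classification_q.tflite", "wb").write(converter.convert())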


Execute the image classification CNN on the AI-deck

From a terminal with the docker container, or from a gap_sdk dev environment, in the examples/ai/classification example folder, execute:

$ docker run --rm -v ${PWD}:/module aideck-with-autotiler tools/build/make-example examples/ai/classification clean model build image

Then, from another terminal (outside of the container), use cfloader to flash the example, provided the AI-deck already has the GAP8 bootloader flashed. Replace [CRAZYFLIE URI] with your Crazyflie URI, e.g. radio://0/40/2M/E7E7E7E703:

cfloader flash examples/ai/classification/BUILD/GAP8_V2/GCC_RISCV_FREERTOS/target.board.devices.flash.img deck-bcAI:gap8-fw -w [CRAZYFLIE URI]

While the example is being flashed, you should see the GAP8 LED blink fast, which indicates the bootloader is running. Once the example itself is running, the LED blinks slowly.

Note: There are still some issues with the classification example and the new CPX framework. Please check the status of the open ticket for this.