Motion Recognition with Edge Impulse


Introduction

The deep learning workflow involves the following tasks:

  1. Set a goal
  2. Collect a dataset
  3. Design a model architecture
  4. Train the model
  5. Convert the model to be used on the device
  6. Run inference
  7. Evaluate and troubleshoot

Let’s see how the Edge Impulse platform can help you go through all those steps without having to write too much code.

Edge Impulse

Edge Impulse is a powerful development platform for machine learning on edge devices. It supports collecting many types of data (audio, video and countless sensors), designing impulses with pre-built data processing and ML blocks, and testing and deploying on a huge ecosystem of hardware from market leaders like Arduino, Raspberry Pi, Nvidia and many others.

First of all, log in or sign up to the Edge Impulse portal. Then navigate to the Select Project page and click the green +Create new project button.

I suggest a project name similar to motion-detection-arduino-rp-2040.

RP2040 microcontroller

Raspberry Pi’s RP2040 microcontroller unit is fully supported by the Edge Impulse platform. It can be programmed through the C/C++ SDK or using MicroPython, the Python flavour for microcontrollers.

The best-known boards powered by this microcontroller are the Raspberry Pi Pico and the Arduino Nano RP2040 Connect, which will be used as the deployment target for this project.

Supported devices

If you don’t own a board equipped with the RP2040 chip, don’t worry: you can use any device equipped with an accelerometer (even your computer or your smartphone).

The most common supported devices for this project are:

Go to the Devices tab and connect at least one device from the list above. The connected device will be referred to as the input device from now on in this blog.

In the picture below you can see how I connected an Arduino and… the same Arduino through the data forwarder tool!

Devices tab

Objective

The objective is to use Edge Impulse to design, train, test and deploy a machine learning algorithm that recognizes the type of movement made by the board. The algorithm will run on an edge device: a tiny form factor with limited resources.

Movements can be simple, like left & right or up & down, or more complex, like walking, running or falling. Try to experiment with the strangest motions you can think of!

Data Acquisition

First of all, decide the motions that the system will learn to recognize:

  1. idle (meant as still)
  2. up & down
  3. left & right
  4. forward & downward

Movements Types

Data forwarder

The data forwarder is used to transmit data from any device to Edge Impulse over serial. Devices write sensor values over a serial connection and the data forwarder collects, signs and sends the data to the ingestion service.

The forwarder is useful to enable data collection from a wide range of development boards without having to port the full remote management protocol and the serial protocol.
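The protocol the forwarder relies on is simple: the device prints one sample per line, with axis values separated by commas (or tabs), at a fixed rate so the CLI can detect the sample frequency and number of axes. As a rough sketch in Python (format_sample is a hypothetical helper for illustration, not part of any SDK):

```python
# Illustration of the line format the data forwarder reads over serial:
# one sample per line, axes separated by commas, emitted at a fixed rate.
# format_sample is a hypothetical helper, not part of any SDK.

def format_sample(ax: float, ay: float, az: float) -> str:
    """Format one accelerometer sample as a forwarder-style line."""
    return f"{ax:.2f},{ay:.2f},{az:.2f}"

# A device sampling at 104 Hz would print one such line every ~9.6 ms:
print(format_sample(0.01, -0.98, 0.12))  # 0.01,-0.98,0.12
```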

Arduino_LSM6DSOX integration

The Arduino Nano RP2040 Connect controls its onboard IMU with the Arduino_LSM6DSOX library, which has a sample rate of 104 Hz (100 Hz will be fine too). Keep in mind that, depending on the board and the library, you may need to set a different frequency and baud rate.

Upload the nano-rp2040-data-forwarder.ino sketch to your board, then open a terminal and type:

edge-impulse-data-forwarder --frequency 104 --baud-rate 115200

The result will be similar to the image below.

Data forwarder from terminal

Build the dataset

It’s time to move!

To collect your first data, go to the Data acquisition tab. If you use the data forwarder tool as I do, the Record new data section will be filled automatically.

Change the Label according to the movement you want to record, click the Start sampling button and start moving the board. Collect about 3 to 10 minutes of data for each movement.

Record new data

Rebalance the dataset

As I’ll explain in a more detailed post, the collected data from all classes must be split into Train and Test datasets. The training dataset, which holds 80% of the data, will train the machine learning model.
The test dataset will validate the model with the remaining 20% of the data.
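The subdivision itself is straightforward; a minimal Python sketch of an 80/20 shuffled split (with an illustrative fixed seed for reproducibility) could look like this:

```python
import random

def train_test_split(samples, test_ratio=0.2, seed=42):
    """Shuffle a copy of the samples and split them into train/test subsets."""
    shuffled = samples[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

windows = list(range(100))        # stand-ins for 100 recorded windows
train, test = train_test_split(windows)
print(len(train), len(test))      # 80 20
```

Shuffling before splitting matters: consecutive windows come from the same recording session, so a split without shuffling would leak session-specific patterns unevenly between the two sets.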

You can let Edge Impulse perform this subdivision for you. In the Dashboard tab, scroll down to the Danger zone. Click Perform train / test split, then inside the dialog click the Yes, perform train / test split button.

Perform train / test split

Impulse design

An impulse takes the raw data, slices it up, uses signal processing blocks to extract features and then uses a learning block to classify new data. You’ll need a Spectral analysis signal processing block, which will:

  1. apply a filter
  2. perform spectral analysis on the signal
  3. extract frequency and spectral power data

Then a NN Classifier learning block will take in spectral features and learn to distinguish between the classes.
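To get an intuition for what the Spectral analysis block produces, here is a deliberately simplified Python sketch: it removes the mean (a crude stand-in for the block’s filter) and computes the RMS plus the power of the first few frequency bins with a naive DFT. It illustrates the idea, not Edge Impulse’s actual implementation:

```python
import math

def spectral_features(window, n_bins=3):
    """Toy spectral-analysis block: mean removal (crude high-pass filter),
    then RMS plus spectral power of the first n_bins frequency bins,
    computed with a naive DFT."""
    n = len(window)
    mean = sum(window) / n
    x = [v - mean for v in window]
    rms = math.sqrt(sum(v * v for v in x) / n)
    powers = []
    for k in range(1, n_bins + 1):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(x))
        im = sum(-v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(x))
        powers.append((re * re + im * im) / n)
    return [rms] + powers

# A 1-second window sampled at 104 Hz containing a 2 Hz oscillation:
window = [math.sin(2 * math.pi * 2 * i / 104) for i in range(104)]
features = spectral_features(window)
print(len(features))  # 4 features: RMS + 3 spectral power bins
```

For this synthetic window the power concentrates in the 2 Hz bin, which is exactly the kind of per-axis feature vector the classifier will learn from.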

Go to Impulse design tab and add the following blocks:

  1. Time series data
  2. Spectral analysis
  3. Classification (Keras)

Leave the window size at 1 second, since that’s the length of the samples in the dataset.

Click Save Impulse button on the right.

Impulse Design
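The Time series data block slices the incoming stream into fixed windows; conceptually, with a 1-second window at 104 Hz and an illustrative half-window stride, that amounts to:

```python
def slice_windows(samples, window_size, stride):
    """Slice a continuous sample stream into fixed-length windows."""
    return [samples[i:i + window_size]
            for i in range(0, len(samples) - window_size + 1, stride)]

# 5 seconds of data at 104 Hz, 1-second windows, 0.5-second stride:
stream = list(range(5 * 104))
windows = slice_windows(stream, window_size=104, stride=52)
print(len(windows))  # 9
```

Overlapping windows (a stride shorter than the window) squeeze more training examples out of the same recordings, at the cost of correlated samples.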

Spectral analysis

Click the Generate features tab on the top of the page, then click Generate features button.

Spectral Feature Explorer

NN Classifier

Neural networks are algorithms that emulate the behavior of the human brain. They learn to recognize patterns that appear in the training data.

Click on the NN Classifier sub-tab of the Impulse design tab:

NN Classifier input slice

Training

Click the Start training button. When training is complete, take a look at the Model slice.

NN Classifier Model validation

The Last training performance panel displays the results of validation, performed on the 20% of the training data that was set aside to evaluate how the model is performing. Accuracy refers to the percentage of windows that were correctly classified, while loss is the difference between the actual output and the predicted output.

Confusion matrix shows the balance of correctly versus incorrectly classified windows.
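Conceptually, the confusion matrix just counts (actual, predicted) label pairs, and accuracy is the sum of its diagonal over the total. A small Python illustration with made-up labels and predictions:

```python
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    """Count (actual, predicted) pairs into a labels x labels matrix."""
    counts = Counter(zip(actual, predicted))
    return [[counts[(a, p)] for p in labels] for a in labels]

labels    = ["idle", "up-down", "left-right"]
actual    = ["idle", "idle", "up-down", "up-down", "left-right", "left-right"]
predicted = ["idle", "idle", "up-down", "left-right", "left-right", "left-right"]

matrix = confusion_matrix(actual, predicted, labels)
accuracy = sum(matrix[i][i] for i in range(len(labels))) / len(actual)
print(matrix)    # [[2, 0, 0], [0, 1, 1], [0, 0, 2]]
print(accuracy)  # 0.8333...
```

Off-diagonal cells tell you which pairs of movements the model confuses, which is often more actionable than the accuracy number alone.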

Data explorer shows all the training set classified by the neural network.

On-device performance shows stats about how the model is likely to run on an edge device:

  • Inferencing Time: estimate of how long the model will take to analyze 1 second of data
  • Peak RAM usage: estimation of RAM required to run the model on-device
  • Flash usage: estimation of flash memory required to run the model on-device

NN Classifier On-device performance

Retrain the model

If you make any changes to the dataset, you must retrain the model. In the Retrain model tab, click the Train model button.

Testing

It’s very important to test the model on unseen data: that’s why you performed the split in the Rebalance the dataset section. This ensures the model has not overfit the training data. To run your model against the test set, click the Model testing tab in the left menu. To the right of the Test data title, click the Classify all button.

To analyze a misclassified sample, click the three dots next to a sample and select Show classification, which will take you to the Live classification tab, where you can inspect the sample and compare it to your training data. You can either update the label or move the item to the training set to refine the model.

Model testing output

Live Classification

With your input device connected, click on the Live classification tab. If the Arduino Nano disconnects, open the Arduino IDE’s Serial Monitor for a moment: when IMU values start flowing, close it and open a terminal window.

edge-impulse-data-forwarder

Classify new data

Click Start sampling and move your input device in different directions. The result will show the movement type classified for each second. When finished, if you press on any row of the Detailed result table, it will highlight the corresponding slice of raw data and Spectral feature.

Live classification result
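Per-window classification boils down to picking the highest-scoring class for each 1-second window. A toy Python sketch with made-up scores:

```python
def classify_windows(window_scores, labels):
    """Pick the highest-scoring label for each 1-second window."""
    return [labels[scores.index(max(scores))] for scores in window_scores]

labels = ["idle", "up-down", "left-right", "forward-downward"]
# Hypothetical per-window class scores from the classifier:
scores = [
    [0.90, 0.05, 0.03, 0.02],
    [0.10, 0.80, 0.05, 0.05],
    [0.05, 0.15, 0.75, 0.05],
]
print(classify_windows(scores, labels))  # ['idle', 'up-down', 'left-right']
```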

Deploy the model

Move to Deployment tab, where you can choose the destination target of your model.

You can create a library in C++, Arduino and many other formats. You can also build firmware for Linux boards, many Nordic and Arduino boards, the RP2040 and tons of other boards.

Create Arduino library

Create Arduino library

Click on Arduino library, scroll down the page and click the Build button. Note that you can optionally enable the EON™ Compiler.

The building process will end with the download of an archive named something like ei-motion-detection-arduino.zip. Open the Arduino IDE and click Sketch, Include Library, Add .ZIP Library..., then select the .zip archive.

You can find my own examples in the motion-recognition subdirectory of my arduino-projects GitLab repository.

To compile for Arduino Nano family boards, click Tools, Board, Boards Manager... and install the Arduino Mbed OS Nano Boards package.

Nano RP2040 Connect ROM Bootloader mode

Since the Arduino Nano RP2040 Connect upload procedure relies on Raspberry Pi’s bootloader, the board can appear as a mass storage device to your computer. If your machine is fast enough during a sketch upload, it may notify you that a USB device has been plugged in.

When a sketch is uploaded successfully, the board’s mass storage may still be visible to the OS. When this occurs, you can force ROM bootloader mode, which enables mass storage and allows you to upload UF2 images like CircuitPython, MicroPython, a regular Arduino sketch or an Edge Impulse firmware.

If the board is not detected even when it is connected to your computer, you can solve this through the following steps:

  • Connect jumper wire between GND and REC pins
  • Press Reset button
  • Unplug and plug the USB cable back in
  • Upload the Arduino sketch

A factory reset can be performed by dragging the blink.ino.elf.uf2 file into the mass storage (wait for the mass storage to automatically unmount).

Post Install script

If you can’t manage to get the Arduino Nano RP2040 Connect working, try running this command:

sudo ~/.arduino15/packages/arduino/hardware/mbed_nano/4.0.2/post-install.sh

By the time you read this article, version 4.0.2 may have changed. If the version becomes X.Y.Z, you would need to run:

sudo ~/.arduino15/packages/arduino/hardware/mbed_nano/X.Y.Z/post-install.sh

Build RP2040 firmware

Build RP2040 firmware

Click on Raspberry Pi RP2040, scroll down the page and click Build button.

The building process will end with the download of a .zip archive containing a UF2 file called ei_rp2040_firmware.uf2, which you should extract to a directory of your choice.

The steps are similar to Arduino sketch uploading:

  • Connect jumper wire between GND and REC pins
  • Press Reset button
  • Unplug and plug the USB cable back in
  • Drag and drop ei_rp2040_firmware.uf2 file into the mass storage
  • Wait for the mass storage to automatically unmount

Lastly, open a terminal window and run the Edge Impulse impulse runner:

edge-impulse-run-impulse

Note that you can run this command only if you have installed the Edge Impulse CLI locally on your computer, following one of these tutorials:

Conclusion

Wow, here we are. It was a very long journey!

I can’t wait to learn and experiment with new sensors, boards, impulse blocks and make my ideas come true.

Documentation

Some useful links: