
Embedded Machine Learning Courseware


Welcome to the Edge Impulse open courseware for embedded machine learning! This repository houses a collection of slides, reading material, project prompts, and sample questions to get you started creating your own embedded machine learning course. You will also have access to videos that cover much of the material. You are welcome to share these videos with your class, either by showing them in the classroom or by letting students watch them on their own time.

This repository is part of the Edge Impulse University Program. Please see this page for more information on how to join: edgeimpulse.com/university.

How to Use This Repository

Please note that the content in this repository is not intended to be a full semester-long course. Rather, you are encouraged to pull from the modules, rearrange the ordering, make modifications, and use the content as you see fit to integrate it into your own curriculum.

For example, many of the lectures and examples from the TinyML Courseware (given by [3]) go into detail about how TensorFlow Lite works along with advanced topics like quantization. Feel free to skip those sections if you would just like an overview of embedded machine learning and how to use it with Edge Impulse.

In general, content from [3] covers theory and hands-on Python coding with Jupyter Notebooks to demonstrate these concepts. Content from [1] and [2] covers hands-on demonstrations and projects using Edge Impulse to deploy machine learning models to embedded systems.

Content is divided into separate modules. Each module is assumed to be about a week's worth of material, and each section within a module contains about 60 minutes of presentation material. Modules also contain example quiz/test questions, practice problems, and hands-on assignments.

If you would like to see more content than what is available in this repository, please refer to the Harvard TinyMLedu site for additional course material.

License

Unless otherwise noted, slides, sample questions, and project prompts are released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license. You are welcome to use and modify them for educational purposes.

The YouTube videos in this repository are shared via the standard YouTube license. You are allowed to show them to your class or provide links for students (and others) to watch.

Professional Development

Much of the material found in this repository is curated from a collection of online courses with permission from the original creators. You are welcome to take the courses (as professional development) to learn the material in a guided fashion or refer students to the courses for additional learning opportunities.

Prerequisites

Students should be familiar with the following topics to complete the example questions and hands-on assignments:

  • Algebra
    • Solving linear equations
  • Probability and Statistics
    • Expressing probabilities of independent events
    • Normal distributions
    • Mean and median
  • Programming
    • Arduino/C++ programming (conditionals, loops, arrays/buffers, pointers, functions)
    • Python programming (conditionals, loops, arrays, functions, NumPy)

Optional prerequisites: many machine learning concepts can be quite advanced. While these advanced topics are briefly discussed in the slides and videos, they are not required for quiz questions and hands-on projects. If you would like to dig deeper into such concepts in your course, students may need to be familiar with the following:

  • Linear algebra
    • Matrix addition, subtraction, and multiplication
    • Dot product
    • Matrix transposition and inversion
  • Calculus
    • The derivative and chain rule are important for backpropagation (a part of model training)
    • Integrals and summation are used to find the area under a curve (AUC) for some model evaluations
  • Digital signal processing (DSP)
    • Sampling rate
    • Nyquist–Shannon sampling theorem
    • Fourier transform and fast Fourier transform (FFT)
    • Spectrogram
  • Machine learning
    • Logistic regression
    • Neural networks
    • Backpropagation
    • Gradient descent
    • Softmax function
    • K-means clustering
  • Programming
    • C++ programming (objects, callback functions)
    • Microcontrollers (hardware interrupts, direct memory access, double buffering, real-time operating systems)

Feedback and Contributing

If you find errors or have suggestions about how to make this material better, please let us know! You may create an issue describing your feedback or create a pull request if you are familiar with Git.

This repo uses automatic link checking and spell checking. If continuous integration (CI) fails after a push, check the CI output to find the dead links or misspelled words, fix them, and push again to re-trigger CI. If a dead link or misspelled word is a false positive (e.g., a purposely malformed link or a proper noun), you may update .mlc_config.json with links to ignore or .wordlist.txt with words to ignore.
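
For example, assuming the repo uses the standard markdown-link-check configuration format, a hypothetical .mlc_config.json entry to ignore a purposely malformed link could look like the sketch below (.wordlist.txt is simply a list of allowed words, one per line):

```json
{
  "ignorePatterns": [
    { "pattern": "^https://example\\.com/intentionally-broken" }
  ]
}
```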

Required Hardware and Software

Students will need a computer and Internet access to perform machine learning model training and hands-on exercises with the Edge Impulse Studio and Google Colab. Students are encouraged to use the Arduino Tiny Machine Learning kit to practice performing inference on an embedded device.

A Google account is required for Google Colab.

An Edge Impulse account is required for the Edge Impulse Studio.

Students will need to install the latest Arduino IDE.

Preexisting Datasets and Projects

This is a collection of preexisting datasets, Edge Impulse projects, and curation tools to help you get started with your own edge machine learning projects. Note that with a public Edge Impulse project, you can clone the project to your account and/or download its dataset from the Dashboard.

Motion

Sound

Image Classification

Object Detection

Syllabus

Course Material

Module 1: Machine Learning on the Edge

This module provides an overview of machine learning and how it can be used to solve problems. It also introduces the idea of running machine learning algorithms on resource-constrained devices, such as microcontrollers. It covers some of the limitations and ethical concerns of machine learning. Finally, it demonstrates a few examples of Python in Google Colab, which will be used in early modules to showcase how programming is often performed for machine learning with TensorFlow and Keras.
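
For a sense of what those early Colab examples look like, here is a minimal Keras sketch (illustrative only, not taken from the course notebooks) that fits a one-neuron model to the line y = 2x - 1:

```python
# Illustrative sketch: the kind of small TensorFlow/Keras example run in Google Colab.
import numpy as np
import tensorflow as tf

xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float)
ys = np.array([-3.0, -1.0, 1.0, 3.0, 5.0, 7.0], dtype=float)  # y = 2x - 1

model = tf.keras.Sequential([tf.keras.layers.Dense(units=1, input_shape=[1])])
model.compile(optimizer="sgd", loss="mean_squared_error")
model.fit(xs, ys, epochs=500, verbose=0)

print(model.predict(np.array([[10.0]])))  # should be close to 19
```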

Learning Objectives

  1. Describe the differences between artificial intelligence, machine learning, and deep learning
  2. Provide examples of how machine learning can be used to solve problems (that traditional deterministic programming cannot)
  3. Provide examples of how embedded machine learning can be used to solve problems (where other forms of machine learning would be limited or inappropriate)
  4. Describe the limitations of machine learning
  5. Describe the ethical concerns of machine learning
  6. Describe the differences between supervised and unsupervised machine learning

Section 1: Machine Learning on the Edge

Lecture Material
ID Description Links Attribution
1.1.1 What is machine learning video slides [1]
1.1.2 Machine learning on embedded devices video slides [1]
1.1.3 What is tiny machine learning slides [3]
1.1.4 Tinyml case studies doc [3]
1.1.5 How do we enable tinyml slides [3]
Exercises and Problems
ID Description Links Attribution
1.1.7 Example assessment questions doc [3]

Section 2: Limitations and Ethics

Lecture Material
ID Description Links Attribution
1.2.1 Limitations and ethics video slides [1]
1.2.2 What am I building? slides [3]
1.2.3 Who am I building this for? slides [3]
1.2.4 What are the consequences? slides [3]
1.2.5 The limitations of machine learning blog
1.2.6 The future of AI; bias amplification and algorithm determinism blog
Exercises and Problems
ID Description Links Attribution
1.2.7 Example assessment questions doc [3]

Section 3: Getting Started with Colab

Lecture Material
ID Description Links Attribution
1.3.1 Getting Started with Google Colab video
1.3.2 Intro to colab slides [3]
1.3.3 Welcome to Colab! colab
1.3.4 Colab tips doc [3]
1.3.5 Why TensorFlow? video
1.3.6 Sample tensorflow code doc [3]
Exercises and Problems
ID Description Links Attribution
1.3.7 101 exercises for Python fundamentals colab

Module 2: Getting Started with Deep Learning

This module provides an overview of neural networks and how they can be used to make predictions. Simple examples are given in Python (Google Colab) for students to play with and learn from. If you do not wish to explore basic machine learning with Keras in Google Colab, you may skip this module to move on to using the Edge Impulse graphical environment. Note that some later exercises rely on Google Colab for curating data, visualizing neural networks, etc.
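
As a rough sketch of what the Keras exercises in this module build toward (hypothetical data and architecture, not the course's exact notebooks), the following trains a small dense network with a validation split so that overfitting can be spotted by comparing training and validation loss:

```python
# Illustrative sketch: train a small dense network on synthetic data and
# compare training vs. validation loss to look for overfitting.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(42)
x = rng.normal(size=(1000, 2)).astype("float32")
y = (x[:, 0] * x[:, 1] > 0).astype("float32")  # simple nonlinear labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(2,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

history = model.fit(x, y, validation_split=0.2, epochs=30, verbose=0)
print("train loss:", history.history["loss"][-1])
print("val loss:  ", history.history["val_loss"][-1])  # a rising val loss suggests overfitting
```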

Learning Objectives

  1. Provide examples of how machine learning can be used to solve problems (that traditional deterministic programming cannot)
  2. Provide examples of how embedded machine learning can be used to solve problems (where other forms of machine learning would be limited or inappropriate)
  3. Describe challenges associated with running machine learning algorithms on embedded systems
  4. Describe broadly how a mathematical model can be used to generalize trends in data
  5. Explain how the training process results from minimizing a loss function
  6. Describe why datasets should be broken up into training, validation, and test sets
  7. Explain how overfitting occurs and how to identify it
  8. Demonstrate the ability to train a dense neural network using Keras and TensorFlow

Section 1: Machine Learning Paradigm

Lecture Material
ID Description Links Attribution
2.1.1 The machine learning paradigm slides [3]
2.1.2 Finding patterns doc [3]
2.1.3 Thinking about loss slides [3]
2.1.4 Minimizing loss slides [3]
2.1.5 First neural network slides [3]
2.1.6 More neural networks doc [3]
2.1.7 Neural networks in action doc [3]
Exercises and Problems
ID Description Links Attribution
2.1.8 Exploring loss colab [3]
2.1.9 Minimizing loss colab [3]
2.1.10 Linear regression colab [3]
2.1.11 Solution linear regression doc [3]
2.1.12 Example assessment questions doc [3]

Section 2: Building Blocks of Deep Learning

Lecture Material
ID Description Links Attribution
2.2.1 Introduction to neural networks slides [1]
2.2.2 Initialization and learning doc [3]
2.2.3 Understanding neurons in code slides [3]
2.2.4 Neural network in code doc [3]
2.2.5 Introduction to classification slides [3]
2.2.6 Training validation and test data slides [3]
2.2.7 Realities of coding doc [3]
Exercises and Problems
ID Description Links Attribution
2.2.8 Neurons in action colab [3]
2.2.9 Multi layer neural network colab [3]
2.2.10 Dense neural network colab [3]
2.2.11 Challenge: explore neural networks colab [3]
2.2.12 Solution: explore neural networks doc [3]
2.2.13 Example assessment questions doc [3]

Section 3: Embedded Machine Learning Challenges

Lecture Material
ID Description Links Attribution
2.3.1 Challenges for tinyml a slides [3]
2.3.2 Challenges for tinyml b slides [3]
2.3.3 Challenges for tinyml c slides [3]
2.3.4 Challenges for tinyml d slides [3]
Exercises and Problems
ID Description Links Attribution
2.3.5 Example assessment questions doc [3]

Module 3: Machine Learning Workflow

In this module, students will get an understanding of how data is collected and used to train a machine learning model. They will have the opportunity to collect their own dataset, upload it to Edge Impulse, and train a model using the graphical interface. From there, they will learn how to evaluate a model using a confusion matrix to calculate precision, recall, accuracy, and F1 scores.
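
As a quick illustration of the evaluation step (with invented counts, for the example only), these metrics can be computed directly from the confusion matrix:

```python
# Hypothetical 2x2 confusion matrix counts: true/false positives and negatives.
tp, fp, fn, tn = 42, 8, 5, 45

accuracy  = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
f1        = 2 * precision * recall / (precision + recall)

print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} f1={f1:.2f}")
```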

Learning Objectives

  1. Provide examples of how embedded machine learning can be used to solve problems (where other forms of machine learning would be limited or inappropriate)
  2. Describe challenges associated with running machine learning algorithms on embedded systems
  3. Describe why datasets should be broken up into training, validation, and test sets
  4. Explain how overfitting occurs and how to identify it
  5. Describe broadly what happens during machine learning model training
  6. Describe the difference between model training and inference
  7. Describe why test and validation datasets are needed
  8. Evaluate a model's performance by calculating accuracy, precision, recall, and F1 scores
  9. Demonstrate the ability to train a machine learning model with a given dataset and evaluate its performance

Section 1: Machine Learning Workflow

Lecture Material
ID Description Links Attribution
3.1.1 Tinyml applications slides [3]
3.1.2 Role of sensors doc [3]
3.1.3 Machine learning lifecycle slides [3]
3.1.4 Machine learning lifecycle doc [3]
3.1.5 Machine learning workflow doc [3]
Exercises and Problems
ID Description Links Attribution
3.1.6 Example assessment questions doc [3]

Section 2: Data Collection

Lecture Material
ID Description Links Attribution
3.2.1 Introduction to data engineering doc [3]
3.2.2 What is data engineering slides [3]
3.2.3 Using existing datasets slides [3]
3.2.4 Responsible data collection slides [3]
3.2.5 Getting started with edge impulse video slides [1]
3.2.6 Data collection with edge impulse video slides [1]
Exercises and Problems
ID Description Links Attribution
3.2.7 Example assessment questions doc [1]

Section 3: Model Training and Evaluation

Lecture Material
ID Description Links Attribution
3.3.1 Feature extraction from motion data video slides [1]
3.3.2 Feature selection in Edge Impulse video tutorial [1]
3.3.3 Machine learning pipeline video slides [1]
3.3.4 Model training in edge impulse video slides [1]
3.3.5 How to evaluate a model video slides [1]
3.3.6 Underfitting and overfitting video slides [1]
Exercises and Problems
ID Description Links Attribution
3.3.7 Project: Motion detection doc [1]
3.3.8 Example assessment questions doc [1]

Module 4: Model Deployment

This module covers why quantization is important for models running on embedded systems and some of the limitations. It also shows how to use a model for inference and set an appropriate threshold to minimize false positives or false negatives, depending on the system requirements. Finally, it covers the steps to deploy a model trained on Edge Impulse to an Arduino board.
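
For reference, post-training quantization in plain TensorFlow Lite looks roughly like the sketch below (Edge Impulse can perform an equivalent step for you when you deploy from the Studio; the tiny model here is only a stand-in for a trained Keras model):

```python
import tensorflow as tf

# A tiny stand-in model; in practice this would be your trained Keras model.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Post-training quantization with the TensorFlow Lite converter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print("Quantized model size:", len(tflite_model), "bytes")
```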

Learning Objectives

  1. Provide examples of how embedded machine learning can be used to solve problems (where other forms of machine learning would be limited or inappropriate)
  2. Describe challenges associated with running machine learning algorithms on embedded systems
  3. Describe broadly what happens during machine learning model training
  4. Describe the difference between model training and inference
  5. Demonstrate the ability to perform inference on an embedded system to solve a problem

Section 1: Quantization

Lecture Material
ID Description Links Attribution
4.1.1 Why quantization doc [3]
4.1.2 Post-training quantization slides [3]
4.1.3 Quantization-aware training slides [3]
4.1.4 TensorFlow vs TensorFlow Lite slides [3]
4.1.5 TensorFlow computational graph doc [3]
Exercises and Problems
ID Description Links Attribution
4.1.6 Post-training quantization colab [3]
4.1.7 Example assessment questions doc [3]

Section 2: Embedded Microcontrollers

Lecture Material
ID Description Links Attribution
4.2.1 Embedded systems slides [3]
4.2.2 Diversity of embedded systems doc [3]
4.2.3 Embedded computing hardware slides [3]
4.2.4 Embedded microcontrollers doc [3]
4.2.5 TinyML kit peripherals doc [3]
4.2.6 TinyML kit peripherals slides [3]
4.2.7 Arduino core, frameworks, mbedOS, and bare metal doc [3]
4.2.8 Embedded ML software slides [3]
Exercises and Problems
ID Description Links Attribution
4.2.9 Example assessment questions doc [3]

Section 3: Deploying a Model to an Arduino Board

Lecture Material
ID Description Links Attribution
4.3.1 Using a model for inference video slides [1]
4.3.2 Testing inference with a smartphone video [1]
4.3.3 Deploy model to arduino video slides [1]
4.3.4 Deploy model to Arduino tutorial
Exercises and Problems
ID Description Links Attribution
4.3.5 Example assessment questions doc [1]

Module 5: Anomaly Detection

This module describes several approaches to anomaly detection and why we might want to use it in embedded machine learning.
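
One of the simpler approaches covered later in this module is K-means clustering; here is a minimal sketch (synthetic data, illustrative only) that uses distance to the nearest cluster center as an anomaly score:

```python
# Illustrative sketch: flag samples that fall far from every K-means cluster center.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 3))  # "normal" operation data
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(normal)

def anomaly_score(samples):
    # Distance from each sample to its nearest cluster center.
    dists = kmeans.transform(samples)  # shape (n_samples, n_clusters)
    return dists.min(axis=1)

threshold = np.percentile(anomaly_score(normal), 99)
new_sample = np.array([[5.0, 5.0, 5.0]])      # far from the training data
print(anomaly_score(new_sample) > threshold)  # -> [ True ]
```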

Learning Objectives

  1. Provide examples of how embedded machine learning can be used to solve problems (where other forms of machine learning would be limited or inappropriate)
  2. Describe challenges associated with running machine learning algorithms on embedded systems
  3. Describe broadly what happens during machine learning model training
  4. Describe the difference between model training and inference
  5. Demonstrate the ability to perform inference on an embedded system to solve a problem
  6. Describe how anomaly detection can be used to solve problems

Section 1: Introduction to Anomaly Detection

Lecture Material
ID Description Links Attribution
5.1.1 Introduction to anomaly detection doc [3]
5.1.2 What is anomaly detection? slides [3]
5.1.3 Challenges with anomaly detection slides [3]
5.1.4 Industry and TinyML doc [3]
5.1.5 Anomaly detection datasets slides [3]
5.1.6 MIMII dataset paper doc [3]
5.1.7 Real and synthetic data doc [3]
Exercises and Problems
ID Description Links Attribution
5.1.8 Example assessment questions doc [3]

Section 2: K-means Clustering and Autoencoders

Lecture Material
ID Description Links Attribution
5.2.1 K-means clustering slides
5.2.2 Autoencoders slides [3]
5.2.3 Autoencoder model architecture doc [3]
Exercises and Problems
ID Description Links Attribution
5.2.4 K-means clustering for anomaly detection colab [3]
5.2.5 Autoencoders for anomaly detection colab [3]
5.2.6 Challenge autoencoders colab [3]
5.2.7 Solution autoencoders doc [3]
5.2.8 Example assessment questions doc [3]

Section 3: Anomaly Detection in Edge Impulse

Lecture Material
ID Description Links Attribution
5.3.1 Anomaly detection in edge impulse video slides [1]
5.3.2 Industrial embedded machine learning demo video [1]
Exercises and Problems
ID Description Links Attribution
5.3.3 Project: Motion classification and anomaly detection doc [1]

Module 6: Image Classification with Deep Learning

This module introduces the concept of image classification, why it is important in machine learning, and how it can be used to solve problems. Convolution and pooling operations are covered, which form the building blocks for convolutional neural networks (CNNs). Saliency maps and Grad-CAM are offered as two techniques for visualizing the inner workings of CNNs. Data augmentation is introduced as a method for generating new data from existing data to train a more robust model. Finally, transfer learning is shown as a way of reusing pretrained models.
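
As a point of reference for the CNN material, here is a small, hypothetical Keras model (not the course's exact architecture) showing how convolution and pooling layers filter and downsample an image before a dense classifier head:

```python
import tensorflow as tf

# Illustrative CNN: two conv/pool stages followed by a small dense classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, (3, 3), activation="relu", input_shape=(96, 96, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(16, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # e.g. 3 image classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```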

Learning Objectives

  1. Describe the differences between image classification, object detection, and image segmentation
  2. Describe how embedded computer vision can be used to solve problems
  3. Describe how convolution and pooling operations are used to filter and downsample images
  4. Describe how convolutional neural networks differ from dense neural networks and how they can be used to solve computer vision problems

Section 1: Image Classification

Lecture Material
ID Description Links Attribution
6.1.1 What is computer vision? video slides [2]
6.1.2 Overview of digital images video slides [2]
6.1.3 Dataset collection video slides [2]
6.1.4 Overview of image classification video slides [2]
6.1.5 Training an image classifier with Keras video [2]
Exercises and Problems
ID Description Links Attribution
6.1.6 Example assessment questions doc [2]

Section 2: Convolutional Neural Network (CNN)

Lecture Material
ID Description Links Attribution
6.2.1 Image convolution video slides [2]
6.2.2 Pooling layer video slides [2]
6.2.3 Convolutional neural network video slides [2]
6.2.4 CNN in keras slides [3]
6.2.5 Mapping features to labels doc [3]
6.2.6 Training a CNN in Edge Impulse video doc [2]
Exercises and Problems
ID Description Links Attribution
6.2.7 Exploring convolutions colab [3]
6.2.8 Convolutional neural networks colab [3]
6.2.9 Challenge: CNN colab [3]
6.2.10 Solution: CNN doc [3]
6.2.11 Example assessment questions doc [2]

Section 3: Analyzing CNNs, Data Augmentation, and Transfer Learning

Lecture Material
ID Description Links Attribution
6.3.1 CNN visualizations video slides [2]
6.3.2 Data augmentation video slides [2]
6.3.3 TensorFlow datasets doc [3]
6.3.4 Avoiding overfitting with data augmentation slides [3]
6.3.5 Dropout regularization doc [3]
6.3.6 Exploring loss functions and optimizers doc [3]
6.3.7 Transfer learning and MobileNet video slides [2]
6.3.8 Transfer learning with Edge Impulse video slides [2]
Exercises and Problems
ID Description Links Attribution
6.3.9 Saliency and Grad-CAM colab [2]
6.3.10 Image transforms demo colab [2]
6.3.11 Challenge: image data augmentation colab [2]
6.3.12 Solution: image data augmentation colab [2]
6.3.13 Example assessment questions doc [2]

Module 7: Object Detection and Image Segmentation

In this module, we look at object detection, how it differs from image classification, and the unique set of problems it solves. We also briefly examine image segmentation and discuss constrained object detection. Finally, we look at responsible AI as it relates to computer vision and AI at large.
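
One concept that comes up when scoring object detectors (see section 7.1.2 below) is intersection over union (IoU); a small illustrative helper, written only as an example:

```python
# Illustrative IoU helper: overlap between two axis-aligned bounding boxes.
def iou(box_a, box_b):
    # Boxes are (x_min, y_min, x_max, y_max).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.14
```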

Learning Objectives

  1. Describe the differences between image classification, object detection, and image segmentation
  2. Describe how embedded computer vision can be used to solve problems
  3. Describe how image segmentation can be used to solve problems
  4. Describe how convolution and pooling operations are used to filter and downsample images
  5. Describe how convolutional neural networks differ from dense neural networks and how they can be used to solve computer vision problems
  6. Describe the limitations of machine learning
  7. Describe the ethical concerns of machine learning
  8. Describe the requirements for collecting a good dataset and what factors can create a biased dataset

Section 1: Introduction to Object Detection

Lecture Material
ID Description Links Attribution
7.1.1 Introduction to object detection video slides [2]
7.1.2 Object detection performance metrics video slides [2]
7.1.3 Object detection models video slides [2]
7.1.4 Training an object detection model video slides [2]
7.1.5 Digging deeper into object detection doc [2]
Exercises and Problems
ID Description Links Attribution
7.1.6 Example assessment questions doc [2]

Section 2: Image Segmentation and Constrained Object Detection

Lecture Material
ID Description Links Attribution
7.2.1 Image segmentation video slides [2]
7.2.2 Multi-stage Inference Demo video [2]
7.2.3 Reusing Representations with Mat Kelcey video [2]
7.2.4 tinyML Talks: Constrained Object Detection on Microcontrollers with FOMO video
Exercises and Problems
ID Description Links Attribution
7.2.5 Project: Deploy an object detection model doc [2]
7.2.6 Example assessment questions doc [2]

Section 3: Responsible AI

Lecture Material
ID Description Links Attribution
7.3.1 Dataset collection slides [3]
7.3.2 The many faces of bias doc [3]
7.3.3 Biased datasets slides [3]
7.3.4 Model fairness slides [3]
Exercises and Problems
ID Description Links Attribution
7.3.5 Google what if tool colab [3]
7.3.6 Example assessment questions doc [3]

Module 8: Keyword Spotting

In this module, we create a functioning keyword spotting (also known as "wake word detection") system. To do so, we must introduce several concepts unique to audio digital signal processing and combine them with image classification techniques.
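
To give a flavor of the audio DSP involved, here is a minimal NumPy sketch (illustrative only) that turns one second of audio into a spectrogram, the 2D representation that a CNN-based keyword spotter classifies:

```python
# Illustrative sketch: build a power spectrogram from short, windowed FFTs.
import numpy as np

fs = 16000                                  # sample rate (Hz)
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)         # stand-in for a 1 s recording

win, hop = 512, 256                         # window and hop sizes (samples)
frames = [audio[i:i + win] * np.hanning(win)
          for i in range(0, len(audio) - win, hop)]
spectrogram = np.abs(np.fft.rfft(frames, axis=1)) ** 2

print(spectrogram.shape)                    # (num_frames, win // 2 + 1)
```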

Learning Objectives

  1. Describe how machine learning can be used to classify sounds
  2. Describe how sound classification can be used to solve problems
  3. Describe the major components in a keyword spotting system
  4. Demonstrate the ability to train and deploy a sound classification system

Section 1: Audio Classification

Lecture Material
ID Description Links Attribution
8.1.1 Introduction to audio classification slides [1]
8.1.2 Audio data capture slides [1]
8.1.3 What is keyword spotting slides [3]
8.1.4 Keyword spotting challenges slides [3]
8.1.5 Keyword spotting application architecture doc [3]
8.1.6 Keyword spotting datasets slides [3]
8.1.7 Keyword spotting dataset creation doc [3]
Exercises and Problems
ID Description Links Attribution
8.1.8 Example assessment questions doc [1] [3]

Section 2: Spectrograms and MFCCs

Lecture Material
ID Description Links Attribution
8.2.1 Keyword spotting data collection slides [3]
8.2.2 Spectrograms and mfccs doc [3]
8.2.3 Keyword spotting model slides [3]
8.2.4 Audio feature extraction slides [1]
8.2.5 Review of convolutional neural networks slides [1]
8.2.6 Modifying the neural network slides [1]
Exercises and Problems
ID Description Links Attribution
8.2.7 Spectrograms and mfccs colab [3]
8.2.8 Example assessment questions doc [1] [3]

Section 3: Deploying a Keyword Spotting System

Lecture Material
ID Description Links Attribution
8.3.1 Deploy audio classifier slides [1]
8.3.2 Implementation strategies slides [1]
8.3.3 Metrics for keyword spotting slides [3]
8.3.4 Streaming audio slides [3]
8.3.5 Cascade architectures slides [3]
8.3.6 Keyword spotting in the big picture doc [3]
Exercises and Problems
ID Description Links Attribution
8.3.7 Project: Sound classification doc [1]
8.3.8 Example assessment questions doc [1] [3]

Attribution

[1] Slides and written material for "Introduction to Embedded Machine Learning" by Edge Impulse are licensed under CC BY-NC-SA 4.0

[2] Slides and written material for "Computer Vision with Embedded Machine Learning" by Edge Impulse are licensed under CC BY-NC-SA 4.0

[3] Slides and written material for "TinyML Courseware" by Prof. Vijay Janapa Reddi of Harvard University are licensed under CC BY-NC-SA 4.0

More Repositories

1

example-esp32-cam

Builds and runs an exported image classification impulse on ESP32 Cam
71
star
2

firmware-espressif-esp32

Edge Impulse firmware for the Espressif ESP-EYE (ESP32) development board
C
63
star
3

inferencing-sdk-cpp

Portable C++ library for signal processing and machine learning inferencing
C
60
star
4

balena-cam-tinyml

Image classification on Raspberry Pi powered by Balena and Edge Impulse
JavaScript
49
star
5

firmware-arduino-nano-33-ble-sense

Edge Impulse firmware for the Arduino Nano 33 BLE Sense development board
C
49
star
6

mobile-client

Mobile acquisition and inferencing client
CSS
46
star
7

voice-activated-microbit

Bleep, bloop, I'm a computer that responds to your voice
C
34
star
8

firmware-pi-rp2040

Ingestion & inferencing firmware for the Raspberry Pi Pico (RP2040)
C
33
star
9

expert-projects

Python
32
star
10

linux-sdk-python

Use Edge Impulse for Linux models from Python
Python
32
star
11

firmware-st-b-l475e-iot01a

Edge Impulse firmware for the ST B-L475E-IOT01A development board
C
26
star
12

firmware-nordic-nrf52840dk-nrf5340dk

Edge Impulse firmware for nRF52840 DK and nRF5340 DK
C
24
star
13

processing-blocks

Signal processing blocks
Python
24
star
14

tflite-find-arena-size

Find arena size for TensorFlow Lite models
C
24
star
15

example-standalone-inferencing

Builds and runs an exported impulse locally (C++)
Makefile
22
star
16

ingestion-sdk-c

Portable header-only library written in C99 for data collection on embedded devices.
C
19
star
17

workshop-advantech-jetson-nano

Tutorial for the PERFECTING FACTORY 5.0 WITH EDGE-POWERED AI workshop
Python
19
star
18

firmware-syntiant-tinyml

Edge Impulse firmware for Syntiant TinyML board
C
19
star
19

example-standalone-inferencing-zephyr

CMake
17
star
20

example-portenta-lorawan

Computer vision over LoRaWAN with the Portenta H7
C
17
star
21

firmware-nordic-thingy53

Official Edge Impulse firmware for Nordic Semiconductor Thingy:53
C
17
star
22

example-standalone-inferencing-linux

Builds and runs an exported impulse locally (Linux)
C++
17
star
23

example-SparkFun-MicroMod-nRF52840

C
15
star
24

example-standalone-inferencing-espressif-esp32

Builds and runs an exported impulse locally (ESP IDF)
C++
13
star
25

edge-impulse-cli

Command line interface tools for Edge Impulse
TypeScript
12
star
26

firmware-silabs-thunderboard-sense-2

Edge Impulse firmware for the SiLabs Thunderboard Sense 2
C
12
star
27

esp32-platformio-edge-impulse-standalone-example

Minimal example code for running an Edge Impulse designed neural network on an ESP32 dev kit using platformio
C++
12
star
28

example-lacuna-ls200

Demo bird songs detection application with Lacuna Space's LS200 and Arduino Nano 33 BLE Sense
C++
12
star
29

firmware-nordic-thingy91

Edge Impulse firmware for Nordic Thingy91
C
11
star
30

firmware-seeed-grove-vision-ai

C
9
star
31

firmware-himax-we-i-plus

Ingestion & inferencing firmware for the HiMax WE 1 target
C
9
star
32

workshop-devsummit2021-portenta

ARM DevSummit workshop with Portenta H7
C
9
star
33

example-custom-ml-block-keras

Custom Keras ML block example for Edge Impulse
Python
9
star
34

edge-impulse-linux-cli

Data acquisition and impulse runner CLI tools for Edge Impulse for Linux
TypeScript
8
star
35

firmware-sony-spresense

Edge Impulse firmware for the Sony Spresense development board
C
8
star
36

autopilot

Python
7
star
37

yolov5

YOLOv5 transfer learning model for Edge Impulse
Python
7
star
38

firmware-ti-launchxl

Edge Impulse firmware for TI LAUNCHXL-CC1352P1 development board
C
6
star
39

example-standalone-inferencing-c

Builds and runs an exported impulse locally (C)
Makefile
6
star
40

example-azure-sphere-mt3620

C
6
star
41

example-standalone-inferencing-pico

Builds and runs an exported impulse locally (Raspberry Pi Pico/RP2040)
CMake
6
star
42

example-signal-from-rgb565-frame-buffer

Create an Edge Impulse signal_t struct from your RGB565 frame buffer to run ML on embedded cameras
C
6
star
43

example-sparkfun-ccs811-himax

Example application for Sparkfun-Himax workshop with CCS811 sensor
C
6
star
44

workshop-arduino-tinyml-roshambo

C++
6
star
45

build-deploy

GitHub Action to build and deploy model
TypeScript
5
star
46

firmware-silabs-xg24

C
5
star
47

tool-data-collection-csv

C++
5
star
48

firmware-eta-compute-ecm3532

Edge Impulse firmware for the Eta Compute ECM3532 AI Sensor development board
C
5
star
49

example-standalone-inferencing-mbed

Builds and runs an exported impulse locally (Mbed OS)
C++
5
star
50

example-custom-ml-block-pytorch

Custom PyTorch ML block example for Edge Impulse
Python
4
star
51

example-custom-processing-block-python

Example of a custom processing block in Python
Python
4
star
52

example-inferencing-generic-node

C
4
star
53

sklearn-linear-models

Python
4
star
54

workshop-iotc-arduino-portenta

C
4
star
55

example-standalone-inferencing-spresense

Builds and runs an exported impulse locally (Sony Spresense)
C
4
star
56

example-custom-spectral-dsp-block

Example of a custom DSP block with C++ implementation
Python
4
star
57

python-sdk

Python
4
star
58

demo-shower-timer

Shower timer with Machine Learning and WebAssembly
CSS
4
star
59

example-multicore-inferencing-pico

C++
4
star
60

example-dataforwarder-mbed

Mbed OS example for the Data Forwarder
C
4
star
61

firmware-alif-e7

Edge Impulse machine learning firmware for the Alif E7 target
C
3
star
62

ei-install-scripts

Set up an environment for running the Edge Impulse CLI
PowerShell
3
star
63

ei-spresense-4g-wildlife-camera

A smart wildlife camera with 4G connectivity, GPS, and AI-powered classification using Edge Impulse and the Sony Spresense
C++
3
star
64

bird-data-download

Python
3
star
65

firmware-arduino-nicla-vision

Edge Impulse firmware for the Arduino Nicla Vision development board
C
3
star
66

example-standalone-inferencing-ecm3532

Builds and runs an exported impulse locally (Eta Compute ECM3532)
C
3
star
67

linux-sdk-go

Use Edge Impulse for Linux models from Go
Go
3
star
68

firmware-arduino-nicla-voice

Repository for the Arduino Nicla Voice
C++
3
star
69

example-transform-Dall-E-images

Use the Dall-E API to generate an image dataset
Python
3
star
70

example-linux-with-twilio

Object detection which sends a message over Twilio
CSS
2
star
71

firmware-arduino-portenta-h7

Edge Impulse firmware for the Arduino Portenta H7 development board
C
2
star
72

workshop-privacy-friendly-adverstising-panel

Python
2
star
73

example-standalone-inferencing-ti-launchxl

Builds and runs an exported impulse locally (Texas Instruments LaunchXL)
Makefile
2
star
74

ei-ti-code-composer-examples

Examples and walkthroughs integrating Edge Impulse with TI Code Composer projects
C++
2
star
75

workshop-arduino-qoitech-power-consumption

Monitor the power consumption in an embedded ML solution
C++
2
star
76

mtb-example-edgeimpulse-standalone-inference

Code example for ModusToolbox that enables any PSoC 6 to run Machine Learning
C
2
star
77

example-standalone-inferencing-alif

Builds and runs an exported impulse locally for Alif HW
CMake
2
star
78

firmware-nordic-nrf9160dk

Edge Impulse firmware for the nRF9160 DK
C
2
star
79

edge-impulse-omniverse-ext

Edge Impulse Omniverse Extensions
Python
2
star
80

workshop-people-counter-portenta-lorawan

Workshop for The Things Conference: build a people counter using the Arduino Portenta's camera and send the inference results over LoRaWAN
Python
2
star
81

notebooks

Notebooks using the Edge Impulse libraries
Jupyter Notebook
2
star
82

integration-tests-firmware

Integration tests for the Edge Impulse remote management and serial protocol
TypeScript
2
star
83

workshop-image-data-augmentation

Jupyter Notebook
2
star
84

example-golioth

C++
2
star
85

conveyor-counting-data-synthesis-demo

Repository of files related to the conveyor belt object counting demo which uses a synthetic dataset
Jupyter Notebook
2
star
86

example-standalone-inferencing-silabs

C++
1
star
87

example-custom-ml-block-scikit

Scikit learn block
Python
1
star
88

mtb-example-edgeimpulse-keyword-spotting

Code example for ModusToolbox using the EdgeImpulse firmware for the PSoC 062 BLE Pioneer Kit
C
1
star
89

edgeimpulse-nodered

Integrate ML models trained in Edge Impulse with Node-RED
JavaScript
1
star
90

example-akida-custom-blocks

Edge Impulse custom blocks - giving early access to train and deploy akida spiking NNs
Python
1
star
91

workshop-silabs-xg24-dev-kit

Workshop using SiLabs xG24 Dev Kit
1
star
92

example-ti-msp432-mcu-motion

Example for the TI MSP432™ MCU with Educational BoosterPack MKII for motion recognition with Edge Impulse
C++
1
star
93

edge-detection-processing-block

Processing block for Edge detection in images
Python
1
star
94

edge-impulse-emd-feature-dsp-block

Implements the EMD-based feature extraction algorithm described in the paper "Feature Extraction and Reduction Applied to Sensorless Drive Diagnosis" as an Edge Impulse custom DSP block
C++
1
star
95

mtb-example-edgeimpulse-continuous-motion

Code example for ModusToolbox using the EdgeImpulse firmware for the PSoC 62S2 Pioneer Kit
C
1
star
96

example-transform-text-to-speech

Use Google text-to-speech API to generate new keywords
JavaScript
1
star
97

example-transform-block-mix-noise

Transformation block that mixes custom background noise into audio files
Shell
1
star
98

example-ingestion-jpg

Example on how to upload JPG data to the ingestion service
JavaScript
1
star
99

workshop-pose-estimation

Jupyter Notebook
1
star
100

workshop-fomo-azure-counter

C++
1
star