
Repository Details

Edgeai TIDL Tools and Examples - This repository contains tools and examples developed for the Deep Learning Runtime (DLRT) offering provided by TI's edge AI solutions.

TIDL - TI Deep Learning Product

TIDL is a comprehensive software product for the acceleration of Deep Neural Networks (DNNs) on TI's embedded devices. It supports heterogeneous execution of DNNs across Cortex-A based MPUs, TI's latest-generation C7x DSP, and TI's DNN accelerator (MMA). TIDL is released as part of TI's Software Development Kit (SDK), along with additional computer vision functions and optimized libraries including OpenCV. TIDL is available on a variety of embedded devices from Texas Instruments.

TIDL is a fundamental software component of TI’s Edge AI solution. TI's Edge AI solution simplifies the whole product life cycle of DNN development and deployment by providing a rich set of tools and optimized libraries. DNN based product development requires two main streams of expertise:

  • Data scientists, who can design and train DNNs for targeted applications
  • Embedded system engineers, who can design and develop inference solutions for real-time execution of DNNs on low-power embedded devices

TI's Edge AI solution provides the right set of tools for both of these categories:

  • Edge AI Studio: An integrated development environment for building AI applications for edge processors. It hosts tools like Model Composer, which lets you train, compile, and deploy models with a mouse click, and Model Analyzer, which lets you evaluate and analyze deep learning model performance on TI devices from your browser in minutes
  • Model zoo: A large collection of pre-trained models for data scientists, which, along with TI's Model Selection Tool, enables picking the ideal model for TI's embedded devices
  • Training and quantization tools for popular frameworks, allowing data scientists to make DNNs more suitable for TI devices
  • Edge AI Benchmark: A Python based framework for accuracy and performance benchmarking. Accuracy benchmarks can be run without a development board, but performance benchmarks require one.
  • Edge AI TIDL Tools: The Edge AI TIDL Tools provided in this repository are used for model compilation on X86. The artifacts from the compilation process can then be used for model inference, either on an X86 machine (host emulation mode) or on a development board with a TI SOC. This repository also provides examples that can be run directly on an X86 target or on a development board with a TI SOC. For deployment and execution on the development board, this package must be used.
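As an illustration of the compile-on-X86 step described above, the OSRT session is typically created with a dictionary of TIDL delegate options. Below is a minimal sketch; the key names and values are assumptions recalled from this repository's examples and should be verified against the version you check out.

```python
import os

# Sketch of the option dictionary passed to the TIDL delegate / execution
# provider at compile time on X86. Key names are assumptions; verify them
# against the examples shipped in this repository.
compile_options = {
    "tidl_tools_path": os.environ.get("TIDL_TOOLS_PATH", "./tidl_tools"),
    "artifacts_folder": "./model-artifacts/my_model",  # hypothetical output dir
    "tensor_bits": 8,       # quantization precision used on the C7x/MMA
    "accuracy_level": 1,    # higher levels spend more time on calibration
}

print(sorted(compile_options))
```

The resulting artifacts folder is what gets copied to the development board for inference.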

The figure below illustrates the workflow of DNN development and deployment on TI devices:

TI EdgeAI Work Flow

EdgeAI TIDL Tools

Introduction

TIDL provides multiple deployment options with industry-defined inference engines, as listed below. These inference engines are referred to as Open Source Runtimes (OSRT) in this document.

  • TFLite Runtime: TensorFlow Lite based inference with heterogeneous execution on Cortex-A** + C7x-MMA, using the TFLite Delegate API
  • ONNX Runtime: ONNX Runtime based inference with heterogeneous execution on Cortex-A** + C7x-MMA
  • TVM/Neo-AI Runtime: TVM/Neo-AI-DLR based inference with heterogeneous execution on Cortex-A** + C7x-MMA

** AM68PA has Cortex-A72 as its MPU; refer to the device TRM to know which Cortex-A MPU your device contains.

This heterogeneous execution enables:

  1. Using the OSRT as the top-level inference API for user applications
  2. Offloading subgraphs to C7x/MMA for accelerated execution with TIDL
  3. Running optimized code on the ARM core for layers that are not supported by TIDL
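The split in steps 2 and 3 above can be sketched as a simple partitioning over the model's layers: contiguous runs of TIDL-supported layers form subgraphs offloaded to the C7x/MMA, while the remaining layers run on the ARM core. The operator names and the greedy grouping below are illustrative only, not the actual TIDL partitioner.

```python
# Illustrative sketch of heterogeneous graph partitioning: contiguous runs of
# supported layers are grouped into subgraphs for C7x/MMA offload, while
# unsupported layers fall back to the ARM core. Operator names are hypothetical.

def partition(layers, supported):
    """Group a linear layer list into (target, [layers]) segments."""
    segments = []
    for layer in layers:
        target = "C7x/MMA" if layer in supported else "ARM"
        if segments and segments[-1][0] == target:
            segments[-1][1].append(layer)
        else:
            segments.append((target, [layer]))
    return segments

supported_ops = {"Conv", "Relu", "MaxPool", "Add"}
model = ["Conv", "Relu", "MaxPool", "CustomOp", "Conv", "Add"]

for target, seg in partition(model, supported_ops):
    print(target, seg)
```

In the real runtimes this grouping is done by the delegate/execution provider at session-creation time, not by user code.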

The Edge AI TIDL Tools provided in this repository support model compilation and model inference. The diagram below illustrates the TFLite based workflow as an example; the ONNX Runtime and TVM/Neo-AI Runtime follow a similar workflow.

The table below covers the operations supported by this repository on X86_PC and TI's development board.

Operation           X86_PC  TI SOC  Python API  CPP API
Model Compilation   ✔️               ✔️
Model Inference     ✔️      ✔️       ✔️          ✔️

What IS Supported

  • Benchmarking latency and memory bandwidth of the out-of-box example models (10+)
  • Compiling user/custom models for deployment with TIDL
  • Inference of compiled models on X86_PC or TI SOC using file-based input and output

What IS NOT Supported

  • Camera, display, and DL runtime based end-to-end pipeline development or benchmarking.
  • Benchmarking the accuracy of models using TIDL acceleration with standard datasets, e.g. accuracy benchmarking with the MS COCO dataset for object detection models.

Supported Devices

  • The following table shows the devices supported by this repository
  • Devices with hardware acceleration have a TI DSP and MMA (Matrix Multiply Accelerator) for faster execution.

Device Family (Product)                  Environment Variable  Hardware Acceleration
AM62                                     am62
AM62A                                    am62a                 ✔️
AM67A                                    am67a                 ✔️
AM68PA                                   am68pa                ✔️
AM68A                                    am68a                 ✔️
AM69A                                    am69a                 ✔️
J721E (TDA4VM)                           am68pa                ✔️
J721S2 (TDA4AL, TDA4VL)                  am68a                 ✔️
J722S                                    am67a                 ✔️
J784S4 (TDA4AP, TDA4VP, TDA4AH, TDA4VH)  am69a                 ✔️
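The table above is a straight mapping from device family (or marketed product name) to the SOC environment variable used by the setup scripts. A minimal lookup helper, with values copied from the table (the helper itself is hypothetical, not part of this repository):

```python
# Device/product name -> SOC environment-variable value, from the support table.
SOC_ENV = {
    "AM62": "am62",
    "AM62A": "am62a",
    "AM67A": "am67a", "J722S": "am67a",
    "AM68PA": "am68pa", "J721E": "am68pa", "TDA4VM": "am68pa",
    "AM68A": "am68a", "J721S2": "am68a", "TDA4AL": "am68a", "TDA4VL": "am68a",
    "AM69A": "am69a", "J784S4": "am69a", "TDA4AP": "am69a", "TDA4VP": "am69a",
    "TDA4AH": "am69a", "TDA4VH": "am69a",
}

# Every value except am62 corresponds to devices with C7x/MMA acceleration.
HW_ACCELERATED = {"am62a", "am67a", "am68pa", "am68a", "am69a"}

def soc_for(device: str) -> str:
    """Return the SOC env value for a device or product name."""
    return SOC_ENV[device.upper()]

print(soc_for("TDA4VM"))  # am68pa
```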

Setup

Note: Please check out the tag compatible with the SDK version you are using on TI's evaluation board before continuing with the steps below. Refer to the SDK Version Compatibility Table for the tag matching your SDK version.

Prerequisites for setup on X86_PC

  • X86_PC mode for this repository is validated with the below configuration:
    OS            Python Version
    Ubuntu 22.04  3.10
  • We recommend a Docker based setup to avoid dependency issues
  • Tools built with GPU acceleration need to be run inside the appropriate Docker container. Refer to the relevant steps to build and run the container

Setup on X86_PC

  • Run the below one-time setup for system-level packages. This needs sudo permission; get it installed by your system administrator if required.
  sudo apt-get install libyaml-cpp-dev
  • Make sure you have all permissions for the current directory before proceeding
  • Run the below commands to install the dependent components on your machine and set all the required environment variables

Note: 'source' in the setup command is important, as the script exports all required environment variables. Without it, users may encounter compilation/runtime issues.

git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
cd edgeai-tidl-tools
git checkout <TAG Compatible with your SDK version>
# Supported SOC name strings am62, am62a, am68a, am68pa, am69a, am67a
export SOC=<Your SOC name>
source ./setup.sh
  • When opening a new terminal on a system where the above setup has already been done once for a given SDK version, set the below environment variables
cd edgeai-tidl-tools
export SOC=<Your SOC name>
export TIDL_TOOLS_PATH=$(pwd)/tidl_tools
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$TIDL_TOOLS_PATH
export ARM64_GCC_PATH=$(pwd)/gcc-arm-9.2-2019.12-x86_64-aarch64-none-linux-gnu
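A quick sanity check for a freshly opened terminal can catch a missing or misspelled variable early. The supported SOC strings come from the setup snippet above; the checker itself is a hypothetical helper, not part of this repository.

```python
import os

# Supported SOC name strings, as listed in the setup instructions.
SUPPORTED_SOCS = {"am62", "am62a", "am68a", "am68pa", "am69a", "am67a"}

def check_env(env):
    """Return a list of problems with the TIDL tools environment variables."""
    problems = []
    soc = env.get("SOC")
    if soc not in SUPPORTED_SOCS:
        problems.append(f"SOC={soc!r} is not one of {sorted(SUPPORTED_SOCS)}")
    tools = env.get("TIDL_TOOLS_PATH")
    if not tools or not os.path.isdir(tools):
        problems.append("TIDL_TOOLS_PATH is unset or not a directory")
    return problems

# Pass os.environ in real use; a literal dict is shown here for illustration.
print(check_env({"SOC": "am62a", "TIDL_TOOLS_PATH": "."}))  # []
```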

Validate and Benchmark out-of-box examples

  • We provide 10+ out-of-box examples for model compilation on X86_PC and inference on X86_PC and TI SOC in the below categories of tasks. Refer to the Model zoo for the complete set of validated models across multiple categories
    • Image classification
    • Object detection
    • Pixel-level semantic segmentation

Compile and Validate on X86_PC

  • Execute the below commands to compile the models and run inference on X86_PC
    • Inference is validated with both Python and CPP APIs
mkdir build && cd build
cmake ../examples && make -j && cd ..
source ./scripts/run_python_examples.sh
python3 ./scripts/gen_test_report.py
  • The above steps generate compiled-model artifacts and output images at ./edgeai-tidl-tools/output_images. These output images can be compared against the expected outputs in ./edgeai-tidl-tools/test_data/refs-pc-{soc} to confirm a successful installation/setup on the PC
model-artifacts/
models/
output_images/
test_report_pc.csv
  • An output image can be found for each model in the 'output_images' folder, similar to what's shown below
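gen_test_report.py writes a CSV summary of the runs; post-processing it takes only the standard csv module. The column names in this sketch are hypothetical — check the header of the generated test_report_pc.csv for the actual fields.

```python
import csv
import io

# Hypothetical report contents; the real test_report_pc.csv header may differ.
sample = """model,status,latency_ms
mobilenet_v1,PASS,12.3
ssd_mobilenet,PASS,25.1
deeplabv3,FAIL,0
"""

def summarize(report_text):
    """Return (total rows, list of models whose status column is PASS)."""
    rows = list(csv.DictReader(io.StringIO(report_text)))
    passed = [r["model"] for r in rows if r["status"] == "PASS"]
    return len(rows), passed

total, passed = summarize(sample)
print(f"{len(passed)}/{total} models passed")  # 2/3 models passed
```

In real use, read the file with `open("test_report_pc.csv")` instead of the inline string.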
(Example output images: Image Classification | Object Detection | Semantic Segmentation)

Benchmark on TI SOC

  • Prepare the development board by following the below steps
 git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
 cd edgeai-tidl-tools
 git checkout <TAG Compatible with your SDK version>
 export SOC=<Your SOC name>
 export TIDL_TOOLS_PATH=$(pwd)
  • Copy the compiled artifacts from the X86_PC to the development board's file system at ./edgeai-tidl-tools/
  • Execute the below commands to run inference on the target development board with both Python and CPP APIs
# scp -r <pc>/edgeai-tidl-tools/model-artifacts/  <dev board>/edgeai-tidl-tools/
# scp -r <pc>/edgeai-tidl-tools/models/  <dev board>/edgeai-tidl-tools/
mkdir build && cd build
cmake ../examples && make -j && cd ..
python3 ./scripts/gen_test_report.py
  • The above steps generate output images at ./edgeai-tidl-tools/output_images. These output images can be compared against the expected outputs in ./edgeai-tidl-tools/test_data/refs-{soc} to confirm a successful installation/setup on the board.
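The comparison against the reference outputs can be scripted with the standard filecmp module. Byte-identical comparison is an assumption of this sketch; in practice small numeric differences between runs may be acceptable, in which case a pixel-tolerance check would be needed instead.

```python
import filecmp
import os
import tempfile

def compare_outputs(out_dir, ref_dir):
    """Report, per reference image, whether the generated output matches byte-for-byte."""
    results = {}
    for name in sorted(os.listdir(ref_dir)):
        out_path = os.path.join(out_dir, name)
        results[name] = os.path.exists(out_path) and filecmp.cmp(
            os.path.join(ref_dir, name), out_path, shallow=False)
    return results

# Tiny self-contained demo using temporary files in place of real images.
with tempfile.TemporaryDirectory() as ref, tempfile.TemporaryDirectory() as out:
    for d in (ref, out):
        with open(os.path.join(d, "cls_out.png"), "wb") as f:
            f.write(b"identical bytes")
    print(compare_outputs(out, ref))  # {'cls_out.png': True}
```

In real use, point `out_dir` at ./edgeai-tidl-tools/output_images and `ref_dir` at the matching test_data/refs-{soc} folder.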

Compile and Benchmark Custom Model

  • New Model Evaluation: Refer to this section if your model falls into one of the supported out-of-box example task categories, such as image classification, object detection, or pixel-level semantic segmentation
  • Custom Model Evaluation: Refer to this section if your custom model doesn't fall into a supported example task category, or if its input and output format differs from the supported list of tasks
  • Reporting issues with model deployment: Refer to the notes here for reporting issues in custom model deployment

User Guide

License

Please see the license under which this repository is made available: LICENSE
