  • Stars: 936 (rank 48,823, top 1.0%)
  • Language: Swift
  • License: MIT License
  • Created over 7 years ago; updated almost 5 years ago

Repository Details

Tiny YOLO for iOS implemented using CoreML but also using the new MPS graph API.

YOLO with Core ML and MPSNNGraph

This is the source code for my blog post YOLO: Core ML versus MPSNNGraph.

YOLO is an object detection network. It detects multiple objects in an image and draws bounding boxes around them. Read my other blog post about YOLO to learn more about how it works.

YOLO in action

Previously, I implemented YOLO in Metal using the Forge library. Since then Apple released Core ML and MPSNNGraph as part of the iOS 11 beta. So I figured, why not try to get YOLO running on these two other technology stacks too?

In this repo you'll find:

  • TinyYOLO-CoreML: A demo app that runs the Tiny YOLO neural network on Core ML.
  • TinyYOLO-NNGraph: The same demo app but this time it uses the lower-level graph API from Metal Performance Shaders.
  • Convert: The scripts needed to convert the original DarkNet YOLO model to Core ML and MPS format.

To run the app, just open the xcodeproj file in Xcode 9 or later, and run it on a device with iOS 11 or later installed.

The reported "elapsed" time is how long it takes the YOLO neural net to process a single image. The FPS is the actual throughput achieved by the app.

NOTE: Running these kinds of neural networks eats up a lot of battery power. To measure the maximum speed of the model, the setUpCamera() method in ViewController.swift configures the camera to run at 240 FPS, if available. In a real app, you'd use at most 30 FPS and possibly limit the number of times per second the neural net runs to 15 or fewer (i.e. only process every other frame).
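The frame-skipping idea from the note above can be sketched as follows. This is a hypothetical helper written in Python for illustration, not code from this repo (the app itself handles this in Swift):

```python
class FrameThrottler:
    """Decides which camera frames get sent to the neural net.

    With a 30 FPS camera and max_inferences_per_second=15, this
    lets through every other frame.
    """
    def __init__(self, camera_fps, max_inferences_per_second):
        # Run the net once every `stride` frames (at least every frame).
        self.stride = max(1, camera_fps // max_inferences_per_second)
        self.frame_count = 0

    def should_process(self):
        process = (self.frame_count % self.stride == 0)
        self.frame_count += 1
        return process

throttler = FrameThrottler(camera_fps=30, max_inferences_per_second=15)
processed = sum(throttler.should_process() for _ in range(30))
print(processed)  # 15: half of the 30 frames reach the neural net
```

The same counter-and-modulo logic drops straight into a captureOutput(_:didOutput:from:) delegate method on iOS.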

Tip: Also check out this repo for YOLO v3. It works the same as this repo, but uses the full version of YOLO v3!

iOS 12 and VNRecognizedObjectObservation

The code in my blog post and this repo shows how to take the MLMultiArray output from TinyYOLO and interpret it in your app. That was the only way to do it with iOS 11, but as of iOS 12 there is an easier solution.
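To give an idea of what that interpretation involves: Tiny YOLO v2 for VOC outputs a 13×13 grid where each cell predicts 5 anchor boxes, each with 4 coordinates, a confidence score, and 20 class scores (125 channels per cell). Below is a NumPy sketch of the coordinate decoding for one cell, using the anchor sizes from the tiny-yolo-voc config; it's illustrative and not the app's actual Swift code:

```python
import numpy as np

# Anchor box sizes from tiny-yolo-voc.cfg, in grid-cell units.
ANCHORS = [(1.08, 1.19), (3.42, 4.41), (6.63, 11.38),
           (9.42, 5.11), (16.62, 10.52)]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_box(raw, cell_x, cell_y, anchor_index, cell_size=32):
    """Turn one raw 5-vector (tx, ty, tw, th, tc) into a box in pixels.

    The 416x416 input image is divided into a 13x13 grid, so each
    cell covers 32 pixels.
    """
    tx, ty, tw, th, tc = raw
    aw, ah = ANCHORS[anchor_index]
    x = (cell_x + sigmoid(tx)) * cell_size   # box center x, pixels
    y = (cell_y + sigmoid(ty)) * cell_size   # box center y, pixels
    w = np.exp(tw) * aw * cell_size          # box width, pixels
    h = np.exp(th) * ah * cell_size          # box height, pixels
    confidence = sigmoid(tc)
    return x, y, w, h, confidence

# A raw prediction of all zeros decodes to a box centered in its cell,
# with exactly the anchor's width and height, and confidence 0.5.
x, y, w, h, c = decode_box(np.zeros(5), cell_x=6, cell_y=6, anchor_index=0)
print(x, y, w, h, c)  # 208.0 208.0 34.56 38.08 0.5
```

The app does the same math in Swift over all 13×13×5 predictions, then runs non-maximum suppression on the results.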

The Vision framework in iOS 12 directly supports YOLO-like models. The big advantage is that these do the bounding box decoding and non-maximum suppression (NMS) inside the Core ML model. All you need to do is pass in the image and Vision will give you the results as one or more VNRecognizedObjectObservation objects. No more messing around with MLMultiArrays.
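Non-maximum suppression itself is a simple greedy loop: keep the highest-scoring box, discard every remaining box that overlaps it too much, repeat. A minimal sketch of the idea (illustrative only; on iOS 12 the Core ML pipeline model does this for you):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(boxes, scores, threshold=0.5):
    """Return indices of boxes to keep, best score first."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # Keep this box only if it doesn't overlap an already-kept box.
        if all(iou(boxes[i], boxes[j]) <= threshold for j in keep):
            keep.append(i)
    return keep

# Two heavily overlapping detections plus one separate detection:
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(non_max_suppression(boxes, scores))  # [0, 2]
```

The first two boxes overlap with IoU ≈ 0.68, so only the higher-scoring one survives; the third box doesn't overlap anything and is kept.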

It's also really easy to train such models using Turi Create. It combines TinyYOLO v2 and the new NonMaximumSuppression model type into a so-called pipeline model.

The good news is that this new Vision API also supports other object detection models!

I added a chapter to my book Core ML Survival Guide that shows exactly how this works. In the book you’ll see how to add this same functionality to MobileNetV2 + SSDLite, so that you get VNRecognizedObjectObservation predictions for that model too. The book has lots of other great tips on using Core ML, so check it out! 😄

If you're not ready to go all-in on iOS 12 yet, then read on...

Converting the models

NOTE: You don't need to convert the models yourself. Everything you need to run the demo apps is included in the Xcode projects already.

If you're interested in how the conversion was done, there are three conversion scripts:

YAD2K

The original network is in Darknet format. I used YAD2K to convert this to Keras. Since coremltools currently requires Keras 1.2.2, the included YAD2K source code is actually a modified version that runs on Keras 1.2.2 instead of 2.0.

First, set up a virtualenv with Python 3:

virtualenv -p /usr/local/bin/python3 yad2kenv
source yad2kenv/bin/activate
pip3 install tensorflow
pip3 install keras==1.2.2
pip3 install h5py
pip3 install pydot-ng
pip3 install pillow
brew install graphviz

Run the yad2k.py script to convert the Darknet model to Keras:

cd Convert/yad2k
python3 yad2k.py -p ../tiny-yolo-voc.cfg ../tiny-yolo-voc.weights model_data/tiny-yolo-voc.h5

To test the model actually works:

python3 test_yolo.py model_data/tiny-yolo-voc.h5 -a model_data/tiny-yolo-voc_anchors.txt -c model_data/pascal_classes.txt 

This places some images with the computed bounding boxes in the yad2k/images/out folder.

coreml.py

The coreml.py script takes the tiny-yolo-voc.h5 model created by YAD2K and converts it to TinyYOLO.mlmodel. Note: this script requires Python 2.7 from /usr/bin/python (i.e. the one that comes with macOS).

To set up the virtual environment:

virtualenv -p /usr/bin/python2.7 coreml
source coreml/bin/activate
pip install tensorflow
pip install keras==1.2.2
pip install h5py
pip install coremltools

Run the coreml.py script to do the conversion (the paths to the model file and the output folder are hardcoded in the script):

python coreml.py

nngraph.py

The nngraph.py script takes the tiny-yolo-voc.h5 model created by YAD2K and converts it to weights files used by MPSNNGraph. Requires Python 3 and Keras 1.2.2.
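MPSNNGraph has no model file format of its own: each convolution layer is handed raw weight and bias buffers at runtime, so the conversion boils down to dumping every layer's float32 arrays to flat binary files the app can load. A hedged NumPy sketch of that idea follows; the actual file names and layout used by nngraph.py may differ:

```python
import numpy as np
import os
import tempfile

def export_layer(weights, bias, name, out_dir):
    """Write one conv layer's parameters as raw float32 binary files."""
    weights.astype(np.float32).tofile(os.path.join(out_dir, name + "_W.bin"))
    bias.astype(np.float32).tofile(os.path.join(out_dir, name + "_b.bin"))

out_dir = tempfile.mkdtemp()

# A made-up 3x3 conv layer: 16 filters over 3 input channels.
weights = np.random.randn(16, 3, 3, 3)
bias = np.zeros(16)
export_layer(weights, bias, "conv1", out_dir)

# Each float32 value occupies 4 bytes in the file.
size = os.path.getsize(os.path.join(out_dir, "conv1_W.bin"))
print(size)  # 16*3*3*3 * 4 = 1728 bytes
```

On the iOS side, the app reads these files back into buffers and passes them to the MPS convolution layers when the graph is built.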

More Repositories

1. neural-engine (2,049 stars): Everything we actually know about the Apple Neural Engine (ANE)
2. CoreMLHelpers (Swift, 1,373 stars): Types and functions that make it a little easier to work with Core ML in Swift.
3. Forge (Swift, 1,270 stars): A neural network toolkit for Metal
4. MobileNet-CoreML (Swift, 705 stars): The MobileNet neural network using Apple's new CoreML framework
5. MHTabBarController (Objective-C, 488 stars): A custom tab bar controller for iOS 5
6. TensorFlow-iOS-Example (Swift, 441 stars): Source code for my blog post "Getting started with TensorFlow on iOS"
7. BlazeFace-PyTorch (Jupyter Notebook, 427 stars): The BlazeFace face detector model implemented in PyTorch
8. coreml-survival-guide (Python, 246 stars): Source code for the book Core ML Survival Guide
9. MHRotaryKnob (Objective-C, 196 stars): UIControl for iOS that acts like a rotary knob
10. VGGNet-Metal (Swift, 182 stars): iPhone version of the VGGNet convolutional neural network for image recognition
11. Swift-3D-Demo (Swift, 180 stars): Shows how to draw a 3D object without using shaders
12. synth-plugin-book (C++, 172 stars): Source code for the book Code Your Own Synth Plug-Ins With C++ and JUCE
13. MHLazyTableImages (Objective-C, 157 stars): This project is now deprecated.
14. SoundBankPlayer (Objective-C, 156 stars): Sample-based audio player for iOS that uses OpenAL.
15. reliability-diagrams (Jupyter Notebook, 136 stars): Reliability diagrams visualize whether a classifier model needs calibration
16. MHPagingScrollView (Objective-C, 132 stars): A UIScrollView subclass that shows previews of the pages on the left and right.
17. mda-plugins-juce (C, 124 stars): JUCE implementations of the classic MDA audio plug-ins
18. TheKissOfShame (C++, 103 stars): DSP Magnetic Tape Emulation
19. metal-gpgpu (101 stars): Collection of notes on how to use Apple's Metal API for compute tasks
20. coreml-training (Jupyter Notebook, 99 stars): Source code for my blog post series "On-device training with Core ML"
21. Inception-CoreML (Swift, 97 stars): Running Inception-v3 on Core ML
22. Matrix (Swift, 94 stars): A fast matrix type for Swift
23. AudioBufferPlayer (Objective-C, 85 stars): Class for doing simple iOS sound synthesis using Audio Queues.
24. MHNibTableViewCell (Objective-C, 79 stars): This code is now deprecated.
25. CoreML-Custom-Layers (Swift, 72 stars): Source code for the blog post "Custom Layers in Core ML"
26. InsideCoreML (Python, 64 stars): Python script to examine Core ML's mlmodel files
27. BNNS-vs-MPSCNN (Swift, 61 stars): Compares the speed of Apple's two deep learning frameworks: BNNS and Metal Performance Shaders
28. TransparentJPEG (Objective-C, 59 stars): Allows you to combine a JPEG with a second image to give it transparency.
29. TinyML-HelloWorld-ArduinoUno (Jupyter Notebook, 48 stars): The TinyML "Hello World" sine wave model on Arduino Uno v3
30. synth-recipes (C++, 47 stars): Code snippets of sound synthesis algorithms in C++
31. WashedOut (42 stars): Color theme for Xcode 8 based on the colors from the WWDC 2016 slides
32. RNN-Drummer-Swift (Python, 42 stars): Using a recurrent neural network to teach the iPhone to play drums
33. BuildYourOwnLispInSwift (Swift, 36 stars): A simple LISP interpreter written in Swift
34. SemanticSegmentationMetalDemo (Swift, 33 stars): Drawing semantic segmentation masks with Metal
35. krunch (C++, 32 stars): Lowpass filter + saturation audio effect plug-in
36. MHSemiModal (Objective-C, 31 stars): Category on UIViewController that makes it easy to present modal view controllers that only partially cover the screen.
37. Deepfish (Swift, 24 stars): Live visualization of convolutional neural network using the iPhone's camera
38. MPS-Matrix-Multiplication (Swift, 24 stars): Playing with the Metal Performance Shaders matrix multiplication kernel
39. fft-juce (C++, 23 stars): Example code for my blog post FFT Processing in JUCE
40. MHPopoverManager (Objective-C, 23 stars): A simple class for managing the lifecycle of your UIPopoverControllers
41. sefr-swift (Swift, 21 stars): The SEFR classifier implemented in Swift
42. Railroad-Diagrams-Swift (Swift, 19 stars): Library for making railroad diagrams in Swift
43. AVBufferPlayer (Objective-C, 17 stars): Shows how to use AVAudioPlayer to play a buffer of waveform data that you give it.
44. GalaxyApocalypse (Objective-C++, 15 stars): My January 2013 game for #OneGameADay (iPhone). The galaxy is falling apart and it's your job to move all the planets back to where they belong. Lots of swiping involved.
45. airwindows-juce (C++, 12 stars): JUCE versions of selected Airwindows plug-ins
46. MHTintHelper (Objective-C, 12 stars): Tool that quickly lets you pick tint colors for navigation bars etc.
47. levels (C++, 11 stars): Basic digital level meter plug-in.
48. MHDatabase (Objective-C, 11 stars): A simple Objective-C wrapper around the sqlite3 functions.
49. Ignition (Python, 10 stars): PyTorch helper code
50. Logistic-Regression-Swift (Swift, 9 stars): A basic example of how to implement logistic regression in Swift
51. ShrinkPng (Objective-C, 9 stars): Simple tool for shrinking images 50% by averaging the color (and alpha) of each 2x2 pixel block.
52. MHOverlayWindow (Objective-C, 7 stars): A simple example of how to make a UIWindow that appears on top of everything else, including the status bar.
53. pumpkin (Swift, 7 stars): Everything must bounce!
54. MHOverride (Objective-C, 7 stars): Category on NSObject that lets you override methods on existing objects using blocks, without having to make a subclass.
55. ThreeBandEQ (C++, 5 stars): Simple bass/mids/treble equalizer plugin written in JUCE
56. bombaz (C++, 5 stars): Simple bass synth VSTi based on window function synthesis
57. RWDevCon-App-Architecture (Swift, 3 stars): Source code for my RWDevCon talk on app architecture.
58. MHMetaColors (Objective-C, 3 stars): Category that allows you to write, for example, [UIColor xFF3399] to make a new UIColor object with values #FF3399.
59. rubberneck (C++, 3 stars): Handy utility for monitoring levels and protecting ears while developing plug-ins
60. RWDevCon-Swift-Closures-Generics (Swift, 2 stars): Source code for my RWDevCon talk on Swift closures and generics.
61. hollance (1 star)
62. hollance.github.io (CSS, 1 star)