  • Stars: 330
  • Rank: 127,657 (Top 3%)
  • Language: Jupyter Notebook
  • License: MIT License
  • Created: about 6 years ago
  • Updated: 8 months ago

Repository Details

Facial Expression Recognition with a deep neural network as a PyPI package

FER

Facial expression recognition.

[demo image]

[Badges: PyPI version, Build Status, Downloads, Open In Colab, DOI]

INSTALLATION

FER currently supports Python 3.6 and later. It can be installed through pip:

$ pip install fer

This implementation requires OpenCV>=3.2 and TensorFlow>=1.7.0 installed on the system, with Python 3 bindings.

They can be installed through pip (if pip version >= 9.0.1):

$ pip install "tensorflow>=1.7" opencv-contrib-python==3.3.0.9

or compiled directly from sources (OpenCV3, Tensorflow).

Note that tensorflow-gpu can be used instead if a GPU is available on the system, which will speed up inference. It can be installed with pip:

$ pip install "tensorflow-gpu>=1.7.0"

To process videos that include sound, the ffmpeg and moviepy packages must be installed with pip:

$ pip install ffmpeg moviepy 

USAGE

The following example illustrates the ease of use of this package:

from fer import FER
import cv2

# Detect faces and score emotions in a single image
img = cv2.imread("justin.jpg")
detector = FER()
result = detector.detect_emotions(img)

Sample output:

[{'box': [277, 90, 48, 63], 'emotions': {'angry': 0.02, 'disgust': 0.0, 'fear': 0.05, 'happy': 0.16, 'neutral': 0.09, 'sad': 0.27, 'surprise': 0.41}}]

Pretty print it with import pprint; pprint.pprint(result).

Just want the top emotion? Try:

emotion, score = detector.top_emotion(img) # 'happy', 0.99

MTCNN Face Detection

By default, faces are detected using OpenCV's Haar Cascade classifier. To use the more accurate MTCNN network, add the parameter:

detector = FER(mtcnn=True)

Video

For recognizing facial expressions in video, the Video class splits the video into frames. It can use a local Keras model (the default) or the Peltarion API as the backend:

from fer import Video
from fer import FER

video_filename = "tests/woman2.mp4"
video = Video(video_filename)

# Analyze video, displaying the output
detector = FER(mtcnn=True)
raw_data = video.analyze(detector, display=True)
df = video.to_pandas(raw_data)
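
As a quick way to inspect the per-frame scores, the emotion columns of the DataFrame can be plotted over time. This is a minimal sketch assuming to_pandas returns one numeric column per emotion label (the keys listed below); column names may differ between versions, so check df.columns first.

import matplotlib.pyplot as plt

# Assumed column names, matching the emotion keys documented below;
# verify against df.columns for your installed version of fer.
emotion_cols = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

df[emotion_cols].plot(figsize=(12, 6))
plt.xlabel("Frame")
plt.ylabel("Score")
plt.title("Emotion scores over time")
plt.show()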

The detector returns a list of JSON objects. Each JSON object contains two keys, 'box' and 'emotions' (see the sketch after this list):

  • The bounding box is formatted as [x, y, width, height] under the key 'box'.
  • The emotions are formatted into a JSON object with the keys 'angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', and 'neutral'.
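
As a minimal sketch of consuming this structure, the loop below draws each bounding box and its strongest emotion onto the image from the usage example above (result and img are assumed to come from that example).

import cv2

# result: list of dicts with 'box' and 'emotions'; img: the BGR image loaded above
for face in result:
    x, y, w, h = face["box"]
    # Pick the emotion with the highest score for this face
    emotion, score = max(face["emotions"].items(), key=lambda kv: kv[1])
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(img, f"{emotion}: {score:.2f}", (x, y - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

cv2.imwrite("justin_annotated.jpg", img)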

Other good examples of usage can be found in the file demo.py located in the root of this repository.

To run the examples, install the click command-line package with pip install click and run python demo.py [image|video|webcam] --help.

TF-SERVING

FER supports running inference against an online TF Serving Docker image.

To use it, run docker-compose up and initialize FER with FER(..., tfserving=True).
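
A minimal sketch of that flow, assuming the Serving container started by docker-compose is already running and the rest of the API is used exactly as in the image example above:

from fer import FER
import cv2

# tfserving=True routes emotion inference to the running TF Serving
# container instead of the bundled local Keras model.
detector = FER(mtcnn=True, tfserving=True)
result = detector.detect_emotions(cv2.imread("justin.jpg"))
print(result)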

MODEL

FER bundles a Keras model.

The model is a convolutional neural network with weights saved to an HDF5 file in the data folder relative to the module's path. It can be overridden by passing a custom model to the FER() constructor via the emotion_model parameter.
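
For example, a custom model could be supplied at construction time. The path below is hypothetical, and this sketch assumes emotion_model accepts a path to an HDF5 weights file; check the constructor's documentation if it instead expects a loaded Keras model.

from fer import FER

# "models/my_emotion_model.hdf5" is a hypothetical path to custom Keras
# weights; assumed here to be passed as a file path.
detector = FER(emotion_model="models/my_emotion_model.hdf5")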

LICENSE

MIT License.

CREDIT

This code includes methods and package structure copied or derived from Iván de Paz Centeno's implementation of MTCNN and Octavio Arriaga's facial expression recognition repo.

REFERENCE

FER 2013 dataset curated by Pierre Luc Carrier and Aaron Courville, described in:

"Challenges in Representation Learning: A report on three machine learning contests," by Ian J. Goodfellow, Dumitru Erhan, Pierre Luc Carrier, Aaron Courville, Mehdi Mirza, Ben Hamner, Will Cukierski, Yichuan Tang, David Thaler, Dong-Hyun Lee, Yingbo Zhou, Chetan Ramaiah, Fangxiang Feng, Ruifan Li, Xiaojie Wang, Dimitris Athanasakis, John Shawe-Taylor, Maxim Milakov, John Park, Radu Ionescu, Marius Popescu, Cristian Grozea, James Bergstra, Jingjing Xie, Lukasz Romaszko, Bing Xu, Zhang Chuang, and Yoshua Bengio, arXiv:1307.0414.

More Repositories

 1. party-pi: Computer vision emotion 😜 detection game in Flask with TensorFlow backend (JavaScript, 28 stars)
 2. sensei: Posture 👤 Monitor App and Analytics (Jupyter Notebook, 28 stars)
 3. simages: Find duplicates and similar images in a folder (Python, 22 stars)
 4. sheet-copier: Chrome Extension to Make a Google Sheet Copyable (JavaScript, 17 stars)
 5. deepemotion: Flask app for emotion detection in video (JavaScript, 14 stars)
 6. video-pose-extractor: Dockerfile and instructions for human pose estimation implementation using Caffe, OpenCV 3.1.0 and Python 2.7 (Python, 12 stars)
 7. genre-melodies: Create genre-specific melodies using Magenta (Jupyter Notebook, 9 stars)
 8. wg-activator: Update your WG-Gesucht listing every hour (Python, 6 stars)
 9. pydata: Resources for Talks (4 stars)
10. emojivis: Testing repository for Face to Emoji using Webcam (Python, 3 stars)
11. sonic-face: Face-activated Music 🎵 Loops using Sonic Pi + OpenCV (Jupyter Notebook, 3 stars)
12. digits-classifier: Testing repository for digit classification convolutional network (Jupyter Notebook, 2 stars)
13. Walabot-PosturePal: Posture Monitor with Android App and Walabot (Java, 2 stars)
14. timeshade (Python, 1 star)
15. gitcommit: Scrape GitHub user activity times (JavaScript, 1 star)
16. normaliz-notebook: Testing repository for Normaliz Tutorial (Jupyter Notebook, 1 star)
17. sornvis: Visualize self-organizing neural network (Jupyter Notebook, 1 star)
18. closely: Python library for finding the closest pairs in an array (Python, 1 star)
19. learnthestructure: Implementation of libpgm for the Wisconsin Breast Cancer Database (Python, 1 star)
20. ar-loops: Augmented Reality Sound Controller using Unity and Sonic Pi (Ruby, 1 star)
21. oshug: Intel Movidius NCS Workshop for Open Source Hardware User Group (Jupyter Notebook, 1 star)
22. HarryPython: Learn Python with HarryPython (Jupyter Notebook, 1 star)
23. Agreed-chrome-extension: Slack "Agree" hotkey as a Chrome extension (JavaScript, 1 star)
24. dotfiles: 💠 Some configuration files (Vim Script, 1 star)