JavaScript/WebGL real-time face tracking and expression detection library. Build your own emoticons animated in real time in the browser! SVG and THREE.js integration demos are provided.

NOTICE: Apple©'s lawyers threatened to file a complaint against us on the 21st of August 2019 for infringing their intellectual property. So we have replaced the 3D animated fox with a raccoon.

Indeed, Apple© owns the intellectual property of 3D animated foxes (but not of raccoons yet). Thank you for your understanding.

JavaScript/WebGL library to detect and reproduce facial expressions

You can build your own animated emoticon embedded in your web application thanks to this library. The video is processed exclusively on the client-side.

The computing power of your GPU matters: the more powerful it is, the more detections per second can be processed, and the smoother and more accurate the result will be.

Face detection should work even if the lighting is not great. However, the better the input image, the better the facial expression detection. Here are some tips to get a good experience:

  • The face should be well lit: the nose and the eyes should be distinguishable,
  • Avoid backlighting: the background should be a wall, not a window,
  • The face should be neither too far from nor too close to the camera: it should ideally cover 1/3 of the camera height and be fully visible,
  • The camera should be placed in front of the user; a side view is not recommended,
  • Beards and mustaches can make mouth movement detection harder, and glasses can disturb eye detection.

Table of contents

  • Features
  • Architecture
  • Demonstrations
  • Run locally
  • Using module
  • Integration
  • Hosting
  • About the tech
  • Documentation
  • License

Features

  • face detection and tracking,
  • detects 11 facial expressions,
  • face rotation around the 3 axes,
  • robust to lighting conditions,
  • mobile friendly,
  • examples provided using SVG and THREE.js.

Architecture

  • /assets/: assets, both for 3D and 2D demonstrations (3D meshes, images),
  • /demos/: the most interesting: the demos!,
  • /dist/: heart of the library:
    • jeelizFaceExpressions.js: main minified script. It gets the camera video feed, exploits the neural network to detect the face and the expressions, and stabilizes the result,
    • jeelizFaceExpressionsNNC.json: neural network model loaded by the main script,
  • /doc/: some additional documentation,
  • /helpers/: the outputs of the main script are very raw. It is convenient to use these helpers to animate a 3D model with the THREE.js helper or an SVG file with the SVG helper. All demos use these helpers,
  • /libs/: some JavaScript libraries,
  • /meshConverter/: only for the THREE.js use. Tool to build the 3D model file including morphs from separate .OBJ files.

Demonstrations

All the following demos are included in this repository, in the /demos path, so you can try them out.

If you have made an application or a fun demonstration using this library, we would love to check it out and add a link here! Just contact us on Twitter @Jeeliz_AR or LinkedIn.

Run locally

You just have to serve the content of this directory using an HTTPS server: depending on the web browser, camera access may not be authorized if the application is hosted by an unsecured HTTP server. You can use Docker, for example, to run an HTTPS server:

  1. Run docker-compose:
docker-compose up
  2. Open a browser and go to localhost:8888
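
If you prefer not to use Docker, any static HTTPS server works. Below is a minimal Node.js sketch (not part of this repository), assuming a self-signed certificate has been generated as key.pem and cert.pem, for example with: openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem

// serve.js: minimal static HTTPS server sketch, to be run from the repository root
const https = require('https');
const fs = require('fs');
const path = require('path');

const MIME = { '.html': 'text/html', '.js': 'text/javascript', '.json': 'application/json' };

https.createServer({
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem')
}, (req, res) => {
  // map the request URL to a file in the served directory:
  const urlPath = decodeURIComponent(req.url.split('?')[0]);
  const filePath = path.join(__dirname, urlPath === '/' ? 'index.html' : urlPath);
  fs.readFile(filePath, (err, data) => {
    if (err) { res.writeHead(404); res.end('Not found'); return; }
    res.writeHead(200, { 'Content-Type': MIME[path.extname(filePath)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(8888, () => console.log('Serving on https://localhost:8888'));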

If you have not bought a camera yet, a screen recording of the Cartman demo is available here:

Using module

/dist/jeelizFaceExpressions.module.js is exactly the same as /dist/jeelizFaceExpressions.js except that it works as a JavaScript module, so you can import it directly using:

import './dist/jeelizFaceExpressions.module.js'

or using require:

const faceExpressions = require('./lib/jeelizFaceExpressions.module.js')
//...

There is no demo using the module version yet.

Integration

With a bundler

If you use this library with a bundler (typically Webpack or Parcel), you should first use the module version.

Then, keep in mind that with the standard library, we load the neural network model (specified by NNCPath, provided as an initialization parameter) using AJAX, for the following reasons:

  • If the user does not accept to share their camera, or if WebGL is not enabled, we don't have to load the neural network model,
  • We suppose that the library is deployed using a static HTTPS server.

With a bundler it is a bit more complicated. It is easier to load the neural network model using a classical import or require call, and to provide it through the NNC init parameter:

const faceExpressions = require('./lib/jeelizFaceExpressions.module.js')
const neuralNetworkModel = require('./dist/jeelizFaceExpressionsNNC.json')

faceExpressions.init({
  NNC: neuralNetworkModel, // instead of NNCPath
  // ... other init parameters
});
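
For completeness, here is a hedged sketch of a slightly fuller initialization. The canvasId and callbackReady parameter names are assumptions based on the usual Jeeliz initialization pattern, not confirmed here; see /doc/jeelizFaceExpressions.pdf for the authoritative parameter list:

faceExpressions.init({
  NNC: neuralNetworkModel,
  canvasId: 'jeelizCanvas', // assumption: id of the <canvas> used for detection
  callbackReady: function (errCode) { // assumption: called when init succeeds or fails
    if (errCode) {
      console.log('Initialization error:', errCode);
      return;
    }
    console.log('Library is ready');
  }
});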

With JavaScript frontend frameworks

We don't cover here the integration with mainstream JavaScript frontend frameworks (React, Vue, Angular). If you submit a pull request adding a boilerplate or a demo integrated with a specific framework, you are welcome and it will of course be accepted. We can also provide this kind of integration as a specific development service (please contact us here). But it is not so hard to do it yourself. Here is a bunch of submitted issues dealing with React integration. Most of them are for Jeeliz FaceFilter, but the problem is similar:

You can also take a look at these Github code repositories:

Native

It is possible to execute a JavaScript application using this library inside a WebView for a native app integration. But with iOS < 14.3, camera access is disabled inside WebViews. If you want your application to run on devices with iOS versions older than 14.3, you have to implement a hack to stream the camera video into the WKWebView using WebSockets.

This hack has been implemented in this repository:

But it is still a dirty hack introducing a bottleneck. It still runs pretty well on a high-end device (tested on an iPhone XR), but it is better to stick to a full web environment.

Hosting

This library requires the user's camera feed through the MediaStream API. Your application should therefore be hosted by an HTTPS server (the certificate can be self-signed). It won't work at all over insecure HTTP, even locally with some web browsers.

Be careful to enable gzip HTTP/HTTPS compression for JSON and JS files. Indeed, the neural network JSON file in /dist/ is quite heavy, but compresses very well with gzip. You can check the gzip compression of your server here.

The neural network JSON file is loaded using an AJAX XMLHttpRequest after the user has accepted to share their camera. We proceed this way to avoid loading this quite heavy file if the user refuses to share their camera or if there is no camera available. The loading will be faster if you systematically preload the JSON file using a service worker or a simple raw XMLHttpRequest just after the HTML page has loaded. The file will then be in the browser cache and will be fast to request.
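
For illustration, here is a minimal preloading sketch. It assumes the model is served from its default repository path; adjust the URL to wherever you host the file:

// Warm the browser cache with the neural network model right after page load,
// so the AJAX request made later by the library resolves from the cache:
window.addEventListener('load', function () {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', './dist/jeelizFaceExpressionsNNC.json');
  xhr.send(); // the response is discarded; we only want it in the cache
});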

About the tech

Under the hood

The heart of the lib is JEELIZFACEEXPRESSIONS, implemented by the /dist/jeelizFaceExpressions.js script. It relies on Jeeliz WebGL Deep Learning technology to detect and track the user's face using a deep learning network, and to simultaneously evaluate the expression factors. The detection rate is adaptive: the better the hardware, the more detections are processed per second. Everything is done client-side.

The documentation of JEELIZFACEEXPRESSIONS is included in this repository as a PDF file, /doc/jeelizFaceExpressions.pdf. In the main scripts of the demonstrations, we never call these methods directly, but always through the helpers. Here are the indices of the morphs returned by this library:

  • 0: smileRight → closed mouth smile right
  • 1: smileLeft → closed mouth smile left
  • 2: eyeBrowLeftDown → left eyebrow frowned
  • 3: eyeBrowRightDown → right eyebrow frowned
  • 4: eyeBrowLeftUp → raise left eyebrow (surprise)
  • 5: eyeBrowRightUp → raise right eyebrow (surprise)
  • 6: mouthOpen → open mouth
  • 7: mouthRound → O-shaped mouth
  • 8: eyeRightClose → close right eye
  • 9: eyeLeftClose → close left eye
  • 10: mouthNasty → nasty mouth (show teeth)
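
As a small illustration, here is a sketch that maps a raw 11-element morph coefficient array to the names above. How you obtain the raw coefficients (through the helpers or the methods documented in /doc/jeelizFaceExpressions.pdf) is up to you; the array layout simply follows the list above:

// Map the raw morph coefficient array (indices 0 to 10, as listed above)
// to named expression factors:
const MORPH_NAMES = [
  'smileRight', 'smileLeft',
  'eyeBrowLeftDown', 'eyeBrowRightDown',
  'eyeBrowLeftUp', 'eyeBrowRightUp',
  'mouthOpen', 'mouthRound',
  'eyeRightClose', 'eyeLeftClose',
  'mouthNasty'
];

function labelMorphs(morphCoefficients) { // an array of 11 floats
  const labeled = {};
  MORPH_NAMES.forEach(function (name, i) {
    labeled[name] = morphCoefficients[i];
  });
  return labeled;
}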

Compatibility

  • If WebGL2 is available, the library uses WebGL2 and no specific extension is required,
  • If WebGL2 is not available but WebGL1 is, either the OES_TEXTURE_FLOAT or the OES_TEXTURE_HALF_FLOAT extension is required,
  • If WebGL2 is not available, and if WebGL1 is not available or neither OES_TEXTURE_FLOAT nor OES_TEXTURE_HALF_FLOAT is implemented, the device is not compatible.

In all cases, you need WebRTC implemented in the web browser, otherwise this library will not be able to get the camera video feed. The compatibility tables are on caniuse.com: WebGL1, WebGL2, WebRTC.
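
Here is a minimal compatibility probe sketch mirroring the rules above. It is an assumption-based illustration, not the library's own detection code:

// Rough client-side compatibility check following the rules listed above:
function isProbablyCompatible() {
  // WebRTC is needed in all cases to get the camera video feed:
  if (!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia)) return false;
  const canvas = document.createElement('canvas');
  if (canvas.getContext('webgl2')) return true; // WebGL2: no extension required
  const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  if (!gl) return false; // no WebGL at all
  // with WebGL1, float or half-float textures are required:
  return !!(gl.getExtension('OES_texture_float') || gl.getExtension('OES_texture_half_float'));
}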

If a compatibility error is triggered, please post an issue on this repository. If it is a camera access error, please first retry after closing all applications which could be using your camera (Skype, Messenger, other browser tabs and windows, ...). Please include:

This library works almost everywhere, and it works very well with a high-end device like an iPhone X. But if your device is too cheap or too old, it will perform too few evaluations per second and the application will be slow.

Documentation

The main documentation is included in this repository as a PDF file: /doc/jeelizFaceExpressions.pdf.

License

Apache 2.0. This application is free for both commercial and non-commercial use.

We appreciate attribution: include the Jeeliz logo and a link to the Jeeliz website in your application or desktop website. Of course, we do not expect a large link to Jeeliz over your face filter, but if you can put the link in the credits/about/help/footer section, it would be great.

