Home

Awesome

NOTICE: Apple<sup>©</sup>'s lawyers threatened to file a complaint against us on the 21st of August 2019 for infringing their intellectual property. So we have replaced the 3D animated fox with a raccoon.

Indeed, Apple<sup>©</sup> owns the intellectual property of 3D animated foxes (but not of raccoons yet). Thank you for your understanding.

JavaScript/WebGL library to detect and reproduce facial expressions

With this library, you can build your own animated emoticon embedded in your web application. The video is processed exclusively on the client side.

The computing power of your GPU matters: the more powerful it is, the more detections per second are processed, and the smoother and more accurate the result.

The face detection should work even if the lighting is not great. However, the better the input image, the better the facial expression detection. Here are some tips to get a good experience:

Table of contents

Features

Architecture

Demonstrations

All the following demos are included in this repository, in the /demos path. You can try them:

If you have made an application or a fun demonstration using this library, we would love to check it out and add a link here! Just contact us on Twitter @Jeeliz_AR or LinkedIn.

Run locally

You just have to serve the content of this directory using an HTTPS server. Depending on the web browser, camera access may not be authorized if the application is hosted by an unsecured HTTP server. You can use Docker, for example, to run an HTTPS server:

  1. Run docker-compose:
     docker-compose up
  2. Open a browser and go to localhost:8888

If you do not have a camera at hand, a screen capture video of the Cartman Demo is available here:

<p align="center"> <a href='https://www.youtube.com/watch?v=WxaL_kXwtRE'><img src='https://img.youtube.com/vi/WxaL_kXwtRE/0.jpg'></a> </p>

Using module

/dist/jeelizFaceExpressions.module.js is exactly the same as /dist/jeelizFaceExpressions.js except that it works as a JavaScript module, so you can import it directly using:

import 'dist/jeelizFaceExpressions.module.js'

or using require:

const faceExpressions = require('./lib/jeelizFaceExpressions.module.js')
//...

There is no demo using the module version yet.

Integration

With a bundler

If you use this library with a bundler (typically Webpack or Parcel), first you should use the module version.

Then note that, with the standard build, we load the neural network model (specified by NNCPath, provided as an initialization parameter) using AJAX for the following reasons:

With a bundler, it is a bit more complicated. It is easier to load the neural network model using a classic import or require call and to provide it through the NNC init parameter:

const faceExpressions = require('./lib/jeelizFaceExpressions.module.js')
const neuralNetworkModel = require('./dist/jeelizFaceExpressionsNNC.json')

faceExpressions.init({
  NNC: neuralNetworkModel, //instead of NNCPath
  //... other init parameters
});

With JavaScript frontend frameworks

We don't cover here the integration with mainstream JavaScript frontend frameworks (React, Vue, Angular). If you submit a Pull Request adding boilerplate or a demo integrated with a specific framework, it will of course be accepted. We can provide this kind of integration as a specific development service (please contact us here). But it is not so hard to do it yourself. Here is a bunch of submitted issues dealing with React integration. Most of them are for Jeeliz FaceFilter, but the problem is similar:

You can also take a look at these Github code repositories:

Native

It is possible to run a JavaScript application using this library inside a WebView for native app integration. But with iOS < 14.3, camera access is disabled inside WebViews. If you want your application to run on devices with iOS versions older than 14.3, you have to implement a hack that streams the camera video into the WKWebView using WebSockets.

This hack has been implemented in this repository:

But it is still a dirty hack that introduces a bottleneck. It still runs pretty well on a high-end device (tested on an iPhone XR), but it is better to stick to a full web environment.

Hosting

This library requires the user's camera feed through the MediaStream API. Your application should therefore be hosted by an HTTPS server (the certificate can be self-signed). It won't work at all over unsecured HTTP, even locally with some web browsers.
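As a quick sanity check of your hosting setup, the sketch below requests the camera through the standard getUserMedia API, independently of this library; it only succeeds on HTTPS pages (or on localhost):

```javascript
// Sketch: request the camera through the standard MediaStream API, the
// same permission this library needs. Browsers only grant it on HTTPS
// pages (or on localhost), which is why HTTPS hosting is required.
function requestCamera() {
  return navigator.mediaDevices.getUserMedia({ video: true, audio: false });
}

if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  requestCamera()
    .then((stream) => console.log('Camera OK:', stream.getVideoTracks()[0].label))
    .catch((err) => console.error('Camera access refused or unavailable:', err));
}
```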

Be careful to enable gzip HTTP/HTTPS compression for JSON and JS files. Indeed, the neural network JSON in /dist/ is quite heavy, but compresses very well with gzip. You can check the gzip compression of your server here.

The neural network JSON file is loaded using an AJAX XMLHttpRequest after the user has agreed to share their camera. We proceed this way to avoid loading this quite heavy file if the user refuses to share their camera or if no camera is available. The loading will be faster if you systematically preload the JSON file using a service worker or a simple raw XMLHttpRequest just after the HTML page has loaded. The file will then be in the browser cache and fast to fetch.
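A preloading sketch using fetch (the NN_PATH value is an assumption; use the same path you pass as NNCPath):

```javascript
// Sketch: warm the browser cache for the neural network model right after
// page load, so that the library's later XMLHttpRequest is served from
// cache. NN_PATH is an assumption: use the same path you pass as NNCPath.
const NN_PATH = './dist/jeelizFaceExpressionsNNC.json';

function preloadModel(url) {
  // We only need the bytes to land in the HTTP cache; the content
  // itself is discarded here.
  return fetch(url).then((response) => {
    if (!response.ok) throw new Error('Preload failed: ' + response.status);
    return response.blob(); // drain the body so the cache entry completes
  });
}

if (typeof window !== 'undefined') {
  window.addEventListener('load', () => {
    preloadModel(NN_PATH).catch(console.error);
  });
}
```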

About the tech

Under the hood

The heart of the lib is JEELIZFACEEXPRESSIONS, implemented by the /dist/jeelizFaceExpressions.js script. It relies on Jeeliz WebGL Deep Learning technology to detect and track the user's face using a deep learning network, and to simultaneously evaluate the expression factors. The accuracy is adaptive: the better the hardware, the more detections are processed per second. Everything is done client-side.

The documentation of JEELIZFACEEXPRESSIONS is included in this repository as a PDF file, /doc/jeelizFaceExpressions.pdf. In the main scripts of the demonstrations, we never call these methods directly, but always through the helpers. Here are the indices of the morphs returned by this library:

Compatibility

In all cases, you need WebRTC implemented in the web browser, otherwise this library will not be able to get the camera video feed. The compatibility tables are on caniuse.com: WebGL1, WebGL2, WebRTC.
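You can feature-detect these requirements yourself before initializing the library; the sketch below uses only standard browser APIs, so you can display a friendly fallback message instead of an error:

```javascript
// Sketch: feature-detect WebGL and the camera API before initializing the
// library, so you can display a fallback message instead of a
// compatibility error. Uses only standard browser APIs.
function checkCompatibility() {
  const errors = [];
  if (typeof document !== 'undefined') {
    const canvas = document.createElement('canvas');
    if (!canvas.getContext('webgl') && !canvas.getContext('experimental-webgl')) {
      errors.push('WebGL is not available');
    }
  }
  if (typeof navigator === 'undefined' ||
      !navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
    errors.push('WebRTC camera access (getUserMedia) is not available');
  }
  return errors; // an empty array means the basic requirements are met
}
```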

If a compatibility error is triggered, please post an issue on this repository. If it is a camera access error, please first retry after closing all applications that could be using your device (Skype, Messenger, other browser tabs and windows, ...). Please include:

This library works almost everywhere, and it works very well with a high-end device like an iPhone X. But if your device is low-end or old, it will perform too few evaluations per second and the application will be slow.

Documentation

<!-- * [Create your own animated emoticon for the web](https://jeeliz.com/blog/create-animojis-for-the-web/) * [Integrate the animated emoticon on your website](https://jeeliz.com/blog/add-a-weboji-on-website/) -->

License

Apache 2.0. This application is free for both commercial and non-commercial use.

<!-- We appreciate attribution by including the [Jeeliz logo](https://jeeliz.com/wp-content/uploads/2018/01/LOGO_JEELIZ_BLUE.png) and a link to the [Jeeliz website](https://jeeliz.com) in your application or desktop website. Of course we do not expect a large link to Jeeliz over your face filter, but if you can put the link in the credits/about/help/footer section it would be great. -->

References