
ARKit Facial Controller / Recorder


ZigSim Controller

We can control a character's blendshapes with OSC signals sent from ZigSim.

ARKit Facial Control script

First, we have to register the mappings between ARKit's blendshapes and the character's blendshapes. Attach ARKitFacialControl.cs to an arbitrary GameObject.

ARKitFacialControl

Next, set the SkinnedMeshRenderer component you want to drive with ARKit and press the Update button. Then, in the Blend Shape Mapping foldout, set each blendshape mapping and its strength using the pulldown and float fields.
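As a rough illustration of what such a mapping does at runtime, applying an ARKit coefficient to a mapped blendshape could look like the sketch below. The class and field names here are hypothetical stand-ins, not the actual ARKitFacialControl API; only the Unity calls (SetBlendShapeWeight) are real.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: one entry of the Blend Shape Mapping list.
[Serializable]
public class BlendShapeMapping
{
    public string arkitName;     // ARKit coefficient name, e.g. "jawOpen"
    public int targetIndex;      // blendshape index on the SkinnedMeshRenderer
    public float strength = 1f;  // multiplier set in the float field
}

public class FacialMappingSketch : MonoBehaviour
{
    public SkinnedMeshRenderer target;
    public List<BlendShapeMapping> mappings = new List<BlendShapeMapping>();

    // ARKit coefficients arrive normalized to 0..1, while Unity blendshape
    // weights range 0..100, hence the scaling below.
    public void Apply(Dictionary<string, float> arkitWeights)
    {
        foreach (var m in mappings)
        {
            if (arkitWeights.TryGetValue(m.arkitName, out var w))
                target.SetBlendShapeWeight(m.targetIndex, w * m.strength * 100f);
        }
    }
}
```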

ZigSim Facial Control script

Second, we have to attach ZigSimFacialControl.cs to an arbitrary GameObject to communicate with ZigSim over OSC.

ZigSimFacialControl

Set an open port number and the ARKit Facial Control created in the previous section. The Text field is for debugging use.
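Conceptually, the script listens on the configured port for ZigSim's face-tracking messages and forwards the received coefficients to the ARKit Facial Control. The sketch below is only an outline of that flow; the OscServer type and the address pattern in the comments are assumptions, not the actual OSC layer the script uses.

```csharp
using UnityEngine;

public class ZigSimReceiveSketch : MonoBehaviour
{
    public int port = 50000; // must match the destination port configured in ZigSim

    void Start()
    {
        // Hypothetical OSC receiver; the real script ships its own OSC handling.
        var server = new OscServer(port);
        server.MessageReceived += OnMessage;
    }

    void OnMessage(string address, object[] args)
    {
        // ZigSim prefixes its OSC addresses with /ZIGSIM/<device UUID>/...
        // Face-tracking messages carry the blendshape coefficients, which
        // are then handed to the ARKit Facial Control for mapping.
    }
}
```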

Finally, we can drive the character's blendshapes with ZigSim. More details can be found in RecorderScene.unity.

Recorder

ZigSimFacialControl.cs has recording functionality. To record your facial expression, simply call the Record Start/Stop functions externally.
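Wiring those calls to uGUI buttons from a small helper script might look like this. The method names RecordStart/RecordStop are assumptions based on the description above; check the script for the exact signatures.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class RecordButtonsSketch : MonoBehaviour
{
    public Button startButton;
    public Button stopButton;
    // The recording component described above; the method names called
    // below (RecordStart / RecordStop) are assumed, not confirmed.
    public ZigSimFacialControl facialControl;

    void Start()
    {
        startButton.onClick.AddListener(() => facialControl.RecordStart());
        stopButton.onClick.AddListener(() => facialControl.RecordStop());
    }
}
```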

In the sample scene, these functions are called from uGUI Button components.

Record

A recorded file (.byte) will be generated on the user's Desktop when recording stops.

Player

We can drive the facial expression with the recorded data.

First, we need the ARKitFacialControl component described in the [ARKit Facial Control script](#arkit-facial-control-script) section.

Player

Attach FaceRecordDataReader.cs and fill the Facial Control field with the ARKitFacialControl component we created. Then add the recorded bytes files to your Unity project and set them in the Asset field.
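For reference, Unity imports files with the .bytes extension as TextAsset, which is how a reader script can get at the raw recording. This is a minimal sketch of that access pattern, not the actual FaceRecordDataReader implementation.

```csharp
using UnityEngine;

public class RecordAssetSketch : MonoBehaviour
{
    // Assign the recorded file here; Unity imports .bytes files as TextAsset.
    public TextAsset recording;

    void Start()
    {
        byte[] raw = recording.bytes; // raw recorded face-tracking frames
        Debug.Log($"Loaded {raw.Length} bytes of face-tracking data");
    }
}
```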

Once you have chosen the bytes file you want to play back, drag and drop the GameObject onto your Timeline. A Control Track will be created automatically, and you can play back by moving the Timeline cursor.

Timeline

LICENSE

This project is distributed under the MIT License.

However, this project uses some packages, so please see each package's licence.md file.