Lower Face Expression and Movement Tracker

Worked on a deep learning computer vision project to track lower facial expressions and movement. The system takes a video feed of the face below the nose from a camera on a VR headset and generates 3DMM (3D Morphable Model) parameters, which are fed to a rendering engine to reproduce the wearer's facial expressions and movement.
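A minimal sketch of what the inference step might look like, assuming a PyTorch implementation: a small CNN regresses a vector of 3DMM expression coefficients from a grayscale lower-face crop. The layer sizes, input resolution, and parameter count (64) are illustrative assumptions, not the project's actual architecture.

```python
import torch
import torch.nn as nn

class LowerFaceTo3DMM(nn.Module):
    """Hypothetical regressor: lower-face image -> 3DMM expression parameters."""

    def __init__(self, num_params: int = 64):  # 64 is an assumed coefficient count
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.AdaptiveAvgPool2d(1),                               # global pooling
        )
        self.head = nn.Linear(32, num_params)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) grayscale crop from the headset camera
        f = self.features(x).flatten(1)
        return self.head(f)  # (batch, num_params) coefficients for the renderer

frame = torch.zeros(1, 1, 64, 64)   # placeholder for one camera frame
params = LowerFaceTo3DMM()(frame)   # parameters passed on to the rendering engine
print(params.shape)                 # torch.Size([1, 64])
```

In a real pipeline each decoded video frame would be cropped, normalized, and run through the network, with the resulting coefficients streamed to the renderer every frame.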

Demo (Not Perfect)