Face tracked to image | Recorded Face Tracking


#1

I love the new face tracking feature, but can’t help but request another one (or two :wink: )

Rather than use the face mesh to cover the user’s face, I would like to use the face mesh as an object in a ‘normal’ studio project. The user would be looking at a face/head that was attached to a tracking image, for example.

Ideally, this 3D face/head object could be animated by timelines that were recorded automatically while tracking a real face. The designer would simply use a (separate?) face tracking recording app that would track his/her own expressions and record them for use in the project.


#2

Hi Mark,

Glad you’re enjoying the feature, and thanks for the requests. A full head mesh as an option is something we have already implemented, and it will be coming out in a future update.

Recording is also something we’ve thought about, particularly how best to expose it. Capturing the texture is one of the challenging parts: ideally it would be a single image, but from certain views not all of the mesh is visible, so a one-shot faceInstance.snapshot(Z.ImageTexture) might not quite be sufficient. Also, things like hair are obviously not included as part of the face mesh, so the result may look a bit disembodied (though that may be the vibe you’re going for!)

Anyway it is definitely something we’re thinking about, so thanks for the additional external validation that there is interest in such a feature.


#3

Hi Simon,

Seems like you guys are thinking way ahead of what I was even hoping for! I hadn’t even considered recording the texture of the user/designer; I was only thinking of recording the positions of the Face Attachment Points over time.

To give you a better idea of what I have in mind: I’d like to make a 3D non-player-character ‘floating head’ that can talk. To animate the face, the recording feature would be ideal; I could just speak the text myself (as designer) and replay the timeline synchronized with a sound file of a digitized voice. In my project the NPC is an AR/VR ‘entity’, so recording the texture won’t be necessary; its texture is a static face paint, and you have already implemented that!
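A recording feature like this could boil down to a timestamped log of attachment point positions that gets interpolated on playback. Here is a minimal sketch in TypeScript; it does not use the real ZapWorks API, and the `FaceRecording` class and all names in it are hypothetical, with the per-frame positions assumed to be supplied by host tracking code:

```typescript
// Hypothetical sketch: record face attachment point positions over time,
// then sample them back for playback (e.g. synced with an audio clip).
type Vec3 = [number, number, number];

interface Sample {
  time: number;    // seconds since recording started
  points: Vec3[];  // one position per attachment point
}

class FaceRecording {
  private samples: Sample[] = [];

  // Called once per tracked frame while the designer performs.
  record(time: number, points: Vec3[]): void {
    this.samples.push({ time, points });
  }

  // Linearly interpolate positions at an arbitrary time, so playback
  // can run at a different frame rate than the recording.
  sampleAt(time: number): Vec3[] {
    const s = this.samples;
    if (time <= s[0].time) return s[0].points;
    if (time >= s[s.length - 1].time) return s[s.length - 1].points;
    for (let i = 1; i < s.length; i++) {
      if (time <= s[i].time) {
        const t = (time - s[i - 1].time) / (s[i].time - s[i - 1].time);
        return s[i - 1].points.map((p, j) => {
          const q = s[i].points[j];
          return [
            p[0] + (q[0] - p[0]) * t,
            p[1] + (q[1] - p[1]) * t,
            p[2] + (q[2] - p[2]) * t,
          ] as Vec3;
        });
      }
    }
    return s[s.length - 1].points;
  }
}
```

On playback, `sampleAt` could be driven by the audio clip’s current playhead time, so the mouth movement stays in sync with the digitized voice.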

I’m amazed to hear what major leaps you are planning! Face tracking, real world tracking… exciting stuff!