using UnityEngine;

namespace Normal.Realtime.Examples {
    public class VoiceMouthMove : MonoBehaviour {
        public SkinnedMeshRenderer[] skinned;

        private RealtimeAvatarVoice _voice;
        private float _mouthSize;

        void Awake() {
            // Grab the RealtimeAvatarVoice component so we can read the voice volume each frame.
            _voice = GetComponent<RealtimeAvatarVoice>();
        }

        void Update() {
            if (_voice == null || skinned == null) return;

            // Map the current voice volume to a target mouth size and smooth towards it.
            float targetMouthSize = Mathf.Lerp(0.1f, 1.0f, _voice.voiceVolume);
            _mouthSize = Mathf.Lerp(_mouthSize, targetMouthSize, 30.0f * Time.deltaTime);

            // Drive the mouth blend shapes on every renderer.
            foreach (SkinnedMeshRenderer s in skinned) {
                s.SetBlendShapeWeight(41, _mouthSize * 100);
                s.SetBlendShapeWeight(17, _mouthSize * 50);
                s.SetBlendShapeWeight(6, _mouthSize * 20);
            }
        }
    }
}
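
This snippet, adapted from Normcore's VoiceMouthMove example, animates the avatar's mouth from live voice volume: RealtimeAvatarVoice.voiceVolume is smoothed each frame and mapped onto three blend shapes. Note that the blend-shape indices (41, 17, 6) are specific to the avatar mesh used here and would need adjusting for a different model.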

EMOTION MUSEUM - GOLDEN MASK

Made with Unity

[Augmented Reality Project]

Mixed Reality Assignment #2

at Goldsmiths, University of London

2021 - 2022

With current technology, Virtual Production (VP) plays an increasingly important role in the filmmaking process, and facial capture is one of the important foundations of XR technology. I have wanted to learn how to do proper facial capture ever since I started using AR filters a long time ago. Triggering the front camera and tracking facial movement is not too hard, but I was thinking about how to interact with more than the front camera: using the back camera as well, or both cameras simultaneously, so that I could place my facial-capture results into reality via various AR functions. Luckily, I found a sample project on the Apple website showing that current hardware supports both cameras working together, and I decided to make this the core technical design of my project. It combines two key AR concepts:

  • Real-time expression/emotion interaction

  • Spatial world display and interaction

By using the back camera, the project displays the physical space while simultaneously obtaining tracking positions in real-world space; a sketch of how both cameras can be enabled together in Unity follows below.
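
On iOS, this dual-camera mode comes from ARKit's ARWorldTrackingConfiguration with user face tracking enabled; Unity's AR Foundation exposes it by running an ARFaceManager while the camera faces the world. The following is a minimal sketch of that setup, not the project's actual code: it assumes an AR Foundation scene with ARSession, ARCameraManager, and ARFaceManager components already in place, and a device with an A12 chip or newer.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: render from the back camera while face tracking runs on the
// front (TrueDepth) camera at the same time.
public class DualCameraSetup : MonoBehaviour
{
    [SerializeField] private ARCameraManager cameraManager; // on the AR Camera
    [SerializeField] private ARFaceManager faceManager;     // on the session origin

    void Start()
    {
        // Ask AR Foundation to render from the world-facing (back) camera.
        // With an enabled ARFaceManager in the scene, ARKit will also run
        // face tracking on the front camera if the device supports it.
        cameraManager.requestedFacingDirection = CameraFacingDirection.World;
        faceManager.enabled = true;
    }

    void Update()
    {
        // Each tracked face supplies expression data that can drive a mask
        // anchored in world space and seen through the back camera.
        foreach (ARFace face in faceManager.trackables)
            Debug.Log($"Face {face.trackableId}: {face.trackingState}");
    }
}

With this configuration, the face data (positions and expressions) arrives in the session's world space, so a captured expression can be mapped onto a mask placed in the physical scene viewed through the back camera.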

Development Artwork

