The graphics industry is continually striving for photorealistic visuals. While we’ve largely been able to capture realistic environments, creating believable faces is orders of magnitude more challenging.
If a digital human looks even slightly off, it can fall into what is known as the “uncanny valley,” making the face look eerie. That is why many developers opt for highly stylized faces, which sidesteps the issue but doesn’t solve it.
At Epic, we are tackling this difficult problem head-on. To do so, we had to develop a deeper understanding of human anatomy. We then built new tools that let us create believable real-time digital humans the likes of which the world had never seen. Our Virtual Mike, Siren, and Andy Serkis demos exemplify our major graphical breakthroughs.
In a recent livestream, Senior Character Artist Adam Skutt demonstrated how we created Virtual Mike, which is available for you to download and experiment with through the Epic Games Launcher.
Improvements to the way we render skin represent one of the largest leaps we’ve made in advancing real-time digital humans. In the livestream, Skutt walks you through how we took a facial scan of researcher and writer Mike Seymour, crafted skin that realistically captures the roughness and oils of the human face, and utilized two normal maps to drive detailed facial animations. Highlighting the updated Subsurface Profile Shading Model, which allows skin in Unreal Engine to interact with light more believably, Skutt shares how to solve the fake-looking “CG grey” shadow effect that typically plagues digital humans by also leveraging a Screen Space Irradiance post-process material.
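Combining a base normal map with a second detail map is commonly done with Reoriented Normal Mapping, which rotates the detail normal onto the base rather than naively adding the two. As a generic illustration of that idea, not Epic’s actual shader code, here is a minimal Python sketch of the blend:

```python
import math

def reoriented_normal_blend(base, detail):
    """Blend a detail tangent-space normal onto a base normal using
    Reoriented Normal Mapping. Inputs are (r, g, b) texture values in
    [0, 1]; output is a unit-length tangent-space normal."""
    # Unpack with RNM's modified ranges (note the shifted z on each map)
    t = (base[0] * 2.0 - 1.0, base[1] * 2.0 - 1.0, base[2] * 2.0)
    u = (detail[0] * -2.0 + 1.0, detail[1] * -2.0 + 1.0, detail[2] * 2.0 - 1.0)
    # r = t * dot(t, u) - u * t.z
    dot_tu = sum(a * b for a, b in zip(t, u))
    r = tuple(a * dot_tu - b * t[2] for a, b in zip(t, u))
    length = math.sqrt(sum(c * c for c in r))
    return tuple(c / length for c in r)

# Blending a flat detail map leaves the base normal unchanged:
flat = (0.5, 0.5, 1.0)
print(reoriented_normal_blend(flat, flat))  # ~ (0, 0, 1)
```

In an engine this runs per pixel in the material shader; the sketch above just shows the math on a single pair of texels.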
Advanced hair is created using a mixture of individual splines coupled with innovative techniques for blending follicles into the scalp. Finally, Skutt shows how we create extremely convincing eyes.
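Spline-based hair typically stores each strand as a handful of control points and interpolates smooth render points along the curve. As a generic illustration of that interpolation, not the engine’s actual implementation, here is a Catmull-Rom segment evaluator in Python:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a point on the Catmull-Rom spline segment between p1 and p2.
    p0 and p3 are the neighboring control points; t runs from 0 to 1."""
    def blend(a, b, c, d):
        return 0.5 * ((2 * b)
                      + (-a + c) * t
                      + (2 * a - 5 * b + 4 * c - d) * t * t
                      + (-a + 3 * b - 3 * c + d) * t * t * t)
    return tuple(blend(a, b, c, d) for a, b, c, d in zip(p0, p1, p2, p3))

# Densify one segment of a hypothetical guide strand into 11 render points
guide = [(0, 0, 0), (0, 1, 0.2), (0, 2, 0.5), (0, 3, 1.0)]
strand = [catmull_rom(*guide, i / 10) for i in range(11)]
```

Catmull-Rom is a common choice here because the curve passes through its control points, so the rendered strand follows the groomed guide hair exactly.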
The culmination of all these new rendering tools amounted to a digital face so realistic that, during the livestream, even Skutt had a hard time distinguishing the in-engine render from the real-life reference photo.
For an in-depth look at how we were able to create lifelike digital humans, make sure to watch the embedded livestream above. For additional information, dig deeper with our technical documentation, and stay tuned for more content on the subject.
If you would like to experiment with our Digital Human engine feature sample yourself, make sure to download Unreal Engine 4.20 for free today and check it out through the Learn tab within the Epic Games Launcher.