Today, Epic is introducing a new tool that can quickly and easily attach an actor’s facial performance to a hyper-realistic “MetaHuman” in Unreal Engine using a device as basic as an iPhone. The feature, known as MetaHuman Animator, was first announced at the Game Developers Conference in March and is now available for developers to try. To demonstrate the tool’s capabilities, Epic also released a new video today that was created by one of its own teams.
Although the facial animation in Epic’s short film is subtle, the main advantage the company is stressing is how fast MetaHuman Animator delivers results. “The animation is produced locally using GPU hardware, and the final animation is available in minutes,” according to the company’s announcement. Because performance capture can be done more efficiently, Epic says, a studio could save money while also having more room to explore and be imaginative.
“Need an actor to give you more, dig into a different emotion, or simply explore a new direction?” Epic’s press release asks. “Have them do another take. You’ll be able to review the results in about the time it takes to make a cup of coffee.” According to Epic, facial animation can be added to a MetaHuman character “in just a few clicks,” and the technology is clever enough to animate a character’s tongue in response to sounds from a performance.
Performance capture via iPhone has been possible in Unreal Engine since the release of Epic’s Live Link Face iOS app in 2020, but MetaHuman Animator now integrates it with the high level of detail promised by Epic’s MetaHuman technology. The tool works on iPhone 12 and later, which can record both video and depth data, and Epic says it can also be used with “existing vertical stereo head-mounted camera [systems] to achieve even greater fidelity.”
According to Epic, the newly released short film Blue Dot should give some idea of what its animation tool is capable of. It was created by Epic Games’ 3Lateral team and features actor Radivoje Bukvić delivering a monologue based on a poem by Mika Antić. Although it is possible to modify animation after capture, Epic claims only “minimal interventions” were made on top of MetaHuman Animator’s performance capture to achieve these results.
For enthusiasts who want to learn more, Epic has made a video tutorial on how to use the tool available. Documentation is also accessible through the MetaHuman hub of the Epic Developer Community.