MetaHuman Animator (plus an iPhone) could transform content creation

MetaHuman Animator; people in a mocap studio

MetaHuman Animator from Epic Games is about to change how content is created, and it's out now. If that sounds like hyperbole, it may actually understate just how important this new performance capture tech will be: creating a lifelike facial animation of an actor's performance in minutes, on an iPhone, is now possible. I'm impressed.

Epic Games first teased MetaHuman Animator at GDC 2023 with a demo featuring Ninja Theory's Melina Juergens, star of Senua’s Saga: Hellblade II, performing live on stage while her face was captured in real time. It looked incredible, and now it's live and ready for us all to tinker with. At the same show, the developers behind Lords of the Fallen, built in Unreal Engine 5, also revealed how easy it was to capture actors and put them into their game.

MetaHuman Animator is an innovative new toolset that allows you to capture an actor's performance using either an iPhone or a stereo head-mounted camera, and then apply it as realistic facial animation to any MetaHuman character – no manual fiddling required. MetaHuman Animator is designed to streamline the capture process and enable small teams, and even individuals, to create movie-quality CG performances.

MetaHuman Animator; an actor in a CG cinematic

In the promotional video released to launch MetaHuman Animator, even the most delicate expressions and subtle emotions are faithfully reproduced. And it's about as easy as pointing a camera and pressing record. Sure, there's a little more to it than that, but Epic Games insists beginners as well as pros will be up and running with MetaHuman Animator with ease.

You can read more in-depth details about how MetaHuman Animator works at the Epic Games blog, which reveals how it makes clever use of a 4D solver that combines video and depth data with the app's representation of the performer; the resulting animation is rendered in minutes. You can then make artistic adjustments to fine-tune the animation afterwards.

A short film called Blue Dot has also been released to demo just what MetaHuman Animator is capable of; Epic Games' 3Lateral team worked with Serbian artists, including the renowned actor Radivoje Bukvić, to put MetaHuman Animator through its paces. Impressively, the team achieved this level of animation quality with minimal tweaks.

MetaHuman Animator; devs and an actor in a mocap studio

You don't need an expensive studio and a Hollywood actor to make your projects come to life (though it helps). A key part of MetaHuman Animator is how it works using the camera on your iPhone. The aim is to democratise facial performance capture and make it approachable for, well, everyone. (You will need one of the best laptops for game development to run Unreal Engine 5.)

The speed and ease of MetaHuman Animator fits into Epic Games' push for accessible content creation – you can essentially create a scene in-engine and reshoot it in minutes if needed. Read our Unreal Engine 5 review to see why this 3D platform impresses, or how Unreal Editor for Fortnite is putting game development tools in the hands of anyone who can hold a gamepad.

If you need to see more, visit the MetaHuman Animator YouTube channel to watch the new Aaron Sims Creative short made with Ivan Šijak, and get a further look at how it all works in the official MetaHuman Animator launch video.