Zafari is a CG-animated children’s television series created by David Dozoretz in which animals are magically born with the skin of other creatures. The show is notable as the first animated television series rendered entirely with Epic Games’ Unreal Engine.
Dozoretz, a previsualization supervisor whose company Persistence of Vision provided previs on films such as Star Trek, Super 8, and X-Men: First Class, developed the idea for Zafari several years ago. Montreal-based Digital Dimension was hired to produce 52 eleven-minute episodes. Various distributors around the world, including NBC Universal, have bought the series, which will air first on France.tv this November.
Cartoon Brew asked Dozoretz why he went with a real-time game engine approach, how that impacted the production, and the lessons he’s learned from tackling the project that way.
The origins of Zafari
Dozoretz says he always wanted to make Zafari, a project he conceived more than a decade ago, ‘differently.’ That would ultimately happen by embracing real-time rendering technology, but originally Dozoretz had considered GPU-based renderers such as Redshift and Octane, which he had used himself on some previs projects.
Digital Dimension, a company known originally for visual effects that has now pivoted to cg animation, ultimately embraced Unreal Engine for the production. “I said, ‘Okay, but here’s the list of the 15 hoops Unreal Engine has to jump through in order for me to say yes,’ and Unreal jumps through all of those 15,” recalled Dozoretz.
Those ‘hoops’ included enabling a lush jungle environment and complex camera moves through it, but without being overly expensive and time-consuming to render – something that would not work on a TV schedule or budget. “What Unreal lets us do is get a near-Pixar look at television budgets and television speeds,” said Dozoretz. “In fact, my pitch to the studios was that we would give them more towards Pixar quality in a TV show than they’ve ever seen before.”
“Certainly,” added Dozoretz, “it’s not Pixar-quality in terms of it being a production budget of $150 million, but it looks pretty good. We’ve got global illumination and sub-surface scattering, and, because we’re in Unreal, we don’t have to do that thing that’s so frequent in animation, where they render the background as a still, render the foreground animation, and composite the two, and you end up getting very 2D-like moves on things.”
Zafari’s production pipeline
Although the series is relying on Unreal Engine for rendering, Digital Dimension still follows a more traditional workflow for the front half of production. Characters are created and animated in Maya, for instance, but Unreal is used for lighting, effects, and rendering. Compositing is generally not required.
From a workflow perspective, Dozoretz approves animation (or sometimes an earlier layout or blocking stage), with Digital Dimension artists often already embarking on the lighting for that same animation scene.
“What’s so fascinating about this,” Dozoretz said, “is that they’re taking the entire lighting and rendering pipeline and moving it earlier and shortening it. I’ll see lighting start to be developed while layout and blocking are happening. Then, once we do have an approved version of the animation, I’ll start seeing a finished version of the episode like an hour later, or at least a first pass of it.”
This is, of course, the major benefit of using a game engine. There is no reliance on a render farm, which means shot iterations can be turned around extremely quickly. Actual final renders proceed at around three frames per second, which, Dozoretz acknowledges, is not true real-time.
“The reason it’s not real-time is the disk I/O, which is actually slowing it down. Digital Dimension can do a real-time version of the show, as in, it can render in real-time while the scenes are playing out, but right now it renders out in about three or four hours.”
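As a rough sanity check on those figures, here is a back-of-envelope sketch. The 24 fps delivery frame rate is an assumption (the article does not state one): at three frames rendered per second, the raw frame time for an 11-minute episode comes to roughly an hour and a half, with per-shot setup and disk I/O presumably accounting for the rest of the quoted three to four hours.

```python
# Back-of-envelope render-time estimate for one 11-minute episode.
# ASSUMPTION: 24 fps delivery frame rate (not stated in the article).

FPS_DELIVERY = 24          # assumed playback frame rate
EPISODE_MINUTES = 11       # episode length quoted in the article
RENDER_FPS = 3             # frames rendered per second, per Dozoretz

total_frames = EPISODE_MINUTES * 60 * FPS_DELIVERY   # 15,840 frames
render_seconds = total_frames / RENDER_FPS           # raw frame time only
render_hours = render_seconds / 3600

print(f"{total_frames} frames, ~{render_hours:.1f} hours of pure rendering")
# ~1.5 hours of raw frame time; scene loading and I/O overhead would
# make up the difference to the quoted three to four hours.
```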
The impact of real-time
Zafari might not be completely produced in Unreal Engine, but Dozoretz says that using the game engine has provided the production with the major benefit of speed. Digital Dimension has been producing layout on an 11-minute episode in a week, animating in two weeks, and rendering in a week. That adds up to a complete episode roughly every month.
“That is so fast that we’re not even able to completely use Unreal as much as we want to right now,” said Dozoretz. “What I’d really love to do is make adjustments to camera position a bit. I’d love to be able to move the camera a little bit to the right, say, so there’s an out-of-focus foreground leaf that makes the composition better. That’s the kind of stuff that can happen right now. And what’s going to be able to happen soon is being able to change the shot in even more drastic ways.”
In fact, that ability to change things ‘on the fly’ via the interactive aspects of a game engine is not something Zafari has entirely capitalized on yet, according to Dozoretz. “I’m really looking forward to being able to completely do the camera moves in Unreal, like block out the scene in advance, and go back in and almost shoot it like you would shoot live action, where they’d block out a scene, and then move the camera to make it work with the emotional beats of the story.”
“It’s a scenario where I’ll be able to have the entire scene, every angle available to me in the edit, and that’s kind of cool,” added Dozoretz. “What directing requires so much is previsualization and imagination. You need to be able to look and say, ‘Okay, once that character is moving correctly and saying this line, and it’s rendered and it looks right with the mood, then this is the right camera moment at this point for this long in order to elicit the proper understanding and emotional response in the viewer.’ You have to make that leap ahead, of what it will look like when it’s done and inside the entire edit.”
Dozoretz is adamant that real-time is the technology that will allow directors to get to that end result much faster, and still enable many iterations. But he is also conscious of the limitations that undertaking this new kind of ‘hybrid’ workflow entails, especially with animation.
“There is a very specific system that was set up within Unreal to be able to do their work in real-time, and it is not one that lends itself to changing animation,” said Dozoretz. “If you programmed it like a game, and the character turns right, and the character turns left, and you can drive it in real-time, that’s one way you could edit the animation in Unreal. But to actually have a really specific action that works for the storytelling of a shot, that’s pretty much custom-built. Unreal, at the moment, isn’t really set up for that.”
Real-time for the future
Although it relies on real-time rendering just like a video game, Zafari will be presented in the more traditional linear format of a television series. But could the fact that it can be rendered in real-time mean a different kind of presentation? For example, could it be presented in an interactive format where the user can move through a Zafari environment in a game or on an iPad, or even be able to re-edit scenes or change character details on the fly?
Dozoretz is not against these ideas, especially since they are possible with real-time, but he does not see that kind of interactive presentation as a replacement for a traditional television delivery system. “I don’t see that being a replacement for the true storytelling experience in which the audience and the filmmakers have this unspoken contract, where the filmmaker says, ‘Give me your attention for a certain period of time, and I’ll tell you something interesting.’ That’s a passive experience, and I don’t want to change that.”