More of the characters from Zafari.

Upcoming Animated Series ‘Zafari’ Is Being Rendered Completely With The Unreal Game Engine

Zafari is a cg animated children’s television series created by David Dozoretz in which animals are magically born with the skin of other creatures. The show is notable as the first animated television series being rendered entirely with Epic Games’ Unreal Engine.

Dozoretz, a previsualization supervisor whose company Persistence of Vision provided previs on such films as Star Trek, Super 8, and X-Men: First Class, developed the idea for Zafari several years ago. Digital Dimension was hired to make 52 eleven-minute episodes in Montreal. Various distributors around the world, including NBC Universal, have bought the series. It will air first on France.tv this November.

Cartoon Brew asked Dozoretz why he went with a real-time game engine approach, how that impacted the production, and the lessons he’s learned from tackling the project that way.

The origins of Zafari

Dozoretz says he always wanted to make Zafari, a project he had the idea for more than a decade ago, ‘differently.’ That would ultimately happen by embracing real-time rendering technology, but originally Dozoretz had considered GPU-based renderers such as Redshift and Octane, which he had used himself on some previs projects.

Digital Dimension, a company known originally for visual effects that has now pivoted to cg animation, ultimately embraced Unreal Engine for the production. “I said, ‘Okay, but here’s the list of the 15 hoops Unreal Engine has to jump through in order for me to say yes,’ and Unreal jumps through all of those 15,” recalled Dozoretz.

An Unreal Engine screenshot with Zafari in production.

Those ‘hoops’ included enabling a lush jungle environment and complex camera moves through it, but without being overly expensive and time-consuming to render – something that would not work on a TV schedule or budget. “What Unreal lets us do is get a near-Pixar look at television budgets and television speeds,” said Dozoretz. “In fact, my pitch to the studios was that we would give them more towards Pixar quality in a tv show than they’ve ever seen before.”

“Certainly,” added Dozoretz, “it’s not Pixar-quality in terms of it being a production budget of $150 million, but it looks pretty good. We’ve got global illumination and sub-surface scattering, and, because we’re in Unreal, we don’t have to do that thing that’s so frequent in animation, which is where they render the background as a still, and they render the foreground animation in composite, and you end up getting very 2d-like moves on things.”

Zafari’s production pipeline

Although the series is relying on Unreal Engine for rendering, Digital Dimension still follows a more traditional workflow for the front half of production. Characters are created and animated in Maya, for instance, but Unreal is used for lighting, effects, and rendering. Compositing is generally not required.

A final frame from the show.

From a workflow perspective, Dozoretz approves animation (or sometimes an earlier layout or blocking stage), with Digital Dimension artists often already embarking on the lighting for that same animation scene.

“What’s so fascinating about this,” Dozoretz said, “is that they’re taking the entire lighting and rendering pipeline and moving it earlier and shortening it. I’ll see lighting start to be developed while layout and blocking are happening. Then, once we do have a version approved of the animation, I’ll start seeing a finished version of the episode like an hour later, or at least a first pass of it.”

This is, of course, the major benefit of using a game engine. There is no reliance on a render farm, which means shot iterations can be turned around extremely quickly. Actual final renders run at around three frames per second, which, Dozoretz acknowledges, is not true real-time.

“The reason it’s not real time is because of the disk I/O, which is actually slowing it down. Digital Dimension can do a real-time version of the show, as in, it can render in real-time while the scenes are playing out, but right now it renders out in about three or four hours.”
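As a rough sanity check on those quoted figures, the arithmetic below estimates the pure render time for one episode. The 24 fps broadcast frame rate is an assumption (the article does not state it); the episode length and three-frames-per-second render speed come from the article.

```python
# Back-of-the-envelope estimate of per-episode render time,
# using the figures quoted in the article.
EPISODE_MINUTES = 11   # episode length, per the article
FRAME_RATE = 24        # assumed broadcast frame rate (not stated in the article)
RENDER_FPS = 3         # quoted final-render speed, frames per second

frames_per_episode = EPISODE_MINUTES * 60 * FRAME_RATE
render_seconds = frames_per_episode / RENDER_FPS

print(frames_per_episode)        # total frames in one episode
print(render_seconds / 3600)     # pure render time, in hours
```

Under these assumptions that is roughly 15,840 frames and about an hour and a half of raw rendering; the three-to-four-hour turnaround Dozoretz describes would then reflect the additional disk I/O overhead he mentions, on top of raw render speed.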

The impact of real-time

Zafari might not be completely produced in Unreal Engine, but Dozoretz says that using the game engine has provided the production with the major benefit of speed. Digital Dimension has been producing layout for an 11-minute episode in a week, animating in two weeks, and rendering in a week. That yields a complete episode roughly every month.

“That is so fast that we’re not even able to completely use Unreal as much as we want to right now,” said Dozoretz. “What I’d really love to do is make adjustments to camera position a bit. I’d love to be able to move the camera a little bit to the right, say, so there’s an out-of-focus foreground leaf that makes the composition better. That’s the kind of stuff that can happen right now. And what’s going to be able to happen soon is being able to change the shot in even more drastic ways.”

The show includes global illumination, water sims, and sub-surface scattering.

In fact, that ability to change things ‘on-the-fly’ via the interactive aspects of a game engine is not something Zafari has entirely capitalized on, yet, according to Dozoretz. “I’m really looking forward to being able to completely do the camera moves in Unreal, like block out the scene in advance, and go back in and almost shoot it like you would shoot live action where they’d block out a scene, and then move the camera to make it work with the emotional beats of the story.”

“It’s a scenario where I’ll be able to have the entire scene, every angle available to me in the edit, and that’s kind of cool,” added Dozoretz. “What directing requires so much is previsualization and imagination. You need to be able to look and say, ‘Okay, once that character is moving correctly and saying this line, and it’s rendered and it looks right with the mood, then this is the right camera moment at this point for this long in order to elicit the proper understanding and emotional response in the viewer.’ You have to make that leap ahead, of what will it look like when it’s done and inside the entire edit.”

Dozoretz is adamant that real-time is the technology that will allow directors to get to that end result much faster, and still enable many iterations. But he is also conscious of the limitations that undertaking this new kind of ‘hybrid’ workflow entails, especially with animation.

Unreal Engine screenshot.

“There is a very specific system that was set up within Unreal to be able to do their work in real-time, and it is not one that lends itself to changing animation,” said Dozoretz. “If you programmed it like a game, and the character turns right, and the character turns left, and you can drive it in real-time, that’s one way you could edit the animation in Unreal. But to actually have a really specific action that works for the storytelling of a shot, that’s pretty much custom-built. Unreal, at the moment, isn’t really set up for that.”

Real-time for the future

Although it relies on real-time rendering just like a video game, Zafari will be presented in the more traditional linear format of a television series. But could the fact that it can be rendered in real-time mean a different kind of presentation? For example, could it be presented in an interactive format where the user can move through a Zafari environment in a game or on an iPad, or even be able to re-edit scenes or change character details on the fly?

Dozoretz is not against these ideas, especially since they are possible with real-time, but he does not see that kind of interactive presentation as a replacement for a traditional television delivery system. “I don’t see that being a replacement for the true storytelling experience in which the audience and the filmmakers have this unspoken contract, where the filmmaker says, ‘Give me your attention for a certain period of time, and I’ll tell you something interesting.’ That’s a passive experience, and I don’t want to change that.”

More of the characters from “Zafari.”
  • Jordann William Edwards

    Cool :)

    I think maybe if it becomes a success, perhaps they can expand into transmedia? They can still use the same pipeline too!

  • Andres Molina

Currently, Pixar is developing a new version of its rendering software, called RenderMan XPU, a new type of hybrid image processing that renders images using both CPU and GPU concurrently, rather than just CPU or GPU alone. This kind of hybrid rendering can not only drastically speed up rendering, but can also drastically reduce rendering costs. V-Ray is also releasing, or has released, a version of its rendering software that supports hybrid rendering. Either way, hybrid rendering will soon be the new standard in image processing, which will in turn provide vastly superior quality CGI at the same speed/cost as CPU or GPU alone, or equal quality images at lower costs and render times. Hopefully in the future, rendering technology and computers will be fast and advanced enough to provide Pixar-quality visuals for television, but for now, real-time rendering is a good start.

  • Bramagola

“Digital Dimension still follows a more traditional workflow for the front half of production. Characters are created and animated in Maya, for instance, but Unreal is used for lighting, effects, and rendering. Compositing is generally not required.”

    So, a non-story. Cool.

    Can we see the one about the show using Marmoset to render its frames?

Yeah, ok.

    • rozan

      That made no sense. A huge chunk of work was done in Unreal. If you think it’s a non-story then you have no idea what you’re talking about.

      • Bramagola

The huge chunk that was done were the effects and lighting. If you think that that is somehow a huge chunk of a pipeline, then you don’t know what you’re talking about.

        • rozan

          Composition, lighting, effects, cameras, rendering. Were you expecting them to model stuff in UE? :)

    • Scott

A big part of animation and VFX work is the massive render times. We have huge warehouses with render farms for it and it STILL takes a huge chunk of the scheduled time for the work. And costs MONEY!!

      Real time rendering is a big step forward.

  • Marc Hendry

    nice! I guess render farms will have to up their game fast.
    Or maybe realistic rendering will go out of style once it becomes easy/accessible.
    Also I feel like I’ve seen that elephant character design before, but without the zebra stripes

  • Alonso Soriano

    This really seems like the future, and suggests interesting cross media uses.
    What was the advantage of using Unreal over Unity? I’d love to know more about production, how much do you have to pretend to be making a game, how tricky is it to dump in canned animations and spit out video?

    here’s a similar article on a unity made cartoon
    https://blogs.unity3d.com/2017/02/23/mr-carton-the-worlds-first-cartoon-series-madewithunity/

    and this is a gorgeous looking short in unity
    https://madewith.unity.com/stories/road-to-gdc-sonder

  • Wow! Pretty awesome. It is great to see more content creators embrace the game engine as a tool for production. I believe this is the future of animation and movie production in general.

  • Nervatel

The series Mr Carton was also made entirely using a game engine. It was a French cartoon that aired earlier this year – doesn’t that mean that Zafari isn’t first?