Snow and fur: they’re two of the toughest things to pull off in computer graphics, and Sony Pictures Imageworks had to do both in a big way for Karey Kirkpatrick’s Smallfoot, from Warner Animation Group.

The cg-animated film, which turns the tables on the usual bigfoot tale, tells the story of a yeti named Migo (Channing Tatum) who stumbles across the ‘legendary’ humans, or smallfoots.

With many yetis to create in a mountainous snow setting, Imageworks had its work cut out for it. But the studio developed new tools, and took advantage of existing ones, to pull off the snow and yeti fur. Imageworks effects supervisor Theo Vandernoont and character effects supervisor Henrik Karlsson tell Cartoon Brew how they did it, and share several image breakdowns.

Imageworks has had to conjure up snow for previous projects, but nothing quite like the complexity required for Smallfoot. And, as Vandernoont describes, the filmmakers wanted the yeti world to feel cold without necessarily relying on visible breath. “To create this ‘cold,’ falling snow was used extensively throughout the film. For adding depth to environments, we used wind-driven snow simulations in each environment to create moving blankets of atmospherics which were all cached, allowing lighters to select what they wanted to use from sequence to sequence, shot to shot.”
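
That caching workflow lends itself to a simple illustration. Below is a minimal sketch, not Imageworks’ pipeline, of simulating a wind-driven snow layer once per environment, writing every frame to disk, and letting a shot simply point at whichever cached layer lighting wants. The file layout, particle counts, and the toy advection are all assumptions.

```python
# A minimal sketch (not Imageworks' pipeline): wind-driven snow particles are
# simulated once per environment, every frame is written to a cache on disk,
# and lighting later picks whichever cached layer suits a given shot.
import numpy as np
from pathlib import Path

def cache_wind_snow(env_name, n_particles=5000, n_frames=48,
                    wind=(0.4, -0.1, 0.0), out_dir="snow_caches"):
    """Advect snow particles with a constant wind and write one file per frame."""
    rng = np.random.default_rng(7)
    pos = rng.uniform(-10.0, 10.0, size=(n_particles, 3))   # start positions
    vel = np.array(wind) + rng.normal(0.0, 0.05, size=(n_particles, 3))
    env_dir = Path(out_dir) / env_name
    env_dir.mkdir(parents=True, exist_ok=True)
    for frame in range(n_frames):
        pos = pos + vel * (1.0 / 24.0)                       # simple Euler step
        np.save(env_dir / f"snow.{frame:04d}.npy", pos)      # cached frame
    return env_dir

# Lighting-side selection: a shot just points at whichever cached layer it wants.
shot_to_cache = {
    "seq010_shot0040": cache_wind_snow("mountain_pass"),
    "seq020_shot0110": cache_wind_snow("village"),
}
print(shot_to_cache)
```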

The main challenge, however, was being able to simulate the breaking of snow realistically, all the way from large-scale chunks of snowy ice, down to the powder which spills when snow crumbles. “We wanted to do deep snow interaction,” said Vandernoont, “not just ‘footprint’ style surface deformations.”

The answer to these challenges was Katyusha, a new suite of tools that Imageworks built inside Side Effects Software’s Houdini. It was used to dynamically break a surface up into destruction-type chunks. Each chunk is filled with ‘grains’ (Houdini’s solution for simulating granular material), and the grains are released when the chunk receives a sizeable impact.
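
To make the chunk-to-grains idea concrete, here is a minimal, library-agnostic sketch in Python: a chunk stays rigid until an impact impulse crosses a threshold, at which point its pre-filled grains are released and integrated as free particles. The class name, the impulse test, and the toy integration are illustrative assumptions, not Katyusha itself.

```python
# A minimal sketch of the chunk-to-grains idea described above; it is not
# Imageworks' Katyusha code. Chunk shapes, the impact test, and the toy grain
# integration are all assumptions for illustration.
import numpy as np

class SnowChunk:
    """A rigid chunk of snow pre-filled with grain positions."""
    def __init__(self, center, n_grains=200, radius=0.5, seed=0):
        rng = np.random.default_rng(seed)
        # Fill the chunk with grains scattered inside a sphere around its center.
        offsets = rng.normal(0.0, radius / 3.0, size=(n_grains, 3))
        self.grains = np.asarray(center, float) + offsets
        self.velocities = np.zeros_like(self.grains)
        self.released = False          # rigid until a big enough impact arrives

    def apply_impact(self, impulse, threshold=2.0):
        """Release the grains when the impact impulse exceeds the threshold."""
        if not self.released and np.linalg.norm(impulse) > threshold:
            self.released = True
            self.velocities += impulse  # grains inherit the impact velocity

    def step(self, dt=1.0 / 24.0, gravity=(0.0, -9.8, 0.0)):
        """Grains only move once the chunk has been broken open."""
        if self.released:
            self.velocities += np.asarray(gravity) * dt
            self.grains += self.velocities * dt
            # Crude ground plane at y=0 so the powder settles instead of falling forever.
            below = self.grains[:, 1] < 0.0
            self.grains[below, 1] = 0.0
            self.velocities[below] *= 0.0

# Example: a chunk stays rigid under a light touch, then breaks under a heavy hit.
chunk = SnowChunk(center=(0.0, 3.0, 0.0))
chunk.apply_impact(np.array([0.1, 0.0, 0.0]))   # too small, still rigid
chunk.apply_impact(np.array([0.0, -4.0, 0.0]))  # big impact, grains released
for _ in range(48):
    chunk.step()
print(chunk.released, chunk.grains[:, 1].min())
```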

Image breakdown: storyboard frame; layout pass; animation pass; hair simulation pass; final shot with snow, hair, and final lighting.

“These millions of grains are internally instanced with snowy mini-shapes,” explained Vandernoont. “Then the system dices up the whole simulation, and via a distributed queue system, converts a ‘tile’ or regions of the mini-instances into volumes, which are sewn together in The Foundry’s Katana and rendered with a volumetric shader in Autodesk’s Arnold.”
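
The dicing step can be sketched roughly as follows: grain positions are binned into spatial tiles, and each tile is rasterized into its own small density grid, the kind of independent job a render-farm queue could pick up. The grid resolution, tile size, and names here are assumptions; the actual pipeline stitched the resulting volumes together in Katana and rendered them with a volumetric shader in Arnold.

```python
# A minimal sketch of the tiling idea in the quote: grain points are binned
# into spatial tiles, and each tile is rasterized into its own density volume
# so the tiles can be farmed out independently (here just a loop, not a real
# distributed queue). Grid sizes and naming are assumptions.
import numpy as np

def points_to_tile_volumes(points, tile_size=4.0, voxel=0.25):
    """Return {tile_index: density_grid} so each tile can be converted on its own."""
    tile_ids = np.floor(points / tile_size).astype(int)
    volumes = {}
    for tile in set(map(tuple, tile_ids)):
        mask = np.all(tile_ids == tile, axis=1)
        local = points[mask] - np.array(tile) * tile_size   # tile-local positions
        res = int(round(tile_size / voxel))
        grid = np.zeros((res, res, res))
        idx = np.clip((local / voxel).astype(int), 0, res - 1)
        # Splat each grain into its voxel; this per-tile job is what a farm
        # queue would pick up in a distributed setup.
        np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
        volumes[tile] = grid
    return volumes

rng = np.random.default_rng(3)
grains = rng.uniform(0.0, 12.0, size=(100_000, 3))
tiles = points_to_tile_volumes(grains)
print(len(tiles), "tile volumes built")
```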

Artists were also able to customize when chunks shatter, prune chunks that weren’t needed, and control how the ice and snow fragment apart.

Katyusha handled snow breaking apart, but another aspect of the film’s snow scenes still had to be solved: snow that needed to appear naturally ‘piled up.’ Imageworks designed a system that essentially dropped snow onto objects without using any simulation, which also meant artists spent less time hand-modeling piled-up snow.

Image breakdown of a plane landing scene: animation pass; first snow simulation step; further snow simulation; final shot.

Vandernoont outlines how it works: “The padding system runs a type of geometric ambient occlusion which determines what objects would get in the snow’s ‘way’ of piling up. Then the system builds a volumetric map of where the snow would have landed, and fills it in. It generates a lot of poly meshes, which are instanced in Katana. It took some months to R&D this, but in the end, it was capable of padding an area the size of a city block, with detail down to 10cm objects, in about five hours on the queue.”
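
Here is a rough sketch of that kind of padding pass, under heavy simplifications: the scene is reduced to a heightfield, a handful of slanted sample rays stand in for the geometric ambient-occlusion test, and snow depth is scaled by how open each cell is to the sky. It is an illustration of the idea only; Imageworks’ system built a volumetric map and output poly meshes instanced in Katana.

```python
# A minimal sketch of a "snow padding" pass under stated assumptions: the scene
# is a heightfield, a few 45-degree sample rays approximate the geometric
# occlusion test, and snow depth scales with how open each cell is to the sky.
# This is an illustration, not Imageworks' tool.
import numpy as np

def snow_padding(heights, cell=0.1, max_depth=0.3, n_rays=8, steps=40):
    """Return per-cell snow depth, thinner where surrounding geometry occludes the sky."""
    ny, nx = heights.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False)
    depth = np.zeros_like(heights)
    for j in range(ny):
        for i in range(nx):
            open_rays = 0
            for a in angles:
                blocked = False
                for s in range(1, steps):
                    # Walk outward and upward at 45 degrees from this cell.
                    x = int(round(i + np.cos(a) * s))
                    y = int(round(j + np.sin(a) * s))
                    if not (0 <= x < nx and 0 <= y < ny):
                        break
                    if heights[y, x] > heights[j, i] + s * cell:
                        blocked = True
                        break
                open_rays += 0 if blocked else 1
            depth[j, i] = max_depth * open_rays / n_rays
    return depth

# Example: a flat yard with one tall block casting a "snow shadow" around it.
ground = np.zeros((60, 60))
ground[25:35, 25:35] = 2.0                      # a 2m-tall obstacle
print(snow_padding(ground)[30, 20:40].round(2))  # depth along a row through the shadow
```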

For the yeti fur – a thick, white, full-body covering of hair – Imageworks moved beyond its existing use of Maya’s nHair solver to develop something that could accommodate so many creatures. The studio integrated its dynamic hair system into an existing cloth system to do this.

“Our cloth system had the ability to re-build on the fly, so we integrated our dynamic hair system into it,” said Karlsson. “This meant that we could build each character in pieces and test them one at a time as well as easily update the underlying meshes or curves without having to waste time rebuilding dynamic setups. It also allowed us to build our generic yetis in a way that we could mix and match their different pieces of grooms. Animation would define what head hair would be used and what chest hair as well as the overall look of the character. This would then be passed into our system which would build the dynamic setup for the character on the fly allowing a vast variation of characters.”
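
Structurally, that mix-and-match approach might look something like the sketch below: a character’s setup is assembled from independent groom ‘pieces,’ so one piece can be swapped or rebuilt without touching the rest. The class and slot names are invented for illustration; the real system lived inside Imageworks’ cloth solver.

```python
# A minimal sketch of the mix-and-match idea in the quote, under assumed names:
# a character's dynamic setup is assembled from independent groom "pieces"
# (head hair, chest hair, etc.), so a single piece can be swapped or rebuilt
# without touching the others. This is a structural illustration only.
from dataclasses import dataclass, field

@dataclass
class GroomPiece:
    name: str            # e.g. "head_A", "chest_B"
    guide_curves: int    # number of simulated guide curves in this piece

@dataclass
class YetiSetup:
    pieces: dict = field(default_factory=dict)

    def set_piece(self, slot, piece):
        """Add or replace one slot (head, chest, ...) without rebuilding the rest."""
        self.pieces[slot] = piece

    def build(self):
        """Pretend-build: in production this would create the dynamic hair setup."""
        total = sum(p.guide_curves for p in self.pieces.values())
        return f"built {len(self.pieces)} pieces, {total} guide curves"

# Animation picks which head and chest groom a given background yeti uses.
yeti = YetiSetup()
yeti.set_piece("head", GroomPiece("head_A", 600))
yeti.set_piece("chest", GroomPiece("chest_B", 1500))
print(yeti.build())
yeti.set_piece("head", GroomPiece("head_C", 450))   # swap one piece, rebuild cheaply
print(yeti.build())
```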

The different kinds of fur shaders required for the characters.

The yetis had millions upon millions of hairs, and that carried a heavy rendering cost. Imageworks found that once hair density passed a certain point, the ‘hair build time’ alone could reach 30 minutes or more per frame, and sometimes several hours. That kind of render time was too long, so a solution was needed here as well.

“What we ended up doing instead,” detailed Karlsson, “was take the idea of our current system of calculating the camera distance from the hair but pre-baking some defined lower densities of hair which we would switch to based on the camera distance. This basically means that while we were still retaining the current system with the hair reducing in numbers as we go further away we would start at a much lower value in density.”
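
The idea reduces to a small lookup: pick a pre-baked density tier from the camera distance, then keep applying a distance falloff within that tier. The sketch below illustrates it with made-up tier distances and hair counts, not production values.

```python
# A minimal sketch of the pre-baked density idea described in the quote:
# a handful of hair densities are baked ahead of time, the camera distance
# picks which bake to use, and a distance falloff still trims counts within
# that bake. Tier distances and counts are made-up numbers.
def pick_hair_density(camera_distance, baked_tiers=None):
    """Choose a pre-baked hair count for this distance, then trim further within the tier."""
    if baked_tiers is None:
        # (max distance in metres, baked hair count): illustrative numbers only.
        baked_tiers = [(5.0, 2_000_000), (20.0, 500_000), (60.0, 100_000), (1e9, 20_000)]
    for max_dist, count in baked_tiers:
        if camera_distance <= max_dist:
            # Keep reducing the count with distance inside the tier, but never below half.
            falloff = max(0.5, 1.0 - camera_distance / (2.0 * max_dist))
            return int(count * falloff)

for d in (2.0, 15.0, 45.0, 120.0):
    print(f"{d:6.1f}m -> {pick_hair_density(d):,} hairs")
```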

Things got even more complicated when there needed to be ‘hair on hair’ interaction. But Imageworks planned ahead for this. “Each clump of hair had at least one curve driving it,” said Karlsson. “This gives us enough fidelity to handle pretty much anything needed in terms of collisions. For example, Migo has over 1,500 curves being simulated on just his chest alone. The character Meechee has almost 4,000 in her shawl.”
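
A toy version of guide-curve-driven collisions might look like the sketch below: each clump is represented by one guide curve, and points on different curves are pushed apart whenever they come closer than a clump width. The relaxation loop and radii are illustrative assumptions, not the production solver.

```python
# A minimal sketch of using one guide curve per clump for hair-on-hair
# collisions, as described above; the push-apart below is a toy point-vs-point
# relaxation, not the production solver, and the radii are assumptions.
import numpy as np

def relax_guide_curves(curves, clump_radius=0.05, iterations=4):
    """Push points of different guide curves apart until clumps stop interpenetrating."""
    curves = [np.array(c, float) for c in curves]
    for _ in range(iterations):
        for a in range(len(curves)):
            for b in range(a + 1, len(curves)):
                for i, p in enumerate(curves[a]):
                    d = curves[b] - p                      # offsets to every point on curve b
                    dist = np.linalg.norm(d, axis=1)
                    if (dist < 2.0 * clump_radius).any():
                        j = int(np.argmin(dist))
                        push = (2.0 * clump_radius - dist[j]) * 0.5
                        n = d[j] / (dist[j] + 1e-9)
                        curves[a][i] -= n * push           # move both clumps half the overlap
                        curves[b][j] += n * push
    return curves

# Two guide curves that start too close together get separated by a clump width.
c1 = [[0.00, y, 0.0] for y in np.linspace(0, 1, 5)]
c2 = [[0.02, y, 0.0] for y in np.linspace(0, 1, 5)]
out = relax_guide_curves([c1, c2])
print(np.linalg.norm(out[0] - out[1], axis=1).round(3))
```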

Image breakdown: character fx simulation pass; final shot.

Perhaps what’s most interesting about the snow and fur work in Smallfoot is that these elements are so integral to the yeti story that they go by without drawing any major attention, a testament to years of R&D and continued innovation at Imageworks across its many varied visual effects projects and its internal and external animation work.