"Alien Xmas" "Alien Xmas"

Welcome to Tools Of The Trade, a series in which industry artists and filmmakers speak about their preferred tool on a recent project — be it a digital or physical tool, new or old, deluxe or dirt-cheap.

This week, we speak to Edward and Stephen Chiodo, two parts of the trio known as the Chiodo Brothers. Special-effects veterans with deep experience in puppetry and stop motion, the brothers have contributed to films including Robocop, Pee-wee’s Big Adventure, Elf, and Team America: World Police; their TV credits include five episodes of The Simpsons. Their latest work, the stop-motion holiday special Alien Xmas, was released on Netflix in November.

Edward and Stephen have chosen to discuss DZED Systems’ Dragonframe, the pre-eminent stop-motion software. Over to them:

Edward and Stephen Chiodo

We can’t remember when exactly we started using Dragonframe — probably somewhere around 2008–09. Out of the gate, it was pretty great. The program covered all the basics of digital stop-motion animation. Over time, it has expanded its list of compatible cameras and added enhanced cinematography capabilities, a motion-control interface, DMX control, multiple-frame shooting, automated 3D capture, and sophisticated post-frame editing and re-sequencing. We’re sure there are other enhancements. It’s become a very deep stop-motion studio package.

So deep that it’s easy to overlook all the aspects we now take for granted. For Alien Xmas, the ability to program multiple-frame capture in conjunction with the DMX control for our front-light/back-light stages was huge. The animator presses one button, and the program turns the proper lights on and off and takes the frames automatically, which is a big time-saver and greatly reduces the chance of mistakes or forgotten elements. We also used it for multiple complicated lighting effects that could only be accomplished by shooting multiple passes on each frame.
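For readers unfamiliar with the technique, the logic of a front-light/back-light cycle looks roughly like the sketch below. This is a minimal illustration of the idea only, not Dragonframe’s actual interface; the DmxController and Camera classes, and the channel assignments, are hypothetical stand-ins for whatever lighting and capture hardware is attached.

```python
# Illustrative sketch of a front-light/back-light capture cycle.
# DmxController, Camera, and the channel numbers are hypothetical stand-ins,
# not Dragonframe's API.

class DmxController:
    def set_channel(self, channel: int, level: int) -> None:
        # In a real rig this would send a level (0-255) to a DMX interface.
        print(f"DMX channel {channel} -> {level}")

class Camera:
    def capture(self, name: str) -> None:
        # In a real rig this would trigger the tethered camera.
        print(f"captured {name}")

FRONT_LIGHTS, BACK_LIGHT = 1, 2  # hypothetical DMX channel assignments

def shoot_frame(frame: int, dmx: DmxController, cam: Camera) -> None:
    """One button press: both lighting passes for a single animation frame."""
    # Beauty pass: front lights on, back light off.
    dmx.set_channel(FRONT_LIGHTS, 255)
    dmx.set_channel(BACK_LIGHT, 0)
    cam.capture(f"frame_{frame:04d}_beauty")

    # Matte pass: back light only, used to pull a silhouette in post.
    dmx.set_channel(FRONT_LIGHTS, 0)
    dmx.set_channel(BACK_LIGHT, 255)
    cam.capture(f"frame_{frame:04d}_matte")

    # Restore the working light before the animator moves to the next pose.
    dmx.set_channel(FRONT_LIGHTS, 255)
    dmx.set_channel(BACK_LIGHT, 0)

if __name__ == "__main__":
    shoot_frame(1, DmxController(), Camera())
```

The point of automating the sequence is exactly what the brothers describe: the animator never has to remember which pass comes next, so a forgotten matte frame can’t ruin a shot.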

Prior to switching over to digital cameras, we would use video assist cameras on our film cameras. This allowed us to view the live frame on a monitor, and by drawing on the screen we were able to track and chart the desired movement. Crude — but it was extremely helpful in seeing the performance emerge.

We then added a video switcher, which allowed us to capture and store a frame and dissolve between the live frame and that stored frame. It was a huge leap forward to be able to see that progression between the frames.
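The dissolve they describe is essentially what animators now call onion skinning: a weighted blend of the live view and the last stored frame. A rough sketch of that idea, using placeholder arrays rather than a real video feed, might look like this:

```python
# Rough sketch of the "dissolve" idea: blending the live view with a stored frame.
# The dummy arrays stand in for video frames; a real setup would read from a capture device.
import numpy as np

def onion_skin(live: np.ndarray, stored: np.ndarray, mix: float = 0.5) -> np.ndarray:
    """Return a weighted blend of the live frame and the last captured frame."""
    mix = float(np.clip(mix, 0.0, 1.0))
    blend = mix * live.astype(np.float32) + (1.0 - mix) * stored.astype(np.float32)
    return blend.astype(np.uint8)

# Example with dummy 480p frames: a 50/50 dissolve shows how far the puppet has moved.
live_frame = np.zeros((480, 640, 3), dtype=np.uint8)
stored_frame = np.full((480, 640, 3), 255, dtype=np.uint8)
preview = onion_skin(live_frame, stored_frame, mix=0.5)
```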

The next evolution was [Animation Toolworks’] Video Lunchbox, which allowed us to capture and store all the frames as they were taken. Not only could you toggle between the live frame and the last stored frame, you could play back the entire shot while in progress and step through as needed. That ability was huge if you needed to cut back into a shot and make a change without having to restart from the beginning. The only drawback with the Lunchbox was that it couldn’t capture at a resolution that was suitable for high-resolution or feature work.

"Alien Xmas"
Above and below: the “Alien Xmas” shoot (Netflix / © 2020)
"Alien Xmas"

Dragonframe has combined all those features and added the direct capture function. We think it has risen to its premier position largely because it was created by professionals working in the field of stop-motion animation: animators, directors of photography, and software developers with first-hand knowledge of what it takes to create animation. It continues to increase its capability by listening to its users and incorporating their ideas, and by keeping up to date with emerging technologies on the camera and hardware side of things.

The software is usually ahead of the curve on improvements, but we always wanted a higher-resolution preview image. On Alien Xmas, we were fortunate to work directly with Canon and Dragonframe as beta testers on a firmware update that allowed the preview to be at HD resolution; there were also some other improvements to the camera control features. It works great — a definite upgrade. Now if we could only get it to actually animate the shot for us.

Dragonframe seems optimized for Canon DSLRs, but it works with a variety of other cameras as well. We don’t think there is a major manufacturer that doesn’t have some sort of compatibility with the software. It works with both consumer and professional cameras. On Alien Xmas, we used the Canon EOS R body with Nikon prime lenses.

"Alien Xmas"
Above and below: concept art for “Alien Xmas” (interior above by Christina Yang) (Netflix / © 2020)
"Alien Xmas"

In general, digital tools are a time-saver. When we shot on film, we’d do the set-up, animate the shot plus any additional or multiple passes, then send the film off to the lab and wait to see the results the next day (if we were lucky enough to get it to the lab on time). With digital, we get to review a shot while it’s in progress. We get instant gratification, or despair if the shot isn’t working, but either way we know immediately whether to reshoot or move on.

In the past, we would be lucky to get one or two shots a day if they were from the same camera position. Now we can comfortably move to a new set-up, confident we have the shot. Also, you’re originating the material digitally, which brings you into an incredibly deep world of processing the footage in post, whether it be rig removal, compositing, or editing. Everything now lives in that digital world, without any loss of quality.

Alex Dudok de Wit

Alex Dudok de Wit is Deputy Editor of Cartoon Brew.
