
In the ancient days, people had to animate by moving heavy pencils around cumbersome sheets of paper. Then, digital-puppeted animation made the process simpler by having animators repeatedly click a mouse. Now, a new Adobe program allows you to control and animate two-dimensional characters using just your microphone, your webcam, and your face. No more heavy-duty pencil-lifting or repetitive mouse-clicking required to create the illusion of life.

Adobe Character Animator, the latest in the company’s Creative Cloud suite of applications, will soon be available in a preview version. The software (which was developed under the codename Project Animal) allows users to track simple head movements and facial expressions—including blinking, glancing, and eyebrow raising—and reproduce them instantaneously on any suitably layered character created in Photoshop or Illustrator. (Adobe claims that if the layers are named to indicate which body part they correspond to, then no additional rigging is required and users can immediately control the puppet with their face.)
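Adobe hasn’t published how the name-based rigging works internally, but the idea can be sketched in a few lines: scan the artwork’s layer names and bind any recognized body-part name to the corresponding tracked feature. The layer names and feature labels below are hypothetical illustrations, not Adobe’s actual conventions.

```python
# Illustrative sketch (not Adobe's implementation) of auto-rigging by layer name:
# layers whose names match a known body part are bound to a tracked feature,
# so no manual rigging step is needed.

TRACKED_FEATURES = {
    "head": "head rotation/position",
    "left eyebrow": "left eyebrow raise",
    "right eyebrow": "right eyebrow raise",
    "left eye": "left blink/gaze",
    "right eye": "right blink/gaze",
    "mouth": "lip sync",
}

def auto_rig(layer_names):
    """Bind each recognizably named layer to a tracked facial feature."""
    rig = {}
    for name in layer_names:
        key = name.strip().lower()
        if key in TRACKED_FEATURES:
            rig[name] = TRACKED_FEATURES[key]
    return rig

layers = ["Head", "Left Eyebrow", "Mouth", "Background"]
rig = auto_rig(layers)
# "Background" matches no body part, so it is simply left unrigged.
```

A layer like “Background” falls through untouched, which is why descriptive naming matters: only layers the tool can identify get driven by the webcam.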

The program can also analyze a vocal performance to generate automatic lip sync, a feature also found in competing software, such as certain Toon Boom programs.
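Automatic lip sync generally works by mapping the phonemes detected in a vocal track to a small set of mouth shapes (“visemes”) and swapping the matching mouth artwork on each frame. The toy mapping below is a hypothetical illustration of that general technique, not Adobe’s or Toon Boom’s algorithm.

```python
# Toy lip-sync sketch: phonemes -> visemes (mouth-shape layers to display).
# The phoneme labels and viseme names here are invented for illustration.

PHONEME_TO_VISEME = {
    "AA": "open", "AE": "open", "AH": "open",
    "IY": "smile", "EY": "smile",
    "UW": "round", "OW": "round",
    "M": "closed", "B": "closed", "P": "closed",
    "F": "teeth-on-lip", "V": "teeth-on-lip",
}

def lip_sync(phonemes):
    """Return the mouth shape to show for each phoneme in sequence."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

print(lip_sync(["HH", "AH", "L", "OW"]))
# ['neutral', 'open', 'neutral', 'round']
```

Unrecognized phonemes fall back to a neutral mouth, which keeps the animation from breaking on sounds the mapping doesn’t cover.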

Adobe is hoping the program will aid both novice artists who want to animate their two-dimensional artwork, and pros who want to rig complex characters without creating a confusing tangle of After Effects expressions.

See the program in action in the video below.

“Character Animator makes it incredibly easy to bring life-like behavior to figures and insert them into scenes including other actions like wind or snow,” Adobe product manager Todd Kopriva said on the Adobe blog. “I have never had more fun with a piece of software.”

The program also allows users to incorporate a number of other automatic behaviors, including swaying or dangling body parts (use your imagination) and breathing, while other body parts can be controlled with your keyboard or mouse. Character performances can be recorded and edited.
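A “breathing” behavior like the one described is typically just a slow, automatic oscillation applied to a body layer, with no keyframes from the animator. The sketch below is a minimal, hypothetical version of that idea; the rate and amplitude parameters are invented for illustration.

```python
import math

# Hypothetical sketch of an automatic "breathe" behavior: the chest layer's
# vertical scale follows a slow sine wave over time.

def breathe_scale(t, rate_hz=0.25, amplitude=0.03):
    """Vertical scale factor for the chest layer at time t (in seconds)."""
    return 1.0 + amplitude * math.sin(2 * math.pi * rate_hz * t)

# At t=0 the chest is at rest (scale 1.0); over each four-second cycle the
# scale drifts between 0.97 and 1.03, suggesting a gentle inhale and exhale.
```

Dangling or swaying parts work on the same principle, usually with a spring or pendulum simulation instead of a plain sine wave.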

The program preview will arrive in the next After Effects CC update. Adobe is encouraging users to provide feedback to help shape the program’s further development.

(Thanks, Rob Kohr)