How To Start Adobe Character Animator Preview 4

When a live Simpsons segment was announced several weeks ago, many speculated about how it would be achieved. Would it be via motion capture? Perhaps a markerless facial animation set-up?

Ultimately, the three-minute segment, in which Homer (voiced by Dan Castellaneta) answered live questions submitted by fans, was realized with the help of the still-in-development Adobe Character Animator handling lip sync, with keyboard-triggered animations adding to the mix. Cartoon Brew got all the tech details from The Simpsons producer and director David Silverman and Adobe's senior principal scientist for Character Animator and co-creator of After Effects, David Simons.

But first, here's the live segment:

The origins of a live Simpsons

The idea for a live-animated segment had been around for several years, according to Silverman, who noted that the intent was to take advantage of Castellaneta's ad-libbing skills. "We all know that Dan is a great improv guy. He came from Second City in Chicago, where comedians like Bill Murray and John Belushi had also performed." However, it was not so clear what technology could be used to produce a live broadcast. That is, until The Simpsons team observed how the Fox Sports on-air graphics division was implementing the live manipulation of its robot mascot, Cleatus. That led to an investigation of Adobe Character Animator.

Still a relatively new feature in After Effects CC, Character Animator is designed to animate layered 2D characters made in Photoshop CC or Illustrator CC by transferring real human actions into animated form. This can be via keystrokes, but the real drawcard of the tool is the translation via webcam of user facial expressions to a 2D character, and user dialogue driving lip sync.

David Simons.

Facial animation was not used in the live Simpsons segment, but lip sync driven directly from Castellaneta's performance was. The lip sync feature works by analyzing the audio input and converting it into a series of phonemes. "If you take the word 'map'," explained Adobe's David Simons, "each letter in the word would be an individual phoneme. The last step would be displaying what we're calling 'visemes'. In the 'map' example, the 'm' and 'p' phonemes can both be represented by the same viseme. We support a total of 11 visemes, but we recognize many more (60+) phonemes. In short, if you create mouth shapes in Photoshop or Illustrator and tag them accordingly in Character Animator, you can animate your mouth by just talking into the microphone."
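The many-to-one mapping Simons describes can be sketched as a simple lookup table. This is purely illustrative: the ARPAbet-style phoneme symbols and viseme names below are invented for the example, not Character Animator's actual internal tables.

```python
# Hypothetical sketch of a many-to-one phoneme -> viseme mapping.
# Symbols and viseme names are illustrative only.
PHONEME_TO_VISEME = {
    "M": "M-B-P", "B": "M-B-P", "P": "M-B-P",   # lips pressed together
    "AE": "Ah",   "AA": "Ah",                   # open mouth
    "F": "F-V",   "V": "F-V",                   # lower lip to upper teeth
}

def visemes_for(phonemes):
    """Collapse a phoneme sequence into the mouth shapes to display."""
    return [PHONEME_TO_VISEME.get(p, "Neutral") for p in phonemes]

# "map" -> M, AE, P: the first and last phonemes share one viseme,
# which is why 60+ phonemes can be covered by only 11 mouth drawings.
print(visemes_for(["M", "AE", "P"]))  # ['M-B-P', 'Ah', 'M-B-P']
```

In this scheme the artist only has to draw and tag one mouth shape per viseme; the audio analysis does the rest.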

Interestingly, when The Simpsons team was looking to adopt Character Animator for the live segment, the tool was at the time, and still is, in preview release form (currently Preview 4). But The Simpsons team was able to work with Fox Sports to produce a prototype Homer puppet in the software that convinced everybody a live Simpsons segment would be possible. "To ensure that the Simpsons team was using a very stable product," said Simons, "we created a new branch of Preview 4 called 'Springfield' with the version number starting at x847 because that's the price Maggie rings up in the show's intro. We knew that good lip sync would be a priority, so a lot of work went into adjusting our lip sync algorithm so the end result would be broadcast quality."

Making animation

During the live segment – recorded twice, for west and east coast viewers of the show – Castellaneta was situated in an isolated sound booth at the Fox Sports facility, listening and responding to callers, while Silverman operated the extra animation with a custom X-keys keypad device that included printed Homer thumbnail icons. Adobe also implemented a way to send the Character Animator output directly as a video signal via SDI to enable the live broadcast.
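Conceptually, the keypad set-up amounts to binding physical keys to pre-built animation cues. The sketch below assumes a simple key-to-cue dictionary; the key names and cue names are invented, not the actual X-keys or Character Animator bindings.

```python
# Hypothetical sketch of keyboard-triggered animation cues, in the
# spirit of the X-keys keypad described above. Key and cue names
# are invented for illustration.
ANIMATION_TRIGGERS = {
    "F1": "homer_raise_arms",
    "F2": "homer_turn_left",
    "F3": "doh",
    "F4": "woohoo",
}

def on_key_press(key, timeline):
    """Queue the pre-animated clip bound to the pressed key, if any."""
    cue = ANIMATION_TRIGGERS.get(key)
    if cue is not None:
        timeline.append(cue)
    return cue

timeline = []
on_key_press("F1", timeline)
on_key_press("F9", timeline)  # unmapped key: nothing is queued
print(timeline)  # ['homer_raise_arms']
```

Because each button fires a finished clip rather than raw motion data, a single operator who knows the character well – in this case Silverman – can perform the whole body of the puppet live.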

So, why was Silverman tasked with pressing the buttons? "They wanted me to work the animation because of my familiarity," acknowledged the director, who has worked on the show nearly from day one. "I'm the guy who invented a lot of the rules for Homer [and] they always look to me as a Homer expert. So they thought it would be a good idea to have somebody who knew how the character sounded and worked."

Of course, before the broadcast, the animatable pieces had to be assembled. This was done in Photoshop by The Simpsons animation team, then translated to Character Animator. "One of our animation directors, Eric Koenig, set up the animation stems that would be used," said Silverman. "We had Homer speaking, all the dialogue mouths, the layout of the room, and the animation of Homer raising his arms, turning side to side, eye blinks, et cetera. Eric Kurland then set up the programming for it by working with Adobe on all the buttons and rigging of the character."

David Silverman posted this image of the keypad he used to control pre-animated elements for The Simpsons live segment.

A range of animations was developed but not necessarily used in the final live performance. "We had a D'oh! and a Woohoo!," noted Silverman, "but because Dan was ad-libbing it seemed to me it was going to be unlikely he would be doing those catchphrases. And we had one button that had one very specific piece of animation when Homer said, 'Just kidding', because that was a pre-written part of the script where he said, 'Just kidding, the Simpsons will never die.'"

"Then there were the special animations where you press a button and, say, Lisa walks in," added Silverman. "Originally I was pressing the buttons to cue all these people, but in the end we had them come in at very specific points. Also, originally the cut from a wide shot to close-up was being done by a different director, but then one of our producers suggested doing that automatically as you press the button. This made it easier to focus all my attention on Dan's performance as Homer."

David Silverman.

Silverman rehearsed with the keypad set-up and with Dan, who was on a half-second delay, a couple of times before the broadcast. "People have been asking me about the buttons that are covered up in the photo of the keypad I published. I just covered up the buttons I wasn't going to use. There were a lot of buttons for the characters walking in, but they were unnecessary because we had that on automatic. There were some other buttons on there that I just didn't think I would use."

Asked whether he was nervous during the live broadcast, Silverman said he "didn't have any worries about it. Everybody else was more worried than I was! It might be because I am a part-time musician and have no worries being on stage. I have a good sense of timing from my musical performances, especially playing bass lines on the tuba, which means keeping a steady beat. Dan and I have also known each other for decades now and I had a good sense of how he would approach it."

The future of live animation

Clearly there are a host of tools available for live animation right now, from gaming engines to set-ups that enable real-time markerless facial animation such as that used in Character Animator.

Adobe's Simons said more is being done in this area for the software. "Originally in Character Animator we only had the ability to control the head's position, rotation, and scale using the camera. We then added the ability to look left, right, up, and down, assuming you have the artwork drawn to match. There's a lot of room for innovation here. We could do clever things with parallax, and who knows what user requests will come in. We do get a lot of questions on full body capture, depth cameras, and other input devices."

A screenshot of the Adobe Character Animator interface.

Adobe is continuing to develop other features of Character Animator, too. The current Preview 4 includes improved rigging capabilities, where previously that aspect had to be set up in Illustrator or Photoshop. "We've added a feature called 'tags' which allows you to select a layer in Character Animator and tag it with an existing tag name," said Simons. "We also have a new behavior called Motion Trigger. This behavior will trigger an animation based on the character's movement. There are still some fundamental pieces we have to deliver, such as an end-to-end workflow and further integration, for example with After Effects and Adobe Media Encoder. We'd also like to increase the interoperability of the products for people who want to do recorded animation."
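A Motion Trigger-style behavior can be thought of as a small rule that watches a character's movement and picks an animation to fire. The sketch below is an assumption about how such a rule might look, not Character Animator's actual implementation; the threshold and animation names are invented.

```python
# Hypothetical sketch of a "Motion Trigger"-style behavior: choose an
# animation based on which way the character is moving. Threshold and
# animation names are invented for illustration.
def motion_trigger(prev_x, cur_x, threshold=2.0):
    """Return the animation to trigger from horizontal movement."""
    dx = cur_x - prev_x
    if dx > threshold:
        return "walk_right"   # moving right fast enough: play walk cycle
    if dx < -threshold:
        return "walk_left"
    return "idle"             # small or no movement: stay idle

print(motion_trigger(100.0, 110.0))  # 'walk_right'
print(motion_trigger(100.0, 100.5))  # 'idle'
```

The appeal of this kind of behavior is that secondary animation (walk cycles, settles) follows automatically from the puppet's motion instead of needing its own button.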

For his part in The Simpsons live segment, Silverman was pleased with the results. "There were a couple of head-turns that perhaps included a grin that Homer wouldn't normally do, but overall I think it was excellent," he said. "Perhaps if I had had more practice I would have been a little more, animated, shall we say."

Source: https://www.cartoonbrew.com/tech/simpsons-used-adobe-character-animator-create-live-episode-139775.html
