
Pixar Used AI In Elemental To Bring Fire To Life



For all the concern about AI replacing jobs in the animation industry, that worry doesn't appear to be shared by those in charge. For Pixar's latest movie, the studio decided to use an AI tool to get the kind of movement and design it was after, specifically for the characters.

Elemental director Peter Sohn wanted his fire and water characters to be, quite literally, fire and water. He did not want them to look like Calcifer, the fiery blob from Studio Ghibli's Howl's Moving Castle, or like Nicolas Cage's flaming Ghost Rider. The tools Pixar had built up over previous years of animated work were not giving the team the results it was looking for.

Calcifer & Ghost Rider

Longtime VFX supervisor Sanjay Bakshi explains, "Our fire fluid simulations are very naturalistic and they're designed to mimic reality. With a character like Ember, it's really important to concentrate on the performance of the face." Unfortunately, the studio was having trouble balancing the dynamism of the fire with the shape and human-like characteristics of Ember. Paul Kanyuk, a crowds technical supervisor at Pixar, says that "at first crack, Ember looked like a ghost or even a demon. It can look horrifying if it's too realistic, like you actually have a human figure made of real pyro."

Taming down the scary look helped a little, but then Ember would start to look more like plasma instead of the natural-looking waves of fire. To make her the way they wanted, every single frame of Elemental would have needed an effects pass, which would have been incredibly time-consuming and very expensive.

Bakshi elaborates on what they were looking for, saying,“It was really important for Pete that Ember be made of fire, and that when she moved, she moved in a fire-like way and not adhere to a strict, skeletal structure. For example, when she reaches for something, her arm can stretch and get really narrow, like fire can. Ember needed to be able to really change shape and be amorphous. While our animators have a lot of tools at their disposal to make a character like Ember angry—from changing her posture, her eyebrows, and her facial expression—we also wanted to change the characteristics of the fire when she got angry.”

To help figure out how to solve this problem, effects supervisor Stephen Marshall and his team got right to work. “A lot of it was asking ourselves, ‘How do we make Fire and Water sentient characters that have relatable emotions that aren’t ultra-distracting?’” Marshall says. “We developed a lot of that technology very early on, and it required a lot of facility resources to figure it out. This was technology we had never used before, like machine learning. A lot of it was about blowing up what we knew about our effects pipeline and building a new pipeline around it, knowing we were going to be working with departments we had never worked with before.”

It was around 2019 that Kanyuk discovered neural style transfer (NST). This type of artificial intelligence takes one image, plus a second image in the style you want the first to take on, such as the style of Van Gogh or Picasso, and then renders the first image in that style.

Here is an example that shows how it works:

Photo from GoDataDriven
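For the curious, the classic formulation of NST (from the original research this technique descends from) measures "style" as the Gram matrix of a neural network's feature maps, then optimizes the content image until its Gram matrices match the style image's. Here is a minimal sketch of that comparison in NumPy. The random arrays stand in for real network features, and the function names are illustrative, not anything from Pixar's actual pipeline:

```python
import numpy as np

def gram_matrix(features):
    """Style statistics: correlations between feature channels.
    features: (channels, height, width) activation map."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)   # each row is one channel, flattened
    return flat @ flat.T / (h * w)      # (c, c) matrix of channel correlations

def style_loss(feats_one, feats_two):
    """Mean squared difference between the two images' Gram matrices."""
    g1 = gram_matrix(feats_one)
    g2 = gram_matrix(feats_two)
    return float(np.mean((g1 - g2) ** 2))

# Stand-ins for real feature maps from a pretrained network:
rng = np.random.default_rng(0)
feats_a = rng.standard_normal((8, 16, 16))
feats_b = rng.standard_normal((8, 16, 16))

print(style_loss(feats_a, feats_a))  # identical "styles" -> 0.0
print(style_loss(feats_a, feats_b))  # different "styles" -> positive
```

In a real NST system, the optimizer repeatedly nudges the pixels of the content image to drive this style loss down, which is what makes the result take on the look of the style image.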

Kanyuk thought there was a 50% chance this technique could be useful for the film. So Pixar turned to Disney Research Studios (DRS). Founded in 2008, DRS is based in Zurich, Switzerland, with a second location in Los Angeles. It specializes in researching how AI and machine learning can do things like make actors appear older or younger than they really are, or change the quality of someone's skin. It also works on robotics, human-computer interaction, behavioral sciences, computer vision, and more. According to DRS, its inventions are used in almost every Disney feature film production. Pixar previously worked with DRS on Toy Story 4.

To achieve the look of the character, Pixar artist Jonathan Hoffman drew a set of swirly, pointy, almost cartoonish flames that the team dubbed "fleur-de-lis." Once the drawings were fed into neural style transfer, the AI combined them with the original fire simulation, giving the once-blobby fire the movement and intensity of real flame, tempered with a bit of Pixar's control and style.

Though this was a tremendous breakthrough technique for animation, it was a very time-consuming process: it had to be applied across 1,600 shots in Elemental, and it demanded a large amount of computing power. The team eventually found a way to cut processing time from about five minutes per frame down to one second.
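To get a sense of what that speedup means at feature-film scale, here is a quick back-of-the-envelope estimate. The frame count below is an illustrative assumption (a roughly 100-minute film at 24 frames per second), not a figure from Pixar; only the per-frame times come from the article:

```python
# Rough estimate of NST processing cost at feature-film scale.
# Assumed: ~100-minute runtime at 24 fps (illustrative, not Pixar's numbers).
frames = 100 * 60 * 24                 # ~144,000 frames

slow_hours = frames * 5 * 60 / 3600    # at 5 minutes per frame
fast_hours = frames * 1 / 3600         # at 1 second per frame

print(f"{frames} frames")
print(f"before: {slow_hours:.0f} hours (~{slow_hours / 24:.0f} days)")
print(f"after:  {fast_hours:.0f} hours")
```

Under those assumptions, the original speed works out to roughly 12,000 hours of processing, around 500 days, versus about 40 hours after the optimization, which makes it clear why the speedup mattered.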

Here is a video from Disney Research Studios describing the process for Neural Style Transfer:

According to Bakshi, using the NST technique allowed them to “organize the flames into more stylized shapes using a machine learning technique called Volumetric Neural Style Transfer; it’s something we haven’t done before. If the flames were too realistic, it would be distracting and not as illustrative as we would have wanted.” But by utilizing techniques developed by crowds technical supervisor Paul Kanyuk and the Disney Research Studios team, he says, “It organizes the flames into much more appealing shapes. It really unlocked a lot for us—like a magic trick! There’s no other way to do this that I’m aware of.”

Marshall adds that throughout the production, “There was a loop between technology and the art department trying to discover what was working and what wasn’t. It was about getting a bunch of experts—a fire expert, a shading expert, an animation expert, a rigging expert, and a lighting expert—in the same room and iterating until we struck the right balance. It was about putting the different technologies together and training them to work together.”

Bakshi concludes with, “Were it not for the team’s unwavering efforts to balance realism with stylization, I don’t think people could connect with the characters on an emotional level.”

What do you think about Disney using AI in its movies? Do you like the innovation, or do you think it leans too heavily on AI and not enough on the traditional creative side of animation? Let us know what you think!

Source: Wired, The Walt Disney Company


Pirates & Princesses (PNP) is an independent, opinionated fan-powered news blog that covers Disney and Universal Theme Parks, Themed Entertainment and related Pop Culture from a consumer's point of view. Opinions expressed by our contributors do not necessarily reflect the views of PNP, its editors, affiliates, sponsors or advertisers. PNP is an unofficial news source and has no connection to The Walt Disney Company, NBCUniversal or any other company that we may cover.


