SIGGRAPH 2023: The Way of Weta

SIGGRAPH 2023, a conference hosted by a global nonprofit organization serving the evolution of computer graphics and interactive techniques, is currently taking place in Los Angeles, and it comes as no surprise that this year's conference would have a special panel focused on the visual effects of Avatar: The Way of Water.

After all, the production marked the largest visual effects project taken on by Weta FX: the studio was responsible for 3,240 shots (98% of the total shots in the film), 2,225 of which featured water, a notoriously difficult substance to create artificially.

To get to a place where Weta could create “emotionally engaging digital characters that blended naturally with their live action counterparts in photorealistic CG environments,” they needed new water and facial animation techniques, as well as more advanced virtual production workflows.

During the panel today at SIGGRAPH 2023, we were treated to a look at these advancements, hosted by Weta FX Senior VFX Supervisors Joe Letteri and Eric Saindon, Pre-Production Supervisor Marco Revelant, and FX Supervisor Nick Illingworth. They shared over five years’ worth of research and development that went into the production, including a real-time, in-camera depth compositing system, underwater performance capture, and a new performance-driven cable-cam eyeline system developed with Lightstorm Entertainment.

While commonly referred to as motion capture (or mocap), that was not a term you heard during the panel; “performance capture” was touted instead. After all, their systems were not just capturing the motion of the actors, but their full performance, complete with emotion and gestures. But how did that work out with most of the movie taking place in and around water? The actors would find themselves in pools with thousands of beads bobbing on the surface. The beads eliminated reflections of light that would throw off the cameras and sensors while still delineating the water’s surface, with different techniques capturing what was going on above and below it.
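To make that above/below split concrete, here’s a minimal sketch of how two capture streams might be merged at a known water plane. This is purely illustrative, not Weta FX’s actual pipeline; the flat surface, names, and structure are all assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    name: str
    x: float
    y: float
    z: float  # height in meters; the bead-covered surface sits at WATER_LEVEL

WATER_LEVEL = 0.0  # the beads delineate this plane for the cameras

def merge_capture_streams(dry_markers, wet_markers):
    """Take each marker from the system best suited to its side of the
    surface: the above-water cameras for markers above the plane, the
    underwater cameras for markers below it."""
    merged = {}
    for m in dry_markers:
        if m.z >= WATER_LEVEL:
            merged[m.name] = m
    for m in wet_markers:
        if m.z < WATER_LEVEL:
            merged.setdefault(m.name, m)
    return list(merged.values())
```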

Oh, and even though performances were taking place underwater, no scuba systems were allowed. Why not? Scuba gear creates air bubbles, which would confuse the systems by being mistaken for the markers the performers wear.
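As an illustration of why that’s such a problem: both bubbles and markers show up as bright blobs to an optical system, so any filter has to rely on behavior. Here’s a hedged sketch, assuming markers persist and follow the body while bubbles rise quickly and pop; the thresholds and names are hypothetical, not production values.

```python
from dataclasses import dataclass

@dataclass
class Track:
    positions: list   # (x, y, z) samples in meters, one per frame
    fps: float = 60.0

def looks_like_bubble(track, min_frames=12, max_rise_speed=0.5):
    """Heuristic: bubbles pop quickly and rise fast; real markers persist
    and follow the performer's body."""
    if len(track.positions) < min_frames:
        return True  # too short-lived to be a marker on a body
    dz = track.positions[-1][2] - track.positions[0][2]
    seconds = len(track.positions) / track.fps
    return dz / seconds > max_rise_speed  # meters/second, straight up

def filter_tracks(tracks):
    return [t for t in tracks if not looks_like_bubble(t)]
```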

Then, in one of the biggest advancements from the first Avatar to the second, these captures can be actively worked on by animation teams almost in real time. After all, a human in a suit doesn’t have a tail like the characters they play in the film, so those kinds of details can be worked on while production is still underway, saving a lot of time thanks to a new workflow pipeline.
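To picture what working on a detail the performer doesn’t have might look like, here’s a minimal sketch of layering a procedural tail onto incoming capture data. A damped follow-through chain like this is a common approach to secondary motion, not necessarily the one Weta used; the function and constants are illustrative.

```python
def update_tail(tail_joints, rest_offsets, pelvis_pos, stiffness=0.2):
    """Drag a chain of tail joints behind the captured pelvis. Each joint
    eases toward its parent plus a rest offset, producing follow-through
    as the performer moves. All positions are [x, y, z] lists in meters."""
    parent = pelvis_pos
    for joint, offset in zip(tail_joints, rest_offsets):
        target = [parent[i] + offset[i] for i in range(3)]
        for i in range(3):
            joint[i] += (target[i] - joint[i]) * stiffness  # damped easing
        parent = joint
    return tail_joints

# Called once per incoming capture frame, e.g.:
# tail = update_tail(tail, offsets, skeleton["pelvis"])
```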

With the Na’vi being so tall, one of the details they wanted to get right was eyelines – where the actors are looking. Similar to an NFL-style cable-cam that knows where to go and what to follow in a stadium, a rig was developed that follows the paths the actors would walk in a scene, and fitted on the end of that rig was a small monitor showing the face of the actor whose performance would be rendered digitally. Making eye contact with something that isn’t there is difficult enough as it is, and it was apparently even harder for director James Cameron to work with via the simul-cam. The simul-cam, which integrates the live action footage with animation to help grasp the final product before it is done, just wasn’t working on The Way of Water the way it did on Avatar. That’s where the new system comes in.
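The geometry behind the eyeline trick is straightforward: take the performer’s captured head position and lift it to where the much taller Na’vi’s eyes would be, then drive the cable-cam monitor there. A rough sketch follows, assuming a roughly 3-meter Na’vi, a z-up coordinate system, and invented names; all of it is illustrative rather than the rig’s real math.

```python
NAVI_SCALE = 3.0 / 1.8  # a ~3 m Na'vi vs. a ~1.8 m human performer (assumed)

def eyeline_target(actor_head, actor_feet):
    """Keep the performer's ground position, but scale the eye height up to
    Na'vi proportions so the cable-cam monitor hovers where the character's
    face will be. Positions are [x, y, z] in meters, z up."""
    lifted_z = actor_feet[2] + (actor_head[2] - actor_feet[2]) * NAVI_SCALE
    return [actor_head[0], actor_head[1], lifted_z]
```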

Courtesy Vanity Fair

Here you can see the new camera in action, with the small monitor featuring the actor later replaced digitally by the full-size Na’vi.

With this new system, there was another bit of filmmaking to take into consideration: the depth of everything in the frame. You have humans interacting with Na’vi as they make their way through the set – so much interaction with things that aren’t necessarily even there. Similar to the eyeline system, the team also created a new way to work with depth in real time. By attaching two smaller cameras to the main camera, they built a system that could help the VFX team work with depth compositing on the plate while filming was taking place, expediting the process later on. Without it, similar shots wouldn’t just be nearly impossible – they’d have to be far more controlled and planned to a T.
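For a sense of how two witness cameras can yield real-time depth, here’s a minimal sketch using OpenCV’s stereo matcher: disparity between the two views converts to depth via focal length × baseline ÷ disparity, and that depth map decides whether a CG pixel sits in front of or behind the live-action plate. This shows the general technique under assumed calibration values, not Weta FX’s actual system.

```python
import cv2
import numpy as np

def plate_depth(left_gray, right_gray, focal_px, baseline_m):
    """Estimate per-pixel depth of the live-action plate from stereo
    disparity between the two witness cameras."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                    blockSize=5)
    # StereoSGBM returns fixed-point disparity scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = 0.1  # avoid dividing by zero in the background
    return focal_px * baseline_m / disparity

def composite(plate_rgb, cg_rgb, cg_alpha, cg_depth, plate_depth_map):
    """Place CG pixels in front only where the CG is nearer than the plate,
    so a live actor can walk in front of a digital character."""
    in_front = (cg_depth < plate_depth_map)[..., None]
    alpha = cg_alpha[..., None] * in_front
    return (cg_rgb * alpha + plate_rgb * (1 - alpha)).astype(plate_rgb.dtype)
```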

In this same vein, the panel also discussed how, in years past, VFX teams had no say in what happened during filming – VFX reigned during post-production, not on the shoot.

Now, especially with these advancements, the team was there on set working with James Cameron, the directors of photography, and the other major players on the film – not relegated to offsite studios to be dealt with months later.

Of course, some fun facts were also shared – including that the film used 85 live sets and 13 wet sets, and had 2,225 visual effects shots featuring water, all worked on by 1,900 artists. But most interestingly, we got a glimpse into how far technology has come, learning that on a single processor of the kind that helped make Avatar, Avatar: The Way of Water would have taken approximately 350,000 years to complete. After all, some frames of the movie are multiple terabytes in size.
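For fun, the back-of-envelope arithmetic behind that figure, using a hypothetical farm size (the panel didn’t share core counts, and real rendering never parallelizes perfectly):

```python
single_processor_years = 350_000   # quoted in the panel
hypothetical_cores = 50_000        # assumption, not a figure from the talk
print(f"~{single_processor_years / hypothetical_cores:.0f} years "
      "if the work parallelized perfectly")  # ~7 years
```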

You can see the final product of all this work in Avatar: The Way of Water, now streaming on Disney+.

Sign up for Disney+ or the Disney Streaming Bundle (Disney+, ESPN+, and ad-supported Hulu) now

Tony Betti
Originally from California, where he studied a dying art form (hand-drawn animation), Tony has spent most of his adult life in the theme parks of Orlando. When he’s not writing for LP, he’s usually watching and studying something animated or arguing about “the good ol’ days” at the parks.