DNeg: creating reality in an unreal world for “The Matrix”

When The Matrix Resurrections, the fourth installment of the popular sci-fi movie series, was finally released, viewers were ready for stunning visuals. After all, it was a Matrix movie. And helping to make this a reality required some amazing work from a number of VFX facilities, including DNeg, the lead effects provider on the movie.

In total, DNeg was responsible for 723 shots, split between its London, Vancouver, and India facilities. According to Huw Evans, DNeg's VFX supervisor in London on Resurrections, the work involved quite a mix: particularly heavy and incredibly complex environment builds, creature work for the Synthients, digital doubles for the lead actors, complex effects development work, multiple fully CG shots, tricky weathering work, and a host of the usual wire and crew removal tasks.

Much of this work took place in the so-called “real world” of The Matrix. In particular, the London crew was responsible for the sequence where Neo wakes up in the Anomaleum chamber, his escape through the power station, the underground megalopolis of Io, the Machine War flashback, and scenes involving CG ships traveling through tunnels or abandoned structures. The group also handled the out-of-the-real-world sequence involving Neo’s fight in the dojo.

The Vancouver team, led by Ahron Bourland, worked on the modal rooftop chase (a test sandbox for game code) at the start of the film, the theater where Neo takes the red pill, the bullet train segment, and the flashback to when Neo and Trinity were being operated on by the machines in the lab. The India team, meanwhile, supported the work at the other two locations.

The project presented a multitude of technical challenges in various areas. Topping the list was the scale and density of detail required for the huge environment builds, along with lighting and rendering them at 4K across a multitude of fragmented passes that then had to be combined in the final composite.

In fact, the London DNeg team had to devise new ways to manage and pass their assets between departments. "While we were confident that we could use USD to build the assets, and then the scatterers, to help bring a lot of geometry into RAM at render time, a separate concern was transferring those assets to other departments," says Evans. To achieve this, the group organized their work so that specific areas could be produced as USD hero assets, with enough detail to allow animation to interact with the geometry, but still light enough to allow artists to work at usable frame rates in Autodesk Maya.
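Evans doesn't detail how DNeg structured those hero assets, but USD's payload and purpose mechanisms are the standard way to get this kind of heavy/light split. The Python sketch below, with hypothetical file and prim names, shows one way a hero asset can expose lightweight proxy geometry for animation while deferring heavy render geometry behind a payload.

```python
from pxr import Usd, UsdGeom

# Author a hero asset layer (file names here are hypothetical, not DNeg's).
stage = Usd.Stage.CreateNew("dojo_hero_asset.usda")
root = UsdGeom.Xform.Define(stage, "/Dojo")
stage.SetDefaultPrim(root.GetPrim())

# Lightweight stand-in geometry: always loaded, cheap enough for
# interactive Maya sessions, tagged with the "proxy" purpose.
proxy = UsdGeom.Mesh.Define(stage, "/Dojo/proxy_geo")
proxy.CreatePurposeAttr(UsdGeom.Tokens.proxy)

# Heavy detail lives behind a payload with the "render" purpose, so it is
# only pulled into RAM when the renderer (or an artist) asks for it.
render = UsdGeom.Xform.Define(stage, "/Dojo/render_geo")
render.CreatePurposeAttr(UsdGeom.Tokens.render)
render.GetPrim().GetPayloads().AddPayload("dojo_highres_geo.usda")

stage.GetRootLayer().Save()
```

A downstream department can then open the stage with `Usd.Stage.Open(path, Usd.Stage.LoadNone)` and load payloads only where it actually needs the full-detail geometry.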

Render times, however, were a challenge given this overall complexity and the full-4K working resolution. To cope with the scale and complexity of the environments, there was a clear need to establish a system to ingest, manipulate, and standardize exports for memory-intensive assets, according to Evans. "We created a bespoke FX pipeline that slotted between the environment and lighting departments, allowing us to collect, process, and transmit data as efficiently as possible," he adds.
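The article doesn't say what that pipeline actually emitted, but a common way to keep scattered vegetation and debris memory-efficient in USD is a point instancer, which stores only per-instance transforms against a handful of prototypes. Here is a minimal sketch of that idea, with invented paths and counts rather than anything from DNeg's setup.

```python
import random
from pxr import Usd, UsdGeom, Gf, Vt

stage = Usd.Stage.CreateNew("leaf_scatter.usda")

# Prototype geometry referenced from a published asset (path is hypothetical).
proto = UsdGeom.Xform.Define(stage, "/Scatter/Prototypes/Leaf")
proto.GetPrim().GetReferences().AddReference("leaf_asset.usda")

# One instancer prim stands in for thousands of copies of the prototype;
# only positions and prototype indices are stored per instance.
instancer = UsdGeom.PointInstancer.Define(stage, "/Scatter/LeafInstancer")
instancer.CreatePrototypesRel().AddTarget(proto.GetPath())

count = 10000
positions = Vt.Vec3fArray(
    [Gf.Vec3f(random.uniform(-50, 50), 0.0, random.uniform(-50, 50)) for _ in range(count)]
)
instancer.CreatePositionsAttr(positions)
instancer.CreateProtoIndicesAttr(Vt.IntArray([0] * count))

stage.GetRootLayer().Save()
```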

At the dojo

At one point in the film, the new Morpheus (Yahya Abdul-Mateen II) and Neo find themselves in a dojo, a modern-style structure set in the middle of a lake, with trees in colorful foliage visible along the banks. Morpheus' goal: to find out whether Neo has retained his kung fu skills. The answer is yes, although it takes a while for those skills to come back.

To create this sequence, DNeg didn't call on its own muscle memory, but instead pulled off a new move: using Epic's Unreal Engine to render these CG movie environments. "We wanted to push real-time rendering in Unreal to achieve final-quality renders that hold up at 4K resolution, not just in the background on an LED display, but at the center of a fully digital environment," says Evans.

These scenes, which involved live actors and a practical set as well as the digital environment, marked the first time that DNeg had used the game engine for a full film sequence. As Evans explains, many of the tools and techniques used for rendering visual effects were not yet available in Unreal Engine.

"We were using state-of-the-art custom builds, giving us OCIO color support and layered rendering, to name a few, that weren't originally present when this work began," says Evans. "It gave us the ability to run renders through our compositing pipeline, helping us push the quality level where we needed it for an all-digital background with a forest of trees and falling leaves as far as the eye can see, and a large lake with the wind blowing across the surface of the water."
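Evans doesn't say how OCIO support was wired into those custom Unreal builds, but the underlying idea of moving the engine's working-space pixels into a show-defined color space can be illustrated with OpenColorIO's standard v2 Python bindings. The config path below is a placeholder, and the roles simply resolve to whatever color spaces a real show config defines.

```python
import PyOpenColorIO as OCIO

# Load a show config (path is a placeholder, not DNeg's actual config).
config = OCIO.Config.CreateFromFile("/shows/example/config.ocio")

# Build a processor from the scene-linear working space to the log space
# the config nominates for compositing.
processor = config.getProcessor(OCIO.ROLE_SCENE_LINEAR, OCIO.ROLE_COMPOSITING_LOG)
cpu = processor.getDefaultCPUProcessor()

# Transform a single mid-grey RGB pixel as a sanity check.
pixel = cpu.applyRGB([0.18, 0.18, 0.18])
print(pixel)
```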

Getting high-quality results straight out of Unreal helped the team quickly lock down shots and understand the geography between them, while also giving editorial something to cut with. The opening shot combined code resolving into the real-time Unreal environment with a move up to the digital dojo, which was built to closely match the practical set using Lidar scans and texture photography. It is here that a nearly dejected Neo is shown standing inside.

Morpheus was a full hero digi-double that DNeg had to match and blend between two takes of a practical performance, then simulate and transition between two digital costumes to create the final effect. Throughout the sequence, artists performed digital head replacements to remove stunt performers, digital set extensions, and set repairs to provide continuity with damaged sections. This all culminated in the destruction of the dojo, which was designed so that it could break apart and smash artistically while still feeling grounded in reality.

For an in-depth look at DNeg’s work on the film, see the January/February/March 2022 issue of CGW.
