Movies use visual effects (VFX) to enhance footage, visualize the impossible, and create scenes that could not be captured with live-action filming alone.

VFX are rendered digitally, and the workloads have traditionally been run on-premises: groups of creatives editing frame by frame while the hum of servers carried from the next room.

But as visual effects grow increasingly detailed and computationally demanding, housing an on-prem data center for every project no longer makes sense for the sector.

Whether it is a particularly complex project that needs dramatically more computational power than the on-prem facility can provide, or a new technique that needs the compute on the film set itself, the VFX IT footprint is diversifying, stretching from the cloud out to the Edge.

– Getty Images

The continued role of on-premises computing

“On every film production, there will be lots of people working and they always need access to a central pool of storage to pull media from so the work stays synchronized, and this is how it's been since people figured out how to use network storage for video editing,” explained Arvydas Gazarian, solutions architect at Evolutions TV, a post-production house based in London's Soho.

For this reason, a big part of Evolutions’ IT remains on-prem to ensure that all editors can access the correct version of the footage.

“We have 9 petabytes of media across 4 of our locations. There is at least 1 petabyte in each location, with an average distribution of 200TB per project,” added Gazarian.

But beyond that, Evolutions turns to a cloud-type service called European Raid Arrays (ERA). ERA offers cloud-based workstations and storage specifically designed for media, broadcast, and post-production workloads. Here, Evolutions has around four petabytes of storage and close to 130 workstations in a data center, to which the on-prem locations are linked via a 40-gigabit line.

– Getty Images

The role of the cloud

The VFX industry is naturally ‘spiky,’ with sudden and dramatic increases in demand, explained Gazarian. A new project is commissioned, the compute needed suddenly multiplies, and the requirements can spill beyond what can physically be housed on-prem.

Turning to a cloud-type service for VFX is becoming increasingly common, and drew particular attention when Weta FX announced that Avatar: The Way of Water was rendered entirely on the Amazon Web Services (AWS) cloud.

Weta FX averaged around 500 iterations per shot, each of which contains thousands of frames. Rendering each frame took 8,000 thread hours or the combined power of 3,000 virtual CPUs in the cloud for an hour. In total, the team ran 3.3 billion thread hours over 14 months on AWS.
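As a rough back-of-the-envelope sketch of what that cumulative figure implies (assuming an average month of around 730 hours; the concurrency estimate is ours, not an AWS or Weta FX figure), the sustained level of parallelism is striking:

```python
# Back-of-the-envelope: what do 3.3 billion thread hours over 14 months imply?
# The hours-per-month figure is an assumption (an average month of ~30.4 days),
# not a number published by Weta FX or AWS.
TOTAL_THREAD_HOURS = 3.3e9
MONTHS = 14
HOURS_PER_MONTH = 30.4 * 24  # ~730 hours in an average month

wall_clock_hours = MONTHS * HOURS_PER_MONTH
avg_concurrent_threads = TOTAL_THREAD_HOURS / wall_clock_hours

print(f"Wall-clock window: ~{wall_clock_hours:,.0f} hours")
print(f"Implied average concurrency: ~{avg_concurrent_threads:,.0f} threads")
# -> roughly 10,200 hours, with ~320,000 threads busy around the clock
```

Roughly 320,000 threads running day and night for more than a year is a scale few studios could justify building on-premises for a single production.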

According to the company, achieving this on-premises would have required planning permission from the local council to physically expand its data center. Producer Jon Landau said: “The difference when we went to the cloud was truly palpable. We would not have finished but for the support we got from AWS.”

It is not unusual now for post-production houses to take this route and spin up temporary clusters in the cloud to help manage workloads. But these workloads are continuing to grow more challenging for post-production houses to handle.

Introducing in-camera visual effects

When it comes to post-production and VFX, increased GPU power and higher resolutions result in more believable simulations, and with options like moving the workload to the cloud available, it largely becomes a matter of being willing to throw money at the problem.

In the meantime, most VFX houses will find that a hybrid IT solution of on-prem and cloud computing will do the trick for a traditional post-production timeline. But the sector is moving away from ‘traditional.’ No longer are all VFX done in post-production.

The rise of in-camera visual effects is seeing more processing needed in real time, or even before actual filming begins.

John Rowe, head of VFX at Boomsatsuma, a university dedicated to all things post-production, explained the relatively new concept of in-camera visual effects (ICVFX) to DCD.

This approach to VFX fundamentally changes the pipeline. No longer is the simulation done in post; it is done before filming even begins, using the same kind of real-time rendering technology found in video games. In the majority of cases, this means Epic Games’ Unreal Engine.

Rowe had been working with students that day, creating a shoot where the simulated backdrop of a dinosaur attack played out behind the actors in real time.

“You take your environment to a studio and you play it on a screen and then you put your actors in front of it and film them,” explained Rowe.

“The camera has to be synced to the game engine and it uses a video processor to do that. There are lots of computers and several other people involved with that technique and it all has to lock together.

“The screen plays back your environment through a game engine so that you can look around the environment and display it on the screen, which you then sync together using another part of the same game engine to link to the camera in the computer. So when you actually move the camera, the one in the simulation moves at the same time, using a tracking system and sensors that are spread out around the studio.”

One of the companies that pioneered the use of ICVFX is Lux Machina, the firm behind the well-known graphics of films like The Irishman and the television series The Mandalorian, among others.

A practical set piece from The Mandalorian in the StageCraft volume. – Industrial Light & Magic

The world’s a stage

Lux Machina’s Prysm Stages are, in effect, entirely new worlds, David Gray, managing director of Lux Machina Consulting, told DCD.

“The London stage is 74 meters of screen around, which is 26,000 pixels wide by 2,800 pixels high. The wall alone is nine times the resolution of 4K, and that doesn’t include the ceiling.”
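Gray's “nine times the resolution of 4K” figure holds up against a quick pixel count (a sketch that assumes a standard UHD frame of 3,840 x 2,160 pixels as the comparison point):

```python
# Quick pixel-count check of the "nine times the resolution of 4K" claim.
# Assumes a standard UHD frame of 3,840 x 2,160 as the "4K" reference;
# the wall dimensions are as quoted by Gray.
WALL_W, WALL_H = 26_000, 2_800   # London stage LED wall, in pixels
UHD_W, UHD_H = 3_840, 2_160      # consumer "4K" UHD frame

wall_pixels = WALL_W * WALL_H    # 72.8 million pixels
uhd_pixels = UHD_W * UHD_H       # ~8.3 million pixels

print(f"Wall: {wall_pixels / 1e6:.1f} MP, UHD frame: {uhd_pixels / 1e6:.1f} MP")
print(f"Ratio: ~{wall_pixels / uhd_pixels:.1f}x a 4K frame")
# -> roughly 8.8x, i.e. about nine 4K frames, before counting the ceiling
```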

Not only is this an enormous amount of processing, it must also be rendered in real time, meaning it has to occur on-site, at the Edge of production.

“Because of the resolutions and data we're dealing with at the moment, it's just not feasible [to run on the cloud] with that size of LED volume. So we're generally running a 100-gig backbone with a 25-gig machine to be able to pull the data between them,” explained Gray.

But while it is real-time generation, it isn’t real-time content creation. There are still plenty of hours dedicated to producing the graphics beforehand: the lighting, the generation of assets, and their textures are all done pre-emptively. According to Gray, this is in many ways more taxing than a traditional pipeline, as everything needs to be optimized. So why do it?

Real-time is the perfect time

The reason is that the Prysm Stages are entirely controllable environments. The images can be optimized and perfected, and lighting replicated perfectly in a second. Film directors will obsess over the concept of the ‘golden hour,’ that brief period of time each day when natural light is so beautiful it is almost ethereal. But with a Prysm Stage, the ‘golden hour’ can become a golden day, even a golden week.

ICVFX changes the scheduling for the production team significantly. For example, in typical production environments, the lighting team won’t show up until final filming is ready to commence. With ICVFX, lighting can be available from the very first test shots, at little additional cost.

“With ICVFX, if there’s a lack of ‘human’ talent, we can use Unreal Engine and produce cinematic lighting with it,” said Junaid Baig, vice president of Rendering and Research & Development at Lux Machina.

This front-loaded VFX approach is new to the industry and takes a while for those working in it to get their heads around. Normally, you would look at a shot and then decide what lighting is needed, but ICVFX requires you to decide the lighting before you even see the actors in the room. To get around this, detailed conversations take place before filming begins to really understand the director's vision.

There are some things that ICVFX can’t achieve. For example, a convincing explosion or blast would be difficult.

“The Unreal Engine can’t really do that in a way that would look convincingly good,” said Baig, “So you would shoot everything else with ICVFX, then add the explosion later as a traditional form of VFX. That way, you would only need two teams - the ICVFX team and the post. You wouldn’t need the lighting or the building team to make that happen physically.”

To achieve ICVFX, Lux Machina uses, on average, 24 rendering machines with two GPUs each - in this case, Nvidia RTX 6000s - along with various high-bandwidth cables. For all of this to happen in real time, and be reflected instantly on the LED stages, the closer the compute sits to the Edge, the better.

– Getty Images

“VFX artists don’t really think about speed, they think about quality,” said Baig. “It has to be done in at least 24 frames per second. In a traditional VFX pipeline, you can use even 16K textures. But if you tried to run that through an Unreal Engine, or any GPU-based renderer, it’s going to crash. So with ICVFX, you are constantly balancing quality, versus quality of the Edge compute.”

The power required for this is significant, and is only amplified by the sheer requirements of the stage itself. According to the Prysm Stages website, the stage production space spans 7,700 sq ft, all of which is covered by LED tiles used to produce images at resolutions of up to 22,176 x 3,168 pixels on the main back wall.
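Taken together, the 24 frames per second floor Baig describes and the resolution of that back wall set a punishing budget. A minimal sketch of the numbers, assuming 24-bit color and an uncompressed feed purely for illustration (neither figure is published by Lux Machina):

```python
# The real-time constraint expressed as a per-frame budget.
# The 24-bit color depth and the uncompressed feed are illustrative
# assumptions, not published Lux Machina figures.
FPS = 24
frame_budget_ms = 1_000 / FPS             # ~41.7 ms to produce each frame

WALL_W, WALL_H = 22_176, 3_168            # main back wall, in pixels
BYTES_PER_PIXEL = 3                       # 24-bit color
frame_bytes = WALL_W * WALL_H * BYTES_PER_PIXEL
gbit_per_s = frame_bytes * 8 * FPS / 1e9  # raw, uncompressed wall feed

print(f"Per-frame budget: {frame_budget_ms:.1f} ms")
print(f"Uncompressed wall feed: ~{gbit_per_s:.0f} Gbit/s at {FPS} fps")
# -> ~41.7 ms per frame and a feed in the tens of gigabits per second,
#    which is why a 100-gig backbone at the Edge is the practical answer
```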

“It uses a lot of power, and the building needs to be prepared to host this huge structure. Heat is a big issue, and you need to find ways to distribute the heat, and the server room itself generates a lot of noise.”

Because of this, Lux Machina stores its servers in an insulated room away from the stage, packed with cooling equipment. For shoots that aren’t hosted at Lux’s own stages, this server room has to be recreated at the new set.

Whether VFX companies are using an on-prem data center, spinning up workloads in the cloud, or building a data center at the Edge of the film set, one thing is consistent across the sector: the computational power needed to achieve the desired effects is increasing, and not gradually.

Resolutions are increasing at exponential rates. Simulations are no longer just ‘good enough to tell the story,’ but a close-to-perfect recreation of real life, generated from less-than-perfect current technology.

Whether this is done in post or pre-production, visualizing the impossible has become increasingly virtual.