Where does Unreal Engine fit in the matte painting pipeline?

We take a look at where Unreal Engine fits into a VFX pipeline. How will the latest releases and features of Unreal affect film production pipelines?

During SIGGRAPH 2019, Epic revealed a collaborative project showcasing their latest virtual production tools. The clip made a big splash in the VFX community, and many people were excited about upcoming changes in the field. Looking back from the beginning of 2021, we can definitely see some progress in the adoption of virtual production, but not as fast or as broad as many would have thought.

There is a prediction that, five years from now, Unreal will be a standard tool in any major VFX production pipeline. As promising and exciting as that sounds, five years is most likely a bit of wishful thinking and marketing spin rather than a reasonable time frame for such a complex tool to be widely adopted around the world.

At this moment, the major use of UE4 is in the previs and postvis area. These parts of a production are often rough and evolve quickly. Some companies are also using game engines for real-time set dressing, placing assets interactively to achieve more physically plausible distributions than procedural scattering can. Take a look at MPC's presentation on this topic from SIGGRAPH 2019.


The same approach is possible using SideFX Solaris, which merges Houdini's simulation tools with interactive set dressing. Some studios are utilizing the ray tracing options of Unreal Engine to render sequences or still images and pass them to the comp department (along with AOVs), or to use them as a base for digital matte paintings. This approach is not truly real-time in the full sense of the term, but you can render 4K sequences at 7-10 frames per second which, to be honest, sounds far better than the render times of traditional VFX renderers, where a noise-free image can take anywhere from 1-2 to 10-20 hours per frame.
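To put those numbers in perspective, here is a back-of-the-envelope comparison in Python using only the figures quoted above; the shot length and the exact per-frame times are illustrative assumptions, not production data:

```python
# Rough render-time comparison for a 10-second shot at 24 fps.
# All figures are assumptions based on the ranges quoted in the text.
frames = 10 * 24                    # 240 frames

realtime_fps = 8                    # Unreal ray tracing: ~7-10 fps at 4K
offline_hours_per_frame = 2         # low end of the 1-2 to 10-20 h/frame range

realtime_minutes = frames / realtime_fps / 60
offline_days = frames * offline_hours_per_frame / 24

print(f"Unreal:  ~{realtime_minutes:.1f} minutes for {frames} frames")
print(f"Offline: ~{offline_days:.0f} days on a single machine")
```

At 8 fps the whole shot renders in about half a minute, versus weeks of single-machine compute at even the low end of traditional render times.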

The main focus for Epic right now is to become a virtual production option for films and television. It is a logical step forward from green screen approaches and traditionally painted on-set backdrops.

Origins

Long before digital workflows came about, movie makers were using painted backdrops and, to my surprise, some projects continue to use these techniques to this day. For example, Harry Potter, Game of Thrones and even Netflix's The Witcher used them as recently as 2019.

It’s obvious that painted backdrops can, for some shots, save a lot of money, but they also come with significant trade-offs and limitations, like not being able to change the lighting or the camera angle. By their nature, painted backdrops also cannot participate in the lighting of a scene: they don't emit light, only reflect and absorb it. Digitizing the traditional backdrop techniques has improved flexibility and efficiency for VFX productions, which means pushing better and better visuals, but the major problem remains: creating a fully seamless environment in-camera.


Light Panels

Integrating shots more seamlessly means saving money and pushing higher quality output, so the next logical evolution in the production process was the introduction of light panels on set. These help to marry the lighting of the background to that of the foreground much better, and reduce the amount of fixes VFX studios need to do to get rid of things like green-screen color spill on highly reflective surfaces. These panels display a pre-rendered still or moving sequence which lights the set and provides a backdrop at the same time. The first movie to successfully utilize that idea on a large number of shots was Oblivion in 2013.

In the couple of years following Oblivion, the technique was pushed further on movies like First Man. The light panels provided a good backdrop for shots with shallow depth of field, making edges - which would otherwise be difficult to extract in post-production - and in-camera reflections look natural. However, this technique introduces some technical difficulties in post-production. For example, the background often ends up needing to be replaced and, with this technique, you have lost the benefit of a green screen for quickly keying out the characters to swap the background.


The Next Big Thing?

Enter Unreal Engine.

What Unreal adds to current light-panel background techniques is that it marries the physical camera on set with the virtual camera in the 3D scene. Think of it this way: with a static light-panel image, you either have a restricted field of view or you have to build more on-set foreground elements and use the panels only for the distant background and sky.

With a real-time solution, the two work in perfect sync. This means accurate parallax of the extended environment, achieved by matching the virtual scene camera to the real-world camera through real-time tracking. Furthermore, the production can now change the position of the light source, its color, the weather conditions and much more with a couple of mouse clicks, and have that reflected on set pretty much instantly.
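To make the idea concrete, here is a minimal, engine-agnostic sketch of that per-frame sync loop in Python. The tracker and camera interfaces are hypothetical stand-ins, not a real tracking SDK or Unreal's actual API:

```python
import time

def read_tracker():
    """Hypothetical stand-in for a real tracking SDK: returns the physical
    camera's pose and lens data for the current frame."""
    return {
        "position": (0.0, 1.7, -3.0),    # meters
        "rotation": (0.0, 15.0, 0.0),    # degrees (pitch, yaw, roll)
        "focal_length_mm": 35.0,
    }

class VirtualCamera:
    """Hypothetical handle to the camera inside the engine scene."""
    def apply(self, sample):
        # In a real setup this would drive the engine camera (e.g. via a
        # live-link style interface) so the background rendered on the LED
        # wall shows correct parallax for the physical camera's viewpoint.
        self.position = sample["position"]
        self.rotation = sample["rotation"]
        self.focal_length_mm = sample["focal_length_mm"]

camera = VirtualCamera()
for _ in range(240):                 # sync loop for a 10-second take at 24 fps
    camera.apply(read_tracker())     # virtual camera mirrors the physical one
    time.sleep(1 / 24)               # lock updates to the production frame rate
```

The essential point is the loop at the end: as long as the virtual camera is updated from tracking data every frame, the parallax of the extended environment stays correct, and any scene change (sun position, weather, set dressing) shows up on the wall on the next frame.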

Sounds perfect doesn't it?

Like everything else in life, this new pipeline has some trade-offs and obstacles for VFX pipelines.

If done correctly it can be a great tool, but virtual productions like this require a lot of planning ahead. Assets need to be generated ahead of time, and these assets are different from VFX assets because they need to be designed for real-time use. Furthermore, making a 3D asset look photoreal in a real-time engine can be much more difficult than in a standard VFX approach, where we can afford to spend hours rendering the asset. All of this requires a highly experienced team backed by a studio confident that the IP will be successful. This is a big reason The Mandalorian is one of the few productions using this method; they have near-unlimited resources and a team with a lot of experience. Of course, you could also use this approach for just a certain portion of a movie, like a special location or key moment, but currently the biggest problem that needs to be overcome is how the two industries of games and film are going to merge.

Unreal Engine is still a tool for game development, and it will require some adjustments to integrate well into VFX. For instance, it is still not able to handle as much data and as many polygons as traditional VFX tools like Maya, Houdini or Clarisse. To achieve a similar scale, scenes have to be constructed with multiple LODs (levels of detail) for each object in order to save memory. There is also a "draw call" bottleneck when thousands of individual assets shrink to pixel scale in the viewport... the list goes on. You have to treat Unreal as a game engine while also aiming for offline-quality results. This is a very fine balance that is difficult to achieve.
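To illustrate the LOD point, here is a minimal, engine-agnostic sketch in Python of distance-based LOD selection; the asset names and thresholds are made up for the example. Each asset carries several meshes of decreasing polygon count, and the renderer swaps between them to keep memory and draw-call costs bounded:

```python
from dataclasses import dataclass

@dataclass
class LOD:
    mesh: str            # e.g. "rock_LOD0" (full-detail mesh)
    max_distance: float  # use this mesh while the camera is closer than this (m)

# Hypothetical LOD chain for one asset, from hero detail to background card.
ROCK_LODS = [
    LOD("rock_LOD0", 10.0),    # full detail for close-ups
    LOD("rock_LOD1", 50.0),    # reduced polygon count
    LOD("rock_LOD2", 200.0),   # coarse mesh for distant shots
]

def select_lod(lods: list[LOD], camera_distance: float) -> str:
    """Return the lightest mesh that still looks right at this distance."""
    for lod in lods:
        if camera_distance < lod.max_distance:
            return lod.mesh
    return lods[-1].mesh       # beyond the last threshold, keep the coarsest

print(select_lod(ROCK_LODS, 5.0))     # rock_LOD0
print(select_lod(ROCK_LODS, 120.0))   # rock_LOD2
```

Unreal automates this swap per object, but every asset still has to be authored with those extra LOD meshes up front, which is exactly the kind of preparation a traditional VFX asset does not need.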

The Hype-train is Leaving the Station

At the moment, virtual production (i.e. Unreal Engine in VFX) is more of a buzzword. Unreal is just another tool being added to the VFX toolbox, and it is not completely ready for VFX yet. Epic needs to branch it out from game dev into the VFX world in a similar way to SpeedTree, whose separate Games and VFX versions handle the different workflows and requirements of each market. If enough VFX studios embrace it over time, it will eventually evolve into a core product.

The tech required for virtual production is not really available to everybody at this point, either. LED stages are very expensive installations which require not just special content to play but also a skilled crew to build and maintain them. The hardware cost to run an amateur-level virtual production setup starts at around $20,000, excluding the cost of the panels themselves. Professional VFX studios are looking at ten times this cost.

Where to now?

One of the best use cases for virtual production is as a stand-in set for TV and episodic production. Treat it like a real-time digital backlot: with a pre-built New York, Los Angeles or Moscow, you can create any lighting scenario you want at any time of the year. Directors and studios will no longer be limited to what the crew can shoot on location or to the kinds of permits they can obtain. Driving shots will benefit from it as well, since you don't need expensive LED panels or high-end mounting tech, and the background often ends up blurry for most of the shot anyway.

It's also worth pointing out that a virtual production stage still gives you two options: you can shoot with an environment background as described above, or just use it as a green screen. This can be a good safety net when doing multiple takes, as you can safely fall back to the green-screen take later on if needed.


Overall, and despite all the hype, it is just a different approach to filmmaking. It could expand possibilities for music videos or live events too and, if done correctly, virtual production can significantly reduce the cost of production. But it is not suitable for every director or for every movie being made. James Cameron or Michael Bay would be more likely to adopt the approach, and to push it further, than, say, Quentin Tarantino would.

The cost of building virtual production stages will come down eventually, too, and this will significantly help indie projects, as it will probably be cheaper to use a small LED stage than to do proper VFX post-production. Even now, it is more accessible to the indie market, with no cost for using Unreal Engine and a massive, low-priced (compared to standard VFX) library of premade assets already available on the marketplace. Previs will also benefit from it greatly.

A downside of virtual production right now is that it is hard to judge what's needed without a test run on the stage: you need to see the environment through the camera lens, with the chosen LUT and the stage lighting, to understand how it behaves. It is really easy to either under-build or over-build the environment for a shot without testing. And in order to test, VFX studios will need to either rent, co-own or build their own stage.

Last but not least, there is a huge lack of talent on the market to make this usable worldwide. There are no education courses for virtual production of this sort right now, and universities likely won't have anything substantial for years due to the long curriculum-approval process. Independent schools will pick it up faster and will train specialists in the subject. But before any of this can happen, we need the pioneers of the technology to carve the road ahead, testing, breaking, and forging new possibilities for the industry as a whole.

It will be interesting to see what the next couple of years bring to the industry and how Virtual Production evolves and adapts to the realities of movie making.

~ Oleksiy