Befores & Afters Explores the AI, VFX, and Virtual Production Behind Robert Zemeckis’ Here
befores & afters’ Issue 24 is an entire magazine dedicated to the visual effects, virtual production and machine learning tech behind Robert Zemeckis’ Here.
Journalist Ian Failes uncovers how this ambitious production came to life, with insights from key crew including Visual Effects Supervisor Kevin Baillie, Dimension and DNEG 360 CTO and Virtual Production Supervisor Callum Macmillan, Metaphysic’s Jo Plaete, and DNEG’s Alexander Seaman and Martine Bertrand.
In Here, Robert Zemeckis adapts Richard McGuire’s 2014 graphic novel, which spans more than a century from a single location. Shot at Pinewood Studios in London, the story is filmed from one fixed camera angle with a constant view outside a living room window.
Zemeckis and Baillie engaged Dimension and DNEG 360 for the second time, following our successful collaboration on Disney+’s Pinocchio (2022). For Here, we operated the LED stage and produced realtime content visible through the living room window, spanning more than a hundred years of story.
The 30-foot by 15-foot LED wall (made up of ROE BP2V2 panels driven by Brompton Tessera SX40 image processors), positioned outside the living room window on set, displayed realtime environments across around 85 different eras.
Dimension also took on a significant asset build and asset tracking role. One key deliverable was a realtime car system made in Unreal Engine, which allowed Baillie or the first AD to trigger passing cars from an iPad, timed to the dialogue. Beyond cars, assets included trees and foliage, the colonial house visible in much of the exterior view out the window, and weather, including blizzards, lightning storms, hailstorms, rain and wind systems.
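The cue-on-demand idea behind that car system can be sketched in a few lines. The following is a hypothetical illustration only — the names (`CueBoard`, `VehicleCue`) are invented, and the real system runs inside Unreal Engine rather than plain Python — but it shows the basic pattern: cues registered ahead of time, fired by name from a simple operator UI.

```python
# Hypothetical sketch of a cue-based vehicle trigger, loosely modeled on the
# iPad-triggered car setup described above. All names here are invented for
# illustration; the production system drives assets inside Unreal Engine.
from dataclasses import dataclass, field

@dataclass
class VehicleCue:
    name: str         # cue label shown on the operator's tablet
    vehicle: str      # which car asset to send past the window
    speed_mph: float  # travel speed along a predefined spline path

@dataclass
class CueBoard:
    cues: dict = field(default_factory=dict)
    log: list = field(default_factory=list)

    def register(self, cue: VehicleCue) -> None:
        self.cues[cue.name] = cue

    def trigger(self, name: str) -> str:
        """Fire a cue by name, as the first AD would from the iPad UI."""
        cue = self.cues[name]
        event = f"spawn {cue.vehicle} at {cue.speed_mph} mph"
        self.log.append(event)  # in production this would call into the engine
        return event

board = CueBoard()
board.register(VehicleCue("sedan_pass", "1950s_sedan", 25.0))
print(board.trigger("sedan_pass"))  # → spawn 1950s_sedan at 25.0 mph
```

Keeping the cue list declarative like this is what makes it practical to fire events mid-take without interrupting a performance.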
We leaned on our custom build of Unreal Engine, incorporating a suite of rendering technologies and NVIDIA Ada Lovelace architecture GPUs to orchestrate the realtime imagery on set.
The LED volume also served as a system for shooting matte passes, a technique our team dubbed the ‘disco pass’.
We also ensured an accurate line-up between the live-action camera, the physical set, and the virtual world, aided by nLight’s HD3D realtime depth camera. Its depth maps could be combined with the main camera feed either live on set, directly in Unreal Engine, or in post-production using nLight’s DCC plugins.
Continue reading to learn more about the virtual production, visual effects, and machine learning innovations of Here in befores & afters Issue 24.