The Third Floor Case Study
At the Forefront of Real-Time Visualization with The Third Floor
The Third Floor is one of the leading visualization companies in entertainment, collaborating creatively and technically across the industry to help clients realize their most ambitious ideas. In The Third Floor’s wheelhouse of film and television, the company’s artists work with studios and production teams from preproduction through postproduction to help map out story action and shooting plans in previs and techvis. The Third Floor’s virtual production and on-set expertise has also supported new ways to pull off complex shots, tying live action, CG characters and virtual environments together more closely.
Recent projects have included innovations to sync up dragon flights, riders, practical fire, and digital visual effects on HBO’s “Game of Thrones,” and expansive previs and techvis for everything from massive battles to emotional character moments in Marvel’s “Avengers: Infinity War.” The Third Floor also has long-standing expertise in visualization for theme park attractions, commercials and game cinematics, and has recently expanded into VR/AR content creation.
The Third Floor’s Global Head of Research and Development is Addison Bath, who manages a team of engineers in exploring the latest technology and creating customized workflows based on specific project needs. Bath explains, “One aspect unique to our approach is that we tailor our tools and workflows to specific projects and even to specific directors, because we’ve found that level of customization helps put everyone in a comfort zone and facilitates their creative goals.”
Since its inception, The Third Floor has relied on real-time rendering tools to visualize and iterate as quickly as possible, with workflows initially built around Maya’s viewport and later Viewport 2.0. In 2016, a convergence of dramatically improved GPU capabilities and newly improved visual fidelity in real-time game engines led Bath to begin testing engines such as Unreal Engine and Unity.
“Game engines presented an opportunity to take real-time rendering to a much higher level,” he said. “A lot of other things interested us about game engines as well. For instance, they allow us to simulate lighting more accurately, which is crucial for successfully blending real and virtual assets.”
Following a meeting at GDC 2016, Bath’s team partnered with Glassbox to test real-time workflows powered by game engines on a demo project called “The Green Wedding.” By combining Glassbox’s game engine expertise with The Third Floor’s extensive virtual production know-how, the teams were able not only to recreate The Third Floor’s MotionBuilder-based workflow inside Unreal Engine, but also to improve upon it with key additional functionality. The result was a powerful virtual camera toolkit that allowed The Third Floor team to see what they were shooting with the virtual camera, in VR or on an iPad, just as if they were shooting live action, and to live-stream motion capture data into the engine. This toolkit eventually evolved into what is today known as Glassbox’s DragonFly, an off-the-shelf, cross-platform virtual camera solution.
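The case study doesn’t describe DragonFly’s internals, but the general pattern such virtual camera toolkits rely on is straightforward: a tracked camera transform is packed into a packet and streamed to a listener inside the engine once per frame. The Python sketch below is a hypothetical illustration of that pattern only, not Glassbox’s or Epic’s actual API; the address, packet layout, and get_transform callback are all assumptions.

```python
# Hypothetical sketch of the core virtual-camera pattern: stream a tracked
# transform (from an iPad, VR controller, or mocap rig) to a real-time
# engine once per frame. Not Glassbox's or Epic's actual API.
import socket
import struct
import time

ENGINE_ADDR = ("127.0.0.1", 54321)   # assumed UDP listener inside the engine
PACKET_FMT = "<d3f4f"                # timestamp, position xyz, quaternion xyzw

def stream_camera(get_transform, fps=60):
    """Send the virtual camera's transform to the engine at a fixed rate.

    get_transform() is assumed to return ((x, y, z), (qx, qy, qz, qw))
    from whatever tracking source the operator is using.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame_time = 1.0 / fps
    while True:
        (x, y, z), (qx, qy, qz, qw) = get_transform()
        packet = struct.pack(PACKET_FMT, time.time(), x, y, z, qx, qy, qz, qw)
        sock.sendto(packet, ENGINE_ADDR)  # engine applies this to its camera
        time.sleep(frame_time)            # real systems lock to the tracker clock
```

On the engine side, a matching listener would unpack each packet and apply the transform to the active camera, which is how the operator’s moves show up on the monitor in real time.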
Following the success of the demo project, Unreal Engine has become a regular part of The Third Floor’s toolset, and the company has continued to work with Glassbox on the development of DragonFly.
“Glassbox has been an invaluable collaborator for us, helping us achieve key goals with an added depth of technical knowledge. Tapping into this level of engine expertise when we are spinning up workflows for new projects helps us ensure that we are always using the most efficient and up-to-date solutions,” shared Bath.
As technology continues to evolve, Bath and The Third Floor are working to make virtual production techniques efficient and affordable enough to be used during previs, and the virtual camera plays a crucial role in that effort.
“Visual effects have traditionally channeled a lot of effort into the latter parts of filmmaking because so much had to be done in post,” Bath explains. “Now that we can get better quality in real time earlier and earlier in the process, it’s easier for everyone working on the production, from production designers and set dressers to DPs, to collaborate on complex shoots. Now in previs, rather than creating shots and camera moves to present to the filmmakers for notes and revisions, we can work with directors and DPs as they map out and move the camera on the fly in the virtual versions of the scene we’ve created.”
On set during production, real-time tools can also help actors and directors develop and record performances for CG characters in key storytelling roles.
“Being able to actually see a working version of a CG character while you’re still on the set is huge; it provides a whole new feedback loop on how to approach the performance,” said Bath. “It’s the same theme of giving the creatives more immediate feedback and control over the process through real time. We’ve seen that on multiple Avengers movies The Third Floor has worked on, where the actors playing CG characters like Ultron or Hulk have been able to reference their performance, with their movements tracked to the CG character via live renders on the monitor.”
Bath is also excited by the potential for live compositing, where CG elements and backdrops can be rendered in real time and overlaid on the live feed on set monitors. He observes, “The closer you can get to simulating the whole visual picture, the easier it is to make important creative and technical decisions that can save time and money later on.”
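The article doesn’t spell out the math, but the live compositing Bath describes ultimately reduces to the standard Porter-Duff “over” operation applied to every frame: the CG render, with its alpha channel, is layered over the live camera plate. Below is a minimal numpy sketch of that core step, assuming premultiplied-alpha CG renders and float images.

```python
# Minimal sketch of the per-frame "over" composite behind live on-set
# compositing: premultiplied CG pixels layered over the camera plate.
import numpy as np

def composite_over(cg_rgb, cg_alpha, plate_rgb):
    """Porter-Duff 'over': result = cg + (1 - alpha) * plate.

    cg_rgb    : HxWx3 float array, premultiplied CG render
    cg_alpha  : HxWx1 float array, CG coverage in [0, 1]
    plate_rgb : HxWx3 float array, the live-action camera frame
    """
    return cg_rgb + (1.0 - cg_alpha) * plate_rgb

# Run per frame at camera rate: monitor_frame = composite_over(cg, a, plate)
```

Running this at camera frame rate is what lets the crew judge the full frame, CG and live action together, on the monitor during the shoot.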
Looking even further into the future, Bath predicts that real-time tools will drive an exciting convergence of entertainment across platforms; for instance, interactions in AR could drive what we see in a game or a film tied to the same IP.
As real-time capabilities continue to improve, the teams at The Third Floor and Glassbox will be looking to bring new processes and efficiencies into visualization workflows.