LAS VEGAS—Over the last 18 months, the boom in virtual production has shaken up the way filmmakers, broadcasters, mobile video designers and more conduct their creative business. The technologies enabling virtual production include augmented reality (AR), virtual reality (VR), extended reality (XR) and mixed reality (MR).
FOOTBALL’S BIGGEST STAGE: Augmented reality was used on one of the biggest stages in sports when the Super Bowl LVI halftime show created a 360-degree augmented reality experience for fans to view on their mobile devices. For the event, the organizers used Insta360 Pro 2 cameras. The 360-degree camera has six F2.4 fisheye lenses and includes a nine-axis gyroscope and FlowState stabilization to keep footage steady in moving scenes.
The Pro 2 cameras were positioned around the stage and footage was streamed via the Verizon 5G network to a halftime show app. Within the app, fans could choose any of the eight cameras around the field and pivot for a 360-degree view of the stage and field, as if they were there.
Insta360 also helped pull off the world’s first 8K live-stitched, live broadcast of an NFL game when it covered the NFL Pro Bowl in Miami in February.
JUDO STATS IN AUGMENTED REALITY: Augmented reality was a big part of the 2022 Judo Grand Prix. For the three-day annual event, held in Almada, Portugal, in February, the International Judo Federation (IJF) and wTVision joined forces to revamp the use of space, data and broadcast design. First held in Tokyo in 1956, the event pits more than 300 competitors from 41 countries against one another.
This year, real-time data and broadcast graphics were created via wTVision’s JudoStats CG alongside the R3 Space Engine rendering system. Augmented reality graphics — created by the JudoStats, R3 Space Engine and a tracking system from Stype — incorporated real-time statistics, players’ information and performance data to create an interactive environment.
VIRTUAL STUDIO AT WIMBLEDON: BBC Sport’s coverage of the annual Wimbledon Championships in 2021 received an upgrade with virtual reality graphics technology from Brainstorm. The BBC, in consultation with the UK graphics company MOOV and Brainstorm, created a virtual set studio. The studio featured augmented reality graphics generated with Brainstorm’s InfinitySet XF/AR/VR studio application, integrated with Epic Games’ Unreal Engine creation system.
With the goal of telling better stories through improved data and graphics, the team used a green screen to create a virtual Wimbledon studio. The technology was also used to provide new and alternative camera angles, infographics and Wimbledon branding. During the event, five cameras were used in the studio, each equipped with a Brainstorm InfinitySet system running Unreal Engine. Advanced compositing features were used to create realistic real-time 3D scenes with elements that look and behave like those in the real world.
“We managed to create a virtual space that allowed for a dynamic coverage of the tournament, which seamlessly combined real and virtual imagery in a way that significantly enhanced the story and audience engagement,” said Nev Appleton, director and co-founder at MOOV.
GREEN SCREEN COMES TO THE LECTURE HALL: Virtual reality is enlivening the lecture hall. The University of the Netherlands worked with Zero Density to create a massive virtual lecture hall using the company’s Reality Engine real-time broadcast compositing system.
Using the virtual space, scientists and professors have access to interactive storytelling techniques and advanced visualization methods — complete with real-time realistic reflections and refractions of physical objects and people inside the green screen. Professors are able to explain complex matters on a large virtual screen or run physical experiments with immersive real-time graphics.
The space employs Grass Valley LDX 86N 4K cameras alongside Mo-Sys camera tracking technology while the green screen lecture space itself runs on two Reality Engines.
LONDON WELCOMES VIRTUAL PRODUCTION: When Garden Studios in London finalized its virtual production stage in 2021, the result was a 4,800-square-foot space that could serve as a cost-effective filming option with unique creative opportunities. The stage allows filmmakers to shoot real-time virtual effects on set, using virtual graphics displayed on an LED volume (an enclosed space where motion capture and compositing can take place) to create photo-realistic backdrops.
Among the technologies in place are the VP Pro XR server and StarTracker camera and lens tracking systems from Mo-Sys Engineering. Features within VP Pro XR include an Unreal Engine editor interface and Cinematic XR Focus, which allows a filmmaker to pull focus between talent or objects in the physical studio and virtual objects positioned in the LED volume.
According to the company, this means an LED volume can serve as more than just a backdrop, delivering better interaction between real and virtual elements. Jillian Sanders, virtual production coordinator for Garden Studios, said the Mo-Sys team came to the studio for a multitude of tests, including displaying digital tracking markers on the studio’s LED ceiling.
“We’ve also been able to assist with testing and development of their new VP Pro XR,” she said. “This exciting new tool allows for features such as digital set extensions, the ability to focus past the LED wall into the digital world, and near time rendering.”
Copyright NAB Show Daily