Studio production xR workshop: Five things you need to know about xR and broadcast

December 10, 2020

In the first of a series of xR workshops, disguise invited White Light’s Technical Solutions Director, Andy Hook, onto their London HQ xR stage as a guest speaker to talk about their work using extended reality in studio-based production.
Hosting the session were disguise's own Global Technical Solutions Manager, Peter Kirkup, and Sales Director EMEA, Phil Cooksey.

Available to watch as a recorded webinar, the Studio Production xR Workshop highlighted how the disguise workflow is delivering unrivalled, engaging visual experiences for clients around the world. Here, we've pulled together the top five reasons why creatives and technologists are turning to extended reality and the disguise xR workflow for their studio productions.

Watch the full recording of our Studio Production xR Workshop

1. Getting the best performance out of your presenters

The LED volumes that make up xR stages not only surround presenters in lavishly designed sets but also allow them to interact with those virtual and physical sets in the most natural way. If a director asks a presenter to engage with an element on stage, they can do so without further instruction because they can see the object displayed on the LED panels.

This is real engagement with the same content the audience is seeing, without the need to fake any interaction. It was this ability that inspired Andy and his team to experiment with disguise beyond their work with Eurosport at the 2018 Winter Olympics in Pyeongchang. They wanted to build upon the genuine engagement they were witnessing between athletes and the xR stage. When these sporting professionals saw their medal-winning performances for the first time, the cameras were able to capture the unrehearsed joy and emotion of the moment.

Credit: GeeFX Studios

2. Render content from the camera’s perspective

By using disguise, producers can give the illusion of a rich, deep studio even if they're only working with a stage a few metres wide and high. By feeding camera tracking data into disguise's software, engineers can create a convincing illusion of perspective, making it possible to render content from the camera's point of view. Using camera tracking technology (our stages are equipped with Stype, Mo-Sys and ncam equipment), real-time content is rendered onto the LED walls based on camera position. This is the basis of our virtual studio setups.
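To make the idea concrete, a tracked camera pose can drive a simple pinhole projection: given the camera's position and orientation from the tracking system, any point in the virtual set maps to a pixel in the camera's image, so content can be rendered exactly as that camera would see it. The sketch below is a minimal illustration of that principle in Python; the function and its parameters are hypothetical, not disguise's actual API.

```python
import numpy as np

def project_point(world_point, cam_position, cam_rotation, focal_px, principal_px):
    """Project a 3D world point into 2D pixel coordinates with a pinhole
    camera model whose pose comes from the tracking system."""
    # Move the point into camera space (cam_rotation maps world -> camera axes).
    p_cam = cam_rotation @ (world_point - cam_position)
    if p_cam[2] <= 0:
        return None  # the point is behind the camera
    # Perspective divide, scale by focal length, offset to the image centre.
    u = focal_px * p_cam[0] / p_cam[2] + principal_px[0]
    v = focal_px * p_cam[1] / p_cam[2] + principal_px[1]
    return np.array([u, v])

# A camera 5 m behind the world origin, looking straight at it: the origin
# projects to the centre of a 1920x1080 image.
centre = project_point(np.array([0.0, 0.0, 0.0]),
                       np.array([0.0, 0.0, -5.0]),
                       np.eye(3), 1000.0, (960.0, 540.0))
```

Re-running this projection every frame with fresh tracking data is what keeps the content on the LED wall locked to the camera's moving viewpoint.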

This technique holds the key to extending these small sets into large virtual spaces. The disguise workflow can convince the camera that it is seeing something that isn't there: when set extensions are switched on, the virtual world continues beyond the LED screens for the viewer at home. It can be rendered in all directions around the stage, accommodating the largest possible programming scale from the most compact stage.
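As a rough mental model (not disguise's implementation), a set extension is a mask-driven composite: pixels where the camera sees the physical LED wall are kept from the camera feed, and everything outside that region is filled from the same virtual scene rendered for the tracked camera. A toy NumPy sketch:

```python
import numpy as np

def extend_set(camera_frame, virtual_render, led_mask):
    """Composite a set extension: keep the camera's view of the physical
    LED volume where the mask is True, and fill everything outside it with
    the matching virtual render so the world continues beyond the screens."""
    led_mask = led_mask[..., None]  # broadcast the mask over colour channels
    return np.where(led_mask, camera_frame, virtual_render)

# Toy 4x4 frame: the inner 2x2 region stands in for the physical LED wall.
h, w = 4, 4
mask = np.zeros((h, w), dtype=bool)
mask[1:3, 1:3] = True
camera = np.full((h, w, 3), 200, dtype=np.uint8)   # what the camera sees
virtual = np.full((h, w, 3), 50, dtype=np.uint8)   # rendered extension
out = extend_set(camera, virtual, mask)
```

Because the extension is rendered from the same tracked camera position as the wall content, the seam between physical and virtual pixels stays consistent as the camera moves.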

Credit: White Light

3. Bringing augmented reality to the fore

Using the same camera tracking capabilities, disguise can place augmented reality objects in the camera view. By utilising what we call the front plate, Peter was able to introduce an image of a Formula One car in the foreground of the virtual stage. Because it was also reproduced on the LED panels, he was able to see it, direct his attention to it and present it to the audience. The disguise workflow can bring an AR front plate into any environment using a second, possibly different, render engine, which means the home viewer sees the AR as an overlay.

During this workshop, the London team were using Notch to render the car while Unreal Engine dealt with the rich background. Launched with disguise software release 17.4, our RenderStream technology brings these different sources together to create one seamless virtual studio space.
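Conceptually, the final picture is a straight alpha composite: the background plate (here, the Unreal scene as seen in camera) sits behind the AR front plate (here, the Notch render), which carries its own alpha channel. A simplified sketch of that blending step, assuming float images in the 0 to 1 range:

```python
import numpy as np

def composite_front_plate(back_plate, front_rgb, front_alpha):
    """Standard 'over' composite: lay the AR front plate on top of the
    background plate wherever the front plate's alpha is non-zero."""
    a = front_alpha[..., None]            # broadcast alpha over RGB channels
    return a * front_rgb + (1.0 - a) * back_plate

# Toy 2x2 example: a half-transparent white AR layer over a dark background.
back = np.full((2, 2, 3), 0.2)
front = np.ones((2, 2, 3))
alpha = np.full((2, 2), 0.5)
frame = composite_front_plate(back, front, alpha)
```

This is only the blend itself; synchronising frames from two different render engines so the layers line up is the part disguise's workflow handles.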

4. disguise’s modular approach to unifying studio productions

The modular makeup of disguise is how everything comes together, and it doesn't stop with virtual production. In the early days of our relationship with White Light, the disguise workflow was introduced to the team's projects as they began to recognise the need to unify multiple servers, monitors, projectors and lighting controls, and to introduce elements like interviews with speakers in remote locations.

Building on the use of disguise's full broadcast infrastructure, White Light began tracking the positions of monitors displaying continually updated content. The use of camera tracking came during the 2018 FIFA World Cup in Russia, where the team created a set extension around a real window looking out onto Red Square. AR elements were first introduced as part of Eurosport's coverage of the Olympic Winter Games. But the innovations don't stop there.

Credit: White Light

5. Teleportation: the next generation of broadcasting

A brand-new technique pioneered by White Light is teleportation. First used at the 2018 Winter Olympics and deployed as recently as this year's US Open, disguise can instantly transport personalities from one location to another by displaying them inside the LED video wall and in-camera using tracking systems. This on-screen/in-screen magic provides natural eye lines, so presenters can engage with the avatar of the other person and have meaningful conversations with them.

Credit: White Light

Why use disguise for studio production workflows?

Thanks to its modular makeup, producers can scale to multiple cameras without worrying about latency issues between each of those sources. Operators and directors can also see previews of shots even when they are working off just one LED wall. Another benefit of using disguise highlighted by the speakers is its ability to scale rendering power, allowing more nodes to be added for more complex scenes, from one render engine or many. But the real advantage of using disguise, particularly in Andy's mind, lies in its unified hardware and workflow.

To quote White Light's Technical Solutions Director: "Whether you're projection mapping something, doing traditional augmented reality on top of a standard, existing set, building a complicated hybrid set with physical screens that you're perspective tracking with some AR, or you're going all out with a fully immersive xR solution or smart stage, the same hardware, software, workflow and skillset will provide the same stunning results. disguise's technology that you might already have invested in can be repurposed in lots of different ways. Not only that, but it can be added to over time to eventually create these fully immersive xR environments."
