THE UNREAL RIDE

CASE STUDY

A motion control & virtual production collaboration between MRMC & VU Studio at NAB Show 2022

Project Overview

MRMC has been attending NAB Show for over 20 years and has built a strong reputation for putting together show-stopping activations in the central lobby for delegates to enjoy. With the return of live events in 2022 after a hiatus due to COVID, we wanted to come back with a real game-changer that would showcase some of the most innovative technology from the world of camera robotics and beyond. With the unstoppable rise of virtual production in film & TV, we approached one of the true innovators in this space, VU Studio, to put together an idea for an interactive experience: a demonstration of the two companies’ combined technologies that would elevate video production capabilities to unprecedented levels.

The Unreal Ride Concept

Following some fantastic creative sessions with the team at VU Studio, they proposed an interactive user experience called the ‘Unreal Ride’. The idea was to combine the physical and virtual worlds and capture it all using motion control. The Unreal Ride would take delegates into an immersive virtual production experience, capturing a video of themselves moving through a futuristic cityscape virtual environment as if it were the real thing. Participants sat on a custom-made futuristic motorcycle prop against the backdrop of a giant LED screen; captured through the lens, they appeared to be riding through the virtual futuristic cityscape. The Bolt X Cinebot, which at 3.2m has the longest arm reach of MRMC’s high-speed cinema robots, captured the footage with moves programmed in our industry-standard Flair software, demonstrating the versatility of virtual production and motion control. At the end of the ride, the footage, captured with MRMC’s proprietary Showbolt software tool, was delivered to each participant via a code to share on their social media channels.

We were thrilled to partner with Vū Studio at this year’s NAB Show conference. The experience we crafted with them really demonstrated how our motion control technology, coupled with Vū Studio’s Virtual Production LED Volumes, created a hugely exhilarating and unique video production for visitors to the show. The exhibit was a total roadblock, with people queuing day after day to take an Unreal Ride right up until the show closed!

Dan Brooks

Head of Marketing - MRMC

Creating the Unreal Ride - by Ben Meyers, Bolt Operator / DP - VU Studio

The idea was to create a set that felt exciting and interactive. We wanted to gamify the experience. In the traditional green screen environment, everyone has to pretend that they’re in the environment, whereas with real-time ICVFX and LED technology, even the talent can see and feel the environment in real-time. It adds a level of realism to the performance, and that’s what we wanted to showcase. We didn’t hire models or talent to come and perform on our set. We let show attendees hop on the Tron-inspired light bike and show off their acting skills on a real ICVFX shoot.

Since the technology is so new and cutting-edge, we thought it would be fun to put attendees in a futuristic world, something that sparks that child-like sense of imagination. We settled on a Tron-inspired light bike because we knew it would give us an advantage when dealing with different heights. When preparing for a robot shoot, you can load in different jobs for different heights, but these take time to load, and when you’re filtering through hundreds of people per day, that’s not something you can afford to do. By placing people on a motorcycle, we had a lot more control over where the focal points would be, making the entire experience run a lot smoother.

One of the most exciting things about Unreal is its speed. Not just the rendering speed but the speed of building environments as well, thanks to all the talented artists who contribute to the marketplace. The cityscape and neon signs were purchased from the marketplace, and we could modify them to fit our needs relatively quickly. Naturally, we would have loved more time to fine-tune everything, but that’s just how it goes sometimes, and it’s a testament to the true strength of Unreal Engine: speed.

There was a lot of technology featured in this demonstration, and a lot that could go wrong. We considered playing a video on the LED wall instead of using Unreal Engine in real-time but ultimately chose to have Unreal running live to show off the true power of this kind of setup. Thanks to the functionality and well-thought-out design of Flair from MRMC, we were able to send an OSC signal from Flair to a Mac that controlled the surround-sound Genelec speakers. This signal was triggered every time the move was run, so that the music would fade out and the audio track designed for the ride would fade in very precisely. We managed to get the entire experience down to one button push: Flair triggered not only the audio but also the background sequence inside Unreal Engine.
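The cueing described here relies on OSC (Open Sound Control) messages sent over UDP. As a rough illustration of the mechanism, not MRMC's actual implementation, the sketch below builds a spec-compliant OSC message by hand and fires it as a one-shot trigger; the address `/ride/start` and the port are made up for the example:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def build_osc_message(address: str, *floats: float) -> bytes:
    """Build a minimal OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode("ascii"))                      # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))   # type tag string
    for f in floats:
        msg += struct.pack(">f", f)                             # OSC is big-endian
    return msg

def send_trigger(host: str = "127.0.0.1", port: int = 9000) -> None:
    """Fire a one-shot cue over UDP, e.g. each time the move starts running."""
    packet = build_osc_message("/ride/start", 1.0)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (host, port))
    sock.close()
```

A listener on the audio machine would map the address to a fade-out/fade-in action, which is what lets one button push cue sound and picture together.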

As for camera tracking, fortunately, we didn’t have to worry about that thanks to the proprioception of the robot from MRMC. All of their motion control rigs have to know exactly where the camera is in 3D space in order for target tracking to work. This motion data can be piped into Unreal Engine in real-time and linked to a virtual camera through the LiveLink plugin developed by MRMC and RiTE Media. 
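The wire format used by the LiveLink plugin isn't documented here, but the principle of streaming per-frame camera transforms is simple to sketch. The fragment below packs a hypothetical pose packet (frame counter, position, pan/tilt/roll); every field name and the layout are assumptions for illustration, not the plugin's real protocol:

```python
import struct

# Hypothetical packet layout for illustration only: frame counter,
# XYZ position in metres, then pan/tilt/roll in degrees, little-endian.
# The real MRMC / RiTE Media LiveLink plugin defines its own format.
POSE_FORMAT = "<I6f"

def pack_pose(frame, x, y, z, pan, tilt, roll):
    """Serialise one frame of the robot's camera motion data for the network."""
    return struct.pack(POSE_FORMAT, frame, x, y, z, pan, tilt, roll)

def unpack_pose(packet):
    """Recover the transform on the receiving side to drive a virtual camera."""
    return struct.unpack(POSE_FORMAT, packet)
```

Because the robot always knows exactly where the camera is, packets like this can be emitted every frame and applied to the virtual camera with no optical tracking hardware at all.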

Mounted on the Bolt X was the Sony Venice with a Cooke Varotal 30-95mm zoom. This combo played really nicely with the LEDs and allowed us to get all the shots we wanted with beautiful colours and no moiré. Moiré is an artefact that appears when cameras try to record very tight patterns; with LED walls, the space between the pixels can create these artefacts. We’ve found it best to have the walls just slightly out of focus, or to use a lens that blends the pixels together better. Your camera and lens choice are very important in an LED environment, and this combo was a powerhouse for us.
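Moiré is fundamentally an aliasing effect: once the sensor samples the LED pixel grid more coarsely than the Nyquist rate, the fine grid folds down into a slow beat pattern. Defocusing the wall acts as a low-pass filter, removing the fine detail before it is sampled. A small numerical illustration of the folding (plain Python, not camera code):

```python
import math

def sample_pattern(pattern_freq, sample_rate, n):
    """Sample a fine sinusoidal 'pixel grid' at a fixed rate."""
    return [math.sin(2 * math.pi * pattern_freq * i / sample_rate)
            for i in range(n)]

# A grid with 9 cycles per unit, sampled at only 10 samples per unit
# (Nyquist would demand more than 18), folds down to a slow
# 1-cycle-per-unit beat -- the numerical analogue of moire on a wall.
aliased = sample_pattern(9, 10, 20)
```

The sampled values trace a 1-cycle-per-unit wave rather than the true 9-cycle pattern, which is exactly why a fine LED grid can show up on camera as broad stripes.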

By connecting our LED wall to the Bolt X robot’s motion data, we were able to create a virtual environment that is indistinguishable from reality. The possibilities of MRMC’s Motion Control technology paired with our LED Volumes are truly endless; that’s why they are game-changers for the industry.

Daniel Mallek

Director of Content & Innovation - VU Studio

Motion Control & Virtual Production in Perfect Harmony

Flair is an insanely powerful tool that has a lot to offer. The engineers and developers at MRMC have built rigs and software that work really well together and give you so much creative freedom to get exactly what you want. With ICVFX and LED walls, camera tracking is a crucial element: without it, you would have no inner frustum, no parallax, and no depth. Camera tracking allows you to control your virtual camera with your physical camera, and thanks to motion control with Flair, you can send the camera’s positional data to the Unreal PC in real-time, making ICVFX shoots feel more like practical shoots.

When it comes to programming moves, it feels very similar to working in After Effects in a few ways. The approach I’ve always taken is to visualize the move I want to create first and foremost. From there I position the camera with composition in mind and store my first keyframe, then move the camera to each subsequent position, storing keyframes along the way. The tricky part is that you also have to consider where the robot will be in 3D space, making sure to avoid operating near its limits. There are times when I have to move the track base, for example, so that the robot arm can position the camera where I want it.

At NAB, we were operating on a patch of carpet with some sandbags. Keeping this in mind, I was careful to only program moves along the track when necessary, and only when the lens was at a wider field of view. I also kept the moves slower than the rig is capable of. Part of being a good operator is knowing the limitations of the equipment and creating moves that play to its strengths. The Bolt X arm is capable of moving the camera 13’ in all directions and 14’ vertically. Given this extra reach, I chose to use the arm more than the base and kept my fairings smooth so that there weren’t a ton of abrupt starts and stops while the rig was operating on the carpet.

I’m still early in my motion control journey, so the fact that I was able to create this project for NAB in such a short amount of time is a true testament to how user-friendly the experience inside Flair is. It’s a tool that has opened up my creative possibilities, and I’m extremely excited to continue learning more about what it has to offer!
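The "fairings" mentioned here are easing profiles that ramp speed up and down gently between keyframes. Flair's actual curves aren't shown in this piece, but the idea can be sketched with a cubic smoothstep, which gives zero velocity at both keyframes and is what prevents abrupt starts and stops on an unstable surface like carpet:

```python
def smoothstep(t):
    """Ease curve with zero slope at t = 0 and t = 1 (a cubic 'fairing')."""
    return t * t * (3 - 2 * t)

def interpolate_axis(start, end, t):
    """Position along one axis at normalised time t in [0, 1]."""
    return start + (end - start) * smoothstep(t)

# A 2 m track move sampled at 5 points: it leaves the first keyframe
# and arrives at the second gently, accelerating through the middle.
positions = [interpolate_axis(0.0, 2.0, t / 4) for t in range(5)]
```

The small steps at the ends and the larger step through the middle are the signature of an eased move; a linear interpolation would instead start and stop with a jolt.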
