Designed for live sports capture, Polycam One is an innovative, portable, multi-camera workflow solution that gives a single operator the ability to record a match from multiple camera positions simultaneously.
Polycam One maps out the playing area and the locations of up to 12 robotic cameras. The camera operator follows the action with the ‘main’ camera, while Polycam One controls the additional robotic cameras, following the focus of the ‘main’ camera.
MRMC’s Algorithmic Camera Control lets the operator concentrate on covering the action while the Polycam One system provides complementary shots, such as a wide or high tactical view, which can be used for highlights, replays or adjudication.
The robotic cameras are IP connected, allowing the operator, or an additional remote operator, to adjust and control them in real time providing a more dynamic visual experience.
Easily deployed and configured by a single operator, Polycam One is perfect for use in multiple locations and can be set up in venues where conventional camera positions are not available. The portable nature of Polycam One makes it ideal for smaller or temporary venues that may not have the infrastructure to support automated tracking solutions, and also gives sports teams the ability to easily capture footage from multiple angles at training grounds.
Join Us for the Latest MRMC Technology at IBC Show
After a huge show at NAB earlier this year, where we occupied the largest stand we’ve ever had, we’re now preparing for IBC 2018. Partnering with Nikon B.V., this year we’ll have multiple stands demonstrating the latest MRMC technology at IBC, including a highly exciting interactive Bolt display!
Sharing the show with Nikon B.V., our main location will be in Hall 10, stand D.26, where we’ll be featuring intelligent, low-footprint multi-camera robotics solutions for applications such as live events and pop-up studios. We’ll have hands-on demonstrations of Polycam Chat and Polycam Player, used with our IP-controlled Robotic Pod (housing a Nikon D5) and AFC-100s, showcasing both fully automated and user-augmented touchscreen motion control. Additional robotic camera solutions on the stand include the Studiobot (a 6-axis alternative to traditional robotic pedestals and dollies for TV and news studios); the high-payload-capacity Ulti-head (recently used in Fox Sports’ studio coverage of the World Cup); and the lightweight, completely silent Whisper head.
We’ll also be in Hall 12, stand F.11, demonstrating the new high-speed, lighter-weight robot Bolt Jr., which makes its show debut! The compact 6-axis camera robot arm is ideal for smaller studios, film sets or on location. A smaller, lighter alternative to the Bolt, the Bolt Jr. is a perfect solution where space, weight and manoeuvrability are key, and is available on either a pedestal or with track. It will be paired alongside the world’s fastest cinebot, the Bolt!
To see the Bolt Jr. in action, enjoy the short showreel below by the incredibly talented Where It’s Greater.
MRMC’s High Speed 360 rig was used to capture dynamic footage of Andy Murray for a new Jaguar commercial. To allow Andy the freedom to move and strike the ball for the required shots, a specially designed larger platform was built that allowed him to perform safely.
Creative agency Wing London turned to MRMC’s Rentals team when searching for the perfect solution for capturing their vision for a series of Jaguar-sponsored Wimbledon promotional films, featuring some of Britain’s top tennis stars – Andy Murray, Jo Konta and Kyle Edmund.
Shooting at 1,000fps, the camera travelled continuously around Andy, capturing every detail of every movement. The entire spot was shot under a rain machine, and to further enhance the dramatic effect, a light source was mounted to the top of the system, creating a powerful, dynamic look.
For more information about our custom robotic rigs, get in touch.
Capturing precise, dynamic action at high speed with camera movement requires extensive automation. Leaders in visual engineering, RiTE Media Group specialise in combining high-speed filmmaking with ground-breaking robotics, such as MRMC’s Bolt Cinebot, to create breathtaking visuals.
Collaborating with Director and Visual Engineer Steve Giralt, RiTE produced this beautiful short film – ‘Shoemaker’ – a visually arresting examination of the shoemaking process.
Steve Giralt commented: “The Shoemaker is the story of the making of a leather shoe in a visually different way. A big part of my visual engineering storytelling process involves breaking down a thing or idea, examining its parts, and capturing the process of putting them back together. For Shoemaker, I examined the process of making a shoe and created a visual interpretation of that process through my eyes and the eye of the camera.”
RiTE also created a revealing behind the scenes video of the shoot, demonstrating the technical expertise and intricate, bespoke engineering solutions required to capture the amazingly detailed footage.
EXPLAINING THE HOW…
Colin Michael, a Visual Engineer at RiTE Media, explains in more detail how some of the shots were achieved:
The approach is simple. It starts with understanding the shot we are after, breaking it down into its moving parts and working out the physics we need to control. The next step is to design a rig specifically tailored to the particular action to be automated for the shot.
When you’re shooting close up, macro high-speed action, it’s imperative that the camera motion control system and the actions of the subject are totally synchronised. This level of precision is made possible by MRMC’s Flair motion control system.
At RiTE we build our own specialised robots to work in tandem with our MRMC Bolt Cinebot, and the most satisfying part of building one of these bespoke shot-specific rigs is hooking it up to Flair and seeing it come to life. When I’m standing at the controls for the Bolt and our custom automation rigs, I often find myself thinking about how impossible it would be to pull this off by using human hands alone. Human reaction time cannot come close to the precision required to create these automated high-speed shots.
The pencil and X-Acto blade shots required a horizontal motion across a surface as well as an additional axis to control pressure. We used a custom rig that I had initially built to pour and spin beverage bottles and added a regulated pneumatic cylinder to control the pencil and knife pressure. The rig consists of a slider base driven by a 6-amp stepper motor, an MRMC model mover turntable mounted to that base, and a custom 3D-printed mount housing an MDrive 23 motor on the rotation platform, which in turn drives a long rotating shaft. On the end of that shaft, we designed and printed a joint assembly to hold the pencil or knife, along with a small actuator to control the pressure. This assembly was set up in Flair as three auxiliary axes; the actuator was triggered via Flair using MRMC’s trigger box.
THE HOLE PUNCH…
For the hole punch shot, we swapped the pencil & blade portion of the rig with a larger actuator fitted with the hole punch on the end. This allowed us to hover over the leather, punch a hole, move to the next point, punch another, and so on.
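The hover, punch, move cycle described above is easy to picture as a simple loop. Purely as a sketch of the sequencing (not the real control software), assuming hypothetical `move_to` and `punch` callbacks standing in for the rig’s motion and actuator commands, which in reality were driven from Flair:

```python
def punch_holes(move_to, punch, hole_positions):
    """Visit each marked position on the leather and fire the punch.

    move_to(x, y) and punch() are hypothetical stand-ins for the
    rig's motion and actuator commands; the real rig was driven
    from MRMC's Flair software, not Python.
    """
    sequence = []
    for x, y in hole_positions:
        move_to(x, y)                 # hover over the next mark
        sequence.append(("move", x, y))
        punch()                       # fire the actuator to punch a hole
        sequence.append(("punch",))
    return sequence

# Demo: record the sequence instead of driving hardware
seq = punch_holes(move_to=lambda x, y: None, punch=lambda: None,
                  hole_positions=[(0, 0), (5, 0), (10, 0)])
```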
Being able to program the camera motion and the movement of the subject within the same interface was crucial to the success of these shots. This allowed us to spend more time focusing on other important details, such as lighting. Another slider was set up with a Hive 100c light mounted to the platform and we used a long actuator to send the light sliding across the background at very high speeds. This was also triggered using MRMC’s trigger box.
To create the hammer shot, we built a custom rig similar to a hold-down clamping actuator. We used Actobotics parts for the frame, a Festo actuator and a custom 3D-printed clevis fitted with four shaft idler bearings for super-smooth and repeatable hammer strikes.
For the sewing machine footage, we rigged a large Festo actuator to the same surface the machine was bolted down to. This was automated using Flair’s cyclic triggering mode, which lets you repeat on-off trigger signals like a metronome, saving a lot of time by removing the need to manually input each command into a table with corresponding trigger commands. Overall, this project was a huge technical challenge, but we’re all proud of the way it turned out!
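Flair’s cyclic triggering itself is proprietary, but the metronome-like behaviour described above is simple to model. A minimal sketch, assuming a generic `fire(state)` callback in place of the real trigger-box output:

```python
import time

def cyclic_trigger(fire, period_s, duty=0.5, cycles=10):
    """Repeat on-off trigger signals like a metronome.

    fire(state) is a generic stand-in for setting a hardware
    trigger line high or low; period_s is one full on/off cycle.
    """
    on_time = period_s * duty
    off_time = period_s - on_time
    for _ in range(cycles):
        fire(True)              # trigger on (e.g. actuator extends)
        time.sleep(on_time)
        fire(False)             # trigger off (actuator retracts)
        time.sleep(off_time)

# Demo: log the pulse train instead of driving hardware
pulses = []
cyclic_trigger(pulses.append, period_s=0.002, cycles=3)
```

The appeal of doing this in the motion-control software rather than by hand is exactly what the quote describes: one period and cycle count replaces a hand-entered table of trigger commands.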
Director: Steve Giralt | DP: Justin Dombrowski | Visual Engineer: Colin Michael Quinn | Producer: Jacob Kiesgen, Indiana Robbins | Edit: Nadine Mueller | Color: Dario Bigi | Sound Design: Paris Schulman, TJ Dumser, Drew Mullins | Musical Arrangement: Paris Schulman
We had the pleasure of supporting a fantastic new project to create the World’s First 24-hour Gigapixel panorama of the London skyline.
The contact lens provider, Lenstore, teamed up with VR & 360 production company Visualise, who, using the incredible Nikon D850 and our motion control robotic Ulti-head, captured the amazing timelapse from the top of Canary Wharf!
The technical aspects of this project were immense. Pinpoint accuracy was required to stitch the thousands of detailed photos together; without the absolute precision of the repeat passes, the images wouldn’t have seamlessly blended together – which is why the robotic Ulti-Head was ideal for the task!
Similarly, Nikon’s D850 was needed to ensure the highest degree of image sharpness, and its whopping 45.7 megapixels means that you can zoom into any part of the photo and pick up details nearly 5 miles away!
Each panorama is made up of 260 individual photos and is 155 degrees wide (183,944 x 40,060 pixels), which equates to capturing over 7 billion pixels per hour!
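Those figures are easy to verify from the published dimensions:

```python
# Published panorama dimensions from the article
width_px, height_px = 183_944, 40_060
photos_per_panorama = 260

total_pixels = width_px * height_px          # 7,368,796,640
gigapixels = total_pixels / 1e9
stitched_px_per_photo = total_pixels // photos_per_panorama

print(f"{total_pixels:,} pixels = {gigapixels:.2f} gigapixels per panorama")
```

At one panorama per hour that is indeed over 7 billion pixels per hour, and roughly 28 megapixels of the final stitch per source photo, comfortably within the D850’s 45.7 MP frames once you allow for the overlap between adjacent photos.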
The results are incredible – the World’s First 24 Hour Gigalapse panorama!
The project was shot by Henry Stuart from Visualise, who had the following to say:
“Shooting gigapixel photos is hard – we have been shooting them for the Olympics, the World Cup, and for events and places all around the globe. Each panorama is so large that it needs specially built computers to process it. In this case we had to build a special server system and network all of the workstations in our studio to the content so that we could stitch five of the photos at a time. Luckily it was winter, as the heat generated was keeping our whole block warm.
So what makes this different is really its ambition – you would never think that this many gigapixels could be shot at this resolution in one day. On any other panoramic head you would not have the same alignment of pixels, they would all have some give or movement in one direction or another.
There was a team of two of us, taking shifts through the day and night. It was incredibly cold and windy. Each hour we made the trip to the corner of the roof, checked the light, adjusted our settings and set off the camera remotely, then rushed back inside to warm up again. We were in a building control room, sandwiched between all their electrics and air-conditioning controls.
To capture a photo like this you need a really capable camera – we used the Nikon D850. It has this beautiful big sensor and captures a huge range of light and dark (large dynamic range). This is so important when shooting panoramas where one part of the image is bright, such as towards the sun, and another is dark such as over the Thames. We shot everything on the camera’s ‘RAW’ setting, which keeps loads of extra information in the shots that you would usually lose.
The robotic head we used to take the images is from the world of film production; it’s a custom-modified Mark Roberts Motion Control Ulti-Head. This head was programmed to take the 260 photos of each panorama to pixel precision, meaning that each time the panorama is created, even 24 hours later, the pixels have not moved and everything lines up.”