We had the pleasure of supporting a fantastic new project to create the World’s First 24-hour Gigapixel panorama of the London skyline.
The contact lens provider Lenstore teamed up with VR & 360 production company Visualise, who captured the amazing timelapse from the top of Canary Wharf using the incredible Nikon D850 and our robotic motion control Ulti-Head!
The technical aspects of this project were immense. Pinpoint accuracy was required to stitch the thousands of detailed photos together; without the absolute precision of the repeat passes, the images wouldn’t have seamlessly blended together – which is why the robotic Ulti-Head was ideal for the task!
Similarly, Nikon’s D850 was needed to ensure the highest degree of image sharpness, and its whopping 45.7-megapixel resolution means that you can zoom into any part of the photo and pick out details nearly 5 miles away!
Each panorama is made up of 260 individual photos and spans 155 degrees (183,944 x 40,060 pixels) – which equates to capturing over 7 billion pixels per hour!
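For readers who want to check the arithmetic, the figures above can be verified in a few lines. The panorama dimensions, frame count and sensor resolution are from the article; the overlap factor is simply derived from them:

```python
# Figures from the article; the overlap factor is derived, not quoted.
pano_width, pano_height = 183_944, 40_060   # stitched panorama, pixels
frames = 260                                # photos per panorama (one per hour)
d850_pixels = 45_700_000                    # Nikon D850, ~45.7 MP per frame

total_pixels = pano_width * pano_height
print(f"{total_pixels / 1e9:.2f} gigapixels per panorama")  # ~7.37

# Raw capture vs. stitched output shows how much frame overlap
# the stitcher has to work with.
raw_pixels = frames * d850_pixels
print(f"raw capture: {raw_pixels / 1e9:.2f} GP, "
      f"overlap factor ~{raw_pixels / total_pixels:.1f}x")
```

So each hourly panorama is roughly 7.37 gigapixels, comfortably over the 7 billion quoted.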
The results are incredible – the World’s First 24 Hour Gigalapse panorama!
The project was shot by Henry Stuart from Visualise, who had the following to say:
“Shooting gigapixel photos is hard – we have been shooting them for the Olympics, the World Cup, and for events and places all around the globe. Each panorama is so large that it needs specially built computers to process it. In this case we had to build a special server system and network all of the workstations in our studio to the content so that we could stitch five of the photos at a time. Luckily it was winter, as the heat generated kept our whole block warm.
So what makes this different is really its ambition – you would never think that this many gigapixels could be shot at this resolution in one day. On any other panoramic head you would not have the same alignment of pixels, they would all have some give or movement in one direction or another.
There was a team of two of us, taking shifts through the day and night. It was incredibly cold and windy. Each hour we made the trip to the corner of the roof, checked the light, adjusted our settings and set off the camera remotely, then rushed back inside to warm up again. We were in a building control room, sandwiched between all the electrics and air conditioning controls.
To capture a photo like this you need a really capable camera – we used the Nikon D850. It has this beautiful big sensor and captures a huge range of light and dark (large dynamic range). This is so important when shooting panoramas where one part of the image is bright, such as towards the sun, and another is dark such as over the Thames. We shot everything on the camera’s ‘RAW’ setting, which keeps loads of extra information in the shots that you would usually lose.
The robotic head we used to take the images is from the world of film production – technically it’s a custom-modified Mark Roberts Motion Control Ulti-Head. This head was programmed to take the 260 photos of each panorama to pixel precision, meaning each time the panorama is created, even 24 hours later, the pixels have not moved and everything lines up.”
PRESS RELEASE – AFC100 Head – Accurate, Fast, Compact 3G HD Broadcast head released
As part of its expansion of broadcast-related products and services, Mark Roberts Motion Control are releasing the AFC100 compact robotic head – the world’s first broadcast head with the option of full 3G HD-SDI capable slip rings allowing unlimited, cable-free panning, as well as full Ethernet control.
The new head has evolved from the success of the company’s range of versatile robotics, including the Robo Head, SFH-30, SFH-50 and Ulti-Head.
Offered with next generation slip rings, near silent operation and precision servos, the new head is aimed at servicing a wide range of demanding creative applications. A bespoke version of the AFC100 has already been commissioned for a medical facility, employing nine networked MRMC heads and a 3G HD workflow.
Managing Director, Assaff Rawner, commented, “This new head is testament to our focus on delivering high-end technology that meets new TV production standards, whilst ensuring we continue to provide a tailored and personalised service for our customers.”
MRMC products and staff will be at BVE London (Stand K20 – Nikon) and NAB 2013.
The next instalment in our series of mini-tutorials has just been uploaded to the Mark Roberts Motion Control YouTube channel. This time the video features a tutorial on the joystick controller (MSA-20).
The joystick controller allows our heads to be directly controlled as a remote or repeat head. The console can record multiple moves and store them in its memory for later playback. It is a 3-axis joystick, and additional inputs exist should the user wish to control more than 3 motors. The controller is compact and lightweight for easy transportation and can be battery powered just like the heads. It only requires power and a single small data lead to the head, so cabling between the controller and the head is very simple.
Please click on the picture below to see the Video.
For more information please contact Mark Roberts Motion Control:
+44 1342 838 000
Mark Roberts Motion Control were in LA once more for the recent CineGear Expo at Paramount Studios.
If you came to the Expo, thank you for visiting us on the MRMC / Camera Control Inc stand and speaking with our representative from Mark Roberts Motion Control Ltd. We trust you found the show interesting and informative.
On display on our booth were some of our smaller heads and track systems, including the SFH-50 pan-tilt head with MSA-20 handwheels control, the SFH-50 with roll via joystick control, and an SFH-30 with Monorail track capable of diagonal, vertical or horizontal motion. All of these systems are ideal for smaller HD cameras such as the RED One and Epic, the Alexa, or DSLRs for live action motion control, remote camera work, time-lapse and many other applications.
If you require additional information or a quote for any of our motion control rigs, heads or other equipment or would like a copy of our show-reel, please let us know.
Around 50% of our work is custom built to bespoke requirements, so please do not hesitate to get in touch should you have any further questions about the equipment you saw at the expo or any of our other products.
Contact: Mark Roberts Motion Control
T: 01342 838 000
Mark Roberts Motion Control are attending Cine Gear Expo again in 2012.
Look out for an exciting display of small heads and accessories, all available right now. Some of these heads (15 in fact) are being used in the upcoming international sporting spectacle taking place in London, England.
Cine Gear will be taking place at Paramount Studios Hollywood, CA.
We look forward to seeing you there. You can meet a representative from Mark Roberts Motion Control at booth 60.
Mark Roberts Motion Control are leaders in the world of motion control products, and have been supplying cutting-edge rigs and equipment to the industry for nearly 50 years*. Mark Roberts Motion Control rigs are now the de facto systems used around the world for feature films, music videos and commercials, with broadcast solutions and other related industries now also selecting MRMC rigs and heads as their first choice.
We are proud to be the official UK supplier of robotic equipment to Nikon at the Olympics.
Famously, our rigs and heads have been used by Aardman in their feature films and TV hit Wallace & Gromit, as well as more recent productions such as The Pirates! In an Adventure with Scientists.
Our goal is to broaden the horizons of film and television production by supplying motion control equipment that enables creative moves to be entered quickly and easily, greatly increasing creative potential.
To that end we are constantly seeking out and working with companies that push the envelope and are playing a major part in the shaping of Motion Control and Motion Control Techniques.
Stop Motion Pro worked very closely with us in developing the interfacing of the S4, and with their input the MRMC S4 stereoscopic stepper is now fully compatible with Stop Motion Pro.
Flair, the Ultimate Camera Control Software, was developed in-house and has been used for many years by moco operators and professionals.
“Unlike most other motion control software, Flair is written for film-making and is a winner from a film-maker’s point of view. Camera moves are easily programmable and highly adjustable, with a variety of options to change the move’s parameters. The front-end software is clear and the graphic displays are very helpful.”
— Oliver Kunz, Studio Chief, Brains and Pictures, Vienna
The perfect partner to our pan/tilt heads and rigs has just been released fully enabled for Ethernet control, making it more accessible to environments such as TV broadcast and both studio-based and outside filming.
Tim Richardson, a key member of the original line-up that released Flair and newly returned to our development team, recently announced:
“These days we sell Flair software ready to install on a laptop plus the head itself. It really is as simple as plug-and-play”.
*Founded in 1966, the company was originally called Mark Roberts Film Services.
Mark Roberts Motion Control will be returning to Annecy 2012, the annual showcase and celebration of professionalism and dedication to the world of stop-motion animation.
We attended for the first time only a couple of years ago, and are very proud and delighted to announce we will be returning again this year.
The Animoko rig will once again be making an appearance, as will the S4 stereoscopic stepper. Both have proved extremely popular in the production of feature-length movies, and the Animoko more recently in still life fashion shoots.
We will also be bringing the SFH-30 pan/tilt head with Monorail slider to the show. It has been extremely popular this year, with many heads being used on the worldwide stage at perhaps the greatest sporting event in this year’s calendar, hosted by London in 2012.
You will find us together with Stop Motion Pro at the Annecy Festival 6th to 8th June. We look forward to seeing you there.
Please do get in touch if you have any questions or would like to book a demonstration at the Mark Roberts Motion Control Studios.
We attended the Broadcast Video Expo at Earls Court, London earlier this year. If we saw you there, we appreciated the chance to meet with you and show some of our products, including the SFH-30 pan-tilt head with joystick and pan bar controller.
We also showcased the SFH-30 head with joystick console and pan bars at the Focus on Imaging exhibition in Birmingham, where the Nikon booth was simply packed out, as seen in the shot below:
More recently we were with Nikon at the Counter Terror Expo in London.
The equipment we had on display at the expo was the SFH-30 pan-tilt remote head carrying a D4 with a 400mm zoom for surveillance-type applications; it is also available in a silent or “stealth” version. We hope you found it an interesting item with some potential for your own applications.
Around 50% of our work is custom built to bespoke requirements, so please do not hesitate to get in touch should you have any further questions about the equipment you may have seen at any of these exhibitions, or any of our other products.
As part of our ongoing development of equipment being used in this year’s Olympics, here is a sneak peek of the early underwater prototype, designed to go in the swimming and diving pools. It will also work in the sea for underwater filming or time-lapse. Not shown on this model is the underwater focus control.
The 7 Uses of Motion Control
(This is the full article previously published in parts)
Did we say 7 uses? Well, that’s not quite true. There are in fact countless – too many to list in a newsletter in detail – but there are generally 7 main reasons for using motion control. These categories are listed below and we will be covering them all in greater detail.
The best known category of use is Repeat Moves, and it is the basis for all motion control. A good quality motion control system can repeat any camera movement with extreme accuracy, countless times. Once a camera can repeat the movement, a whole range of effects can be created.

One of the simplest effects is making elements appear or disappear. This is done by filming, for example, an empty room and then filming again with the motion control camera, but this time with an actor or some furniture in the room. One can then take the two shots and easily mix between them to make it appear that the furniture or actor is appearing or disappearing. Another possibility is to shoot the room again but this time have the same actor stand in a different position. Now in the finished shot one has effectively duplicated the actor, so there are several copies of him on the screen.

What about a crowd? Crowd replication is the same idea applied to a battlefield or a busy street scene where one requires hundreds of actors, or maybe identical cars or planes. Why not take a handful of actors or one plane and simply shoot the same scene again and again with the actors or plane in different positions? Then, when compositing, it is easy to make it look like a huge number of people or objects were involved.

If one requires the background to change instead of the foreground, one could shoot an actor on green-screen and then use the motion control system to shoot a room and a street. Now it becomes very simple in post-production to make it appear that the background is changing for the actor from a room to a street.

Motion control is also used when filming animals with other animals or humans in ways that would otherwise be impossible to achieve. For example, imagine creating a shot of a baby playing next to a group of real lions, or a leopard walking next to a zebra.
Additionally, where a shot can only be done with an animal trainer controlling the animal, it is possible to shoot a “clean pass” of the scene. Once one has a clean pass, it is a very simple and almost automated process to remove the trainer from the scene and leave only the animal. Because Mark Roberts Motion Control systems are designed to be ultra accurate, one can shoot at different film speeds and combine the shots seamlessly. For example, one could have an actor walking and talking at live-action speed while everyone around him is moving in slow motion or high speed. The repeatability of the motion control camera means you can shoot at 4 FPS (frames per second) and then again at 120 FPS, and when combining the two it will be impossible to see a join. Another area covered by the Repeat Moves category is animation and mixing animation with live action.
Another form of “scaling” is often used that does not actually involve a difference in scale, only in position or orientation. It is possible to tell the software to rotate the move in any direction. For example, if one has a box and one wants to make it appear that people are walking on many of its different sides, then one shoots a scene with an actor walking on top of the box. Then one shoots again, telling the motion control rig to rotate the move by 90 degrees. Those two takes can then be seamlessly composited to make it appear that one actor (or even the same actor) was walking on two different sides of the box, completely defying gravity. This trick could be done with staircases, or with rooms, having people walking on the walls and ceiling (as can be seen on the Showreel – Motion Control Explained).
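At its core, rotating a move means applying the same rigid rotation to every waypoint of the programmed path. Here is a minimal sketch of that idea (Flair’s actual transform also handles camera orientation and lens data; the function name and axis choice here are illustrative only):

```python
import math

def rotate_move(waypoints, angle_deg):
    """Rotate a list of (x, y, z) camera positions about the Z axis.

    A hypothetical helper illustrating the "rotate the move" trick;
    not Flair's real API.
    """
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    # Standard 2D rotation applied to x/y; height (z) is unchanged.
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in waypoints]

# A two-waypoint move along the X axis, half a metre off the ground...
move = [(1.0, 0.0, 0.5), (2.0, 0.0, 0.5)]
# ...becomes the identical path swung 90 degrees round, ready for the
# second pass that will be composited against the first.
rotated = rotate_move(move, 90)
```

Because the rotated path is geometrically identical to the original, the two passes line up pixel for pixel when composited.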
Another use of controlled moves is when using complex or even macro lenses. For some shots, one requires a snorkel lens (a lens with a right-angle turn in it) or a long borescope lens to get into very tight spots or in between scale models (e.g. models of buildings). Controlling a camera with such a lens by hand is very complex, but by using a motion control rig one can program the move very quickly and then get the shot. The move is easily adjusted and edited to get the exact move required. Similarly, when shooting very close to an object with a macro lens, all camera movements become greatly magnified. It becomes quite hard to move a camera by hand without the shot looking unsteady, wobbly or shaky. A motion control rig, however, controls the camera movement very accurately, down to fractions of millimetres, so extreme close-ups are easy to program and shoot, and of course the lighting can also be adjusted to suit the shot.
Frame-by-frame animation, although discussed in Part 1 – Repeat Moves – is also closely related to the controlled moves use of motion control. The whole camera move is programmed into the motion control rig, and all the focus and lighting adjustments are set up as one continuous move before shooting the scene using stop-frame animation.
Mark Roberts Motion Control is the only company in the world to design and create all the parts of a motion control system, including the mechanics and the software. This means that features have been built into the Flair software that are not possible on any other system. It has such an accurate model of the mechanics that it also gets used for real-time applications such as Virtual Studios or on-the-set graphics previews, where the CGI elements are added in real-time to the live video footage, according to the real-time XYZ data from Flair.
One could film an actor on a live set and use the CGI export data to add foreground elements for the actor to interact with, or one could add additional background elements. Adding background elements is sometimes referred to as digital matte painting, where a graphics artist creates a model of scenes in the distance that don’t require as much detail as the scenes shot with a camera in the foreground. This is often used for feature films such as Gladiator, A Knight’s Tale, or Star Wars, where a live scene is filmed, sometimes using a scale model, and then the background is changed to show an amazing city, skyline, or mountain. Because the camera XYZs are known precisely, once the digital matte painting is created, adding it with the right scale and perspective is easy.
When exporting CGI data, motion control moves do not have to be pre-programmed. One may need to accurately follow an actor or an animal, or the director may want a very erratic, “human” or Steadicam-looking move. These are all easily done using remote handwheels or the popular Grip-Sticks, which allow a Director of Photography to simply push the motion control camera by hand, as if it were hand-held or on a crane, all the while recording the movement and the camera position for export to CGI.
CGI Import is the term given to any move data that is transferred from 3D CGI (Computer Generated Imagery) software to a motion control camera. Because the Flair motion control software has a very accurate inverse-kinematics model of the rig moving the camera, including the exact parameters of the optical lens, it is very simple for it to move the camera to any 3D position in space (referred to as an XYZ position).
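To give a flavour of what the simplest possible version of this looks like, here is a sketch of inverse kinematics for a bare pan/tilt head: given the camera position and a target XYZ, it returns the pan and tilt angles that point the lens at the target. Flair’s real model is far richer (rig geometry, nodal offsets, lens parameters), so treat the function below as an illustration only:

```python
import math

def pan_tilt_to_target(cam, target):
    """Return (pan, tilt) in degrees aiming a simple pan/tilt head at target.

    Hypothetical helper for illustration; not Flair's actual solver,
    which also models rig geometry, nodal offsets and the lens.
    """
    dx, dy, dz = (t - c for t, c in zip(target, cam))
    pan = math.degrees(math.atan2(dy, dx))                 # rotation about vertical
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation angle
    return pan, tilt

# A camera at head height aiming at a target level with it,
# 3 m forward and 3 m to the side:
pan, tilt = pan_tilt_to_target((0.0, 0.0, 1.5), (3.0, 3.0, 1.5))
# pan = 45.0 degrees, tilt = 0.0 degrees
```

Feeding a sequence of such XYZ targets from the CGI package through a solver like this, frame by frame, is essentially what turns an animated camera into a physical move.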
Effectively, one can plan and storyboard a whole shot in software packages such as Softimage™ or Maya™, and then arrive on set and have the camera achieve the exact same shot. Because moves have been created in a graphics environment, complex effects can be achieved. Additionally, because everything about the shot is known beforehand, there is less waste on set (by not building oversize sets or backgrounds, getting only the exact lights and grip equipment required, and having fewer unknowns occur on set), thereby reducing actual production costs.
This whole action of pre-planning moves is referred to as pre-visualisation, and is becoming commonplace on large feature film productions as well as commercials and music promos.
Pictures courtesy of Pixel Liberation Front
Panic Room is one of many recent productions, including Harry Potter, K-19, Dinotopia, Black Hawk Down, and Spider-Man, that make heavy use of pre-visualisation. Using Softimage|XSI, Pixel Liberation Front managed to create an animated CG version of the 105-page script of Panic Room for David Fincher’s latest Columbia TriStar movie. It was used as the primary animation tool to pre-visualise the entire set, camera positions and movements, as well as character moves, in order to facilitate the actual shoot and production of the film. It was a creative tool for Fincher, and it provided an enormous amount of detailed technical and pre-production information for every aspect of making the movie – from art direction to music scoring to set construction.
Ron Frankel, PLF’s previsualization supervisor explains:
“David Fincher’s goal was to produce a coherent 3-D animated and edited cut of the film before the shooting even began. The scenes we created with XSI included the set, characters, key props and set dressing. We were able to animate the scenes with XSI to indicate both character blocking and camera movement, giving a sense of how individual shots would look before they were filmed and how they would be edited together into a sequence. The 3-D scenes were highly interactive, allowing film director David Fincher to refine actor blocking, timing, camera position and other factors, and immediately see the results. The animation was also used to derive a wealth of information relative to set construction, camera equipment needs, set dressing and so forth, making it an invaluable asset to the production crew. This saved time and money that would otherwise have been spent on set configuration, camera placement and rehearsals. ”
Motion control can be used to match camera array shots. Camera array shots are also known as frozen moment, time-slicing or bullet-time shots (made famous in The Matrix). Because the camera array represents a moving camera path, the same path can be defined in a motion control move. This allows all of the other effects that are possible with motion control to be combined with frozen moments. For example, a live action pass filmed with motion control allows for the insertion of a moving person into a frozen scene.
Motion control can also be used to get into and out of frozen moment shots seamlessly. A camera move can begin with a motion control move and switch at some point to the camera array. The motion control system moves the motion picture camera’s position from a start position to the first position of the camera array, at which point the camera array is triggered. In post production a straight cut joins the two shots. Dayton Taylor’s company, Digital Air (http://www.virtualcamera.com), has produced a special Arri-mount lens with a mirror in front of it that allows a motion picture camera’s point of view to connect with his Timetrack camera without the two cameras crashing into each other at the transition point.
Because frozen moment shots often use interpolation to create in-between frames, the number of cameras in the array does not necessarily equal the number of frames in the final shot. Interpolation is used to overcome camera spacing limitations. Frames can be interpolated that appear to have been recorded from camera positions between the actual cameras. When integrating camera array shots with motion control, scenes can be designed that take this into account. For example, even a small (ten camera) array can produce a significant spatial perspective shift. If the ten camera array shot is interpolated to sixty frames then the corresponding match move motion control shot should record all sixty frames, not just the ten frames that match the camera array’s individual camera positions. This is particularly useful because it allows interpolation to be used only on the elements which require it and motion control can be used to produce the footage of the other elements in the shot such as background and moving elements.
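The position maths behind the ten-cameras-to-sixty-frames example above can be sketched in a few lines. Real time-slice pipelines interpolate the images themselves (e.g. via optical flow); the sketch below only shows the virtual camera positions that the matching sixty-frame motion control move would have to follow (the function name and layout are illustrative assumptions):

```python
def interpolate_positions(camera_positions, frames_out):
    """Linearly interpolate virtual camera positions along a sparse array.

    Illustrative only: real time-slicing interpolates images, not just
    positions. Requires frames_out >= 2.
    """
    n = len(camera_positions)
    out = []
    for i in range(frames_out):
        t = i * (n - 1) / (frames_out - 1)   # fractional index along the array
        j = min(int(t), n - 2)               # segment between camera j and j+1
        f = t - j                            # fraction within that segment
        a, b = camera_positions[j], camera_positions[j + 1]
        out.append(tuple(av + f * (bv - av) for av, bv in zip(a, b)))
    return out

# Ten real cameras spaced 0.5 m apart at lens height 1.2 m...
cams = [(x * 0.5, 0.0, 1.2) for x in range(10)]
# ...yield sixty match-move positions for the motion control pass.
frames = interpolate_positions(cams, 60)
```

The match-move pass then records all sixty positions, so elements shot with motion control line up with both the real and the interpolated array frames.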
The photos below show Digital Air’s camera array being used with the Milo motion control rig, for a Schweppes commercial. Director Jon Hollis from Smoke and Mirrors, London, used Timetrack cameras for product shots and a Milo for matched moves of the background, which included the commercial’s main character, a talking leopard:
Specific Music Video Effects – Audio Timecode Triggering
A similar in-studio effect is achieved in OutKast’s recent award-winning “Hey Ya!” video. Another example is where very fast camera moves are required and the speed of the timecode is actually slowed down during shooting. When the footage is then played back, the actors appear to be singing in real time but the camera motions around them are extremely fast.