At MRMC we are proud of our major new product releases, and we concentrate so hard on them that we often forget the significance of the many incremental feature additions and product upgrades that happen daily but may not make headline news or create a flurry of press releases. Recently we looked back through our work over the past months and realised those small changes amounted to a lot, so we thought we'd list some. If you want more information on any of them, just let us know:
- Development and testing to make the Bolt High Speed rig 20% faster than before.
- Developing and testing the new Version 6.0 of our Academy Award-winning motion control software.
- Turntable range expanded from small to large, with most models reaching speeds in excess of 100 degrees per second.
- Flair multi-controller created, allowing multiple motion control rigs to be controlled from a central source.
- Orbital XL rig created for super-sized e-commerce photography.
- Head and Flair Ethernet API created allowing any of our products to be controlled directly from 3rd party software.
- Improved or implemented interfacing to a number of 3rd party products, including DMX lighting controls, Dragonframe, 6-degrees-of-freedom joysticks, and various 3D tracking systems.
- Interfacing to Oculus Rift.
- Broadcast vertical columns for on-air moves.
- Etc. etc.
The list is long and this is just a snapshot. With 6 full-time developers and several support engineers and freelancers, development is rapid and ever-changing, so if you don't immediately see what you need, just ask.
Tel: +44 (0)1342 838000 | Email: email@example.com
Mark Roberts Motion Control will be returning to Annecy 2012, the annual showcase and celebration of professionalism and dedication to the world of stop-motion animation.
We attended for the first time only a couple of years ago, and are very proud and delighted to announce we will be returning again this year.
The Animoko Rig will once again be making an appearance, as will the S4 Stereoscopic Stepper. Both have proved extremely popular in the production of feature-length movies, and the Animoko has recently been used for still-life fashion shoots as well.
We will also be bringing the Pan/Tilt Head SFH-30 with Monorail slider to the show. It has been extremely popular this year, with many heads being used on the world stage at perhaps the greatest sporting event in this year's calendar, hosted by London in 2012.
You will find us together with Stop Motion Pro at the Annecy Festival 6th to 8th June. We look forward to seeing you there.
Please do get in touch if you have any questions or would like to book a demonstration at the Mark Roberts Motion Control Studios.
The 7 Uses of Motion Control
(This is the full article previously published in parts)
Did we say 7 uses? Well, that's not quite true. There are in fact countless, too many to list in detail in a newsletter, but there are generally 7 main reasons for using motion control. These categories are listed below and we will cover each of them in greater detail.
The best-known category of use is Repeat Moves, and it is the basis for all motion control. A good quality motion control system can repeat any camera movement with extreme accuracy, countless times. Once a camera can repeat a movement, a whole range of effects can be created.

One of the simplest effects is making elements appear or disappear. This is done by filming, for example, an empty room and then filming again with the motion control camera, but this time with an actor or some furniture in the room. One can then take the two shots and easily mix between them to make it appear that the furniture or actor is appearing or disappearing. Another possibility is to shoot the room again but have the same actor stand in a different position. In the finished shot one has effectively duplicated the actor, so there are several copies of him on the screen.

What about a crowd? Crowd replication is the same idea applied to a battlefield or a busy street scene where one requires hundreds of actors, or maybe identical cars or planes. Why not take a handful of actors or one plane and simply shoot the same scene again and again with the actors or plane in different positions? When compositing, it is then easy to make it look like a huge number of people or objects were involved.

If one requires the background to change instead of the foreground, one could shoot an actor on green screen and then use the motion control system to shoot a room and a street. It then becomes very simple in post-production to make it appear that the background is changing for the actor from a room to a street.

Motion control is also used when filming animals with other animals or humans in combinations that would otherwise be impossible to achieve. For example, imagine creating a shot of a baby playing next to a group of real lions, or a leopard walking next to a zebra.
Additionally, where a shot can only be done with an animal trainer controlling the animal, it is possible to shoot a “clean pass” of the scene. Once one has a clean pass, it is a very simple and almost automated process to remove the trainer from the scene and leave only the animal. Because Mark Roberts Motion Control systems are designed to be ultra accurate, one can shoot at different film speeds and combine the shots seamlessly. For example, one could have an actor walking and talking at live-action speed while everyone around him is moving in slow motion or high speed. The repeatability of the motion control camera means you can shoot at 4 FPS (frames per second) and then again at 120 FPS, and when combining the two it will be impossible to see a join. Another area covered by the Repeat Moves category is animation, and mixing animation with live action.
Another form of “scaling” is often used that does not actually involve a difference in scale, only in position or orientation. It is possible to tell the software to rotate the move in any direction. For example, if one has a box and wants to make it appear that people are walking on several of its sides, one shoots a scene with an actor walking on top of the box, then shoots again after telling the motion control rig to rotate the move by 90 degrees. Those two takes can then be seamlessly composited to make it appear that one actor (or even the same actor) was walking on two different sides of the box, completely defying gravity. This trick can be done with staircases, or with rooms, having people walk on the walls and ceiling (as can be seen on the Showreel – Motion Control Explained).
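To make the geometry concrete, here is a minimal sketch of what rotating a programmed move by 90 degrees means: each recorded camera position is passed through a rotation about a chosen world axis. This is an illustration only, not Flair's actual implementation; `rotate_move` is a hypothetical helper.

```python
import math

def rotate_move(path, axis="x", angle_deg=90.0):
    """Rotate a list of (x, y, z) camera positions about a world axis.
    A 90-degree rotation about x maps a 'top of the box' pass onto
    one of its sides, as described above. Illustrative sketch only."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    out = []
    for x, y, z in path:
        if axis == "x":
            out.append((x, y * c - z * s, y * s + z * c))
        elif axis == "y":
            out.append((x * c + z * s, y, -x * s + z * c))
        else:  # "z"
            out.append((x * c - y * s, x * s + y * c, z))
    return out

# First pass: actor walks on top of the box; second pass reuses the
# same move rotated 90 degrees about x for the side of the box.
top_pass = [(0.0, 1.0, 2.0), (0.5, 1.0, 2.0)]
side_pass = rotate_move(top_pass, axis="x", angle_deg=90.0)
```

The same idea extends to any rotation angle or axis, which is why the trick works for staircases and ceilings as well as boxes.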
Another use of controlled moves is when using complex lenses or even macro lenses. For some shots one requires a snorkel lens (a lens with a right-angle turn in it) or a long boroscope lens to get into very tight spots or in between scale models (e.g. models of buildings). Controlling a camera with such a lens by hand is very complex, but by using a motion control rig one can program in the move very quickly and then get the shot. The move is easily adjusted and edited to get the exact move required. Similarly, when shooting very close to an object with a macro lens, all camera movements become greatly magnified. It becomes quite hard to move a camera by hand without the shot looking unsteady, wobbly or shaky. But a motion control rig controls the camera movement very accurately, down to fractions of millimetres, so extreme close-ups are easy to program and shoot, and of course the lighting can also be adjusted to suit the shot.
Frame-by-frame animation, although discussed in Part 1 (Repeat Moves), is also closely related to the controlled moves use of motion control. The whole camera move is programmed into the motion control rig, and all the focus and lighting adjustments are set up as one continuous move before shooting the scene using stop-frame animation.
Mark Roberts Motion Control is the only company in the world to design and create all the parts of a motion control system, including the mechanics and the software. This means that features have been built into the Flair software that are not possible on any other system. It has such an accurate model of the mechanics that it also gets used for real-time applications such as Virtual Studios or on-the-set graphics previews, where the CGI elements are added in real-time to the live video footage, according to the real-time XYZ data from Flair.
One could film an actor on a live set and use the CGI export data to add foreground elements for the actor to interact with, or one could add additional background elements. Adding background elements is sometimes referred to as digital matte painting, where a graphics artist creates a model of scenes in the distance that don't require as much detail as the scenes shot with a camera in the foreground. This is often used for feature films, such as Gladiator, A Knight's Tale, or Star Wars, where a live scene is filmed, sometimes using a scale model, and the background is then changed to show some amazing city, skyline, or mountain. Because the camera XYZs are known precisely, once the digital matte painting is created, adding it with the right scale and perspective is easy.
When exporting CGI data, motion control moves do not have to be pre-programmed. One may need to accurately follow an actor or an animal, or the director may want a very erratic, “human” or steadicam-looking move. These are all easily done using remote handwheels or the popular Grip-Sticks, which allow a Director of Photography to simply push the motion control camera by hand, as if it were hand-held or on a crane, all the while recording the movement and the camera position for export to CGI.
CGI Import is the term given to any move data that is transferred from 3D CGI (Computer Generated Images) software to a motion control camera. Because the Flair motion control software has a very accurate Inverse-Kinematics model of the rig moving the camera, including the exact parameters of the optical lens, it is very simple for it to move the camera to any 3D position in space (referred to as an XYZ position).
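As an illustration of the inverse-kinematics idea, the sketch below computes the pan and tilt angles a simple two-axis head would need to point the lens at a given XYZ target. This is a deliberate simplification: Flair's real model solves every axis of the rig plus the lens parameters, and `aim_pan_tilt` is a hypothetical helper, not a Flair function.

```python
import math

def aim_pan_tilt(head, target):
    """Minimal inverse-kinematics sketch for a two-axis pan/tilt head:
    given the head position and a target (x, y, z), return the pan and
    tilt angles in degrees that point the lens at the target."""
    dx = target[0] - head[0]
    dy = target[1] - head[1]
    dz = target[2] - head[2]
    pan = math.degrees(math.atan2(dy, dx))                    # rotation about vertical
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation up/down
    return pan, tilt

# Point a head at the origin towards a target 1m right, 1m forward:
pan, tilt = aim_pan_tilt((0.0, 0.0, 0.0), (1.0, 1.0, 0.0))
```

A full rig model repeats this kind of solve for track, lift, arm, rotate and lens axes simultaneously, which is what lets the software place the camera at any imported CGI position.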
Effectively, one can plan and storyboard a whole shot in software packages such as Softimage(tm) or Maya(tm), and then arrive on set and have the camera achieve exactly the same shot. Because the moves have been created in a graphics environment, complex effects can be achieved. Additionally, because everything about the shot is known beforehand, there is less waste on set (no oversize sets or backgrounds are built, only the exact lights and grip equipment required are hired, and fewer unknowns occur on set), thereby reducing actual production costs.
This whole action of pre-planning moves is referred to as Pre-visualisation, and is becoming common place on large feature film productions as well as commercials and music promos.
Pictures courtesy of Pixel Liberation Front
The Panic Room is one of many recent productions, including Harry Potter, K19, Dinotopia, Black Hawk Down, and Spiderman, that make heavy use of pre-visualisation. Using Softimage|XSI, Pixel Liberation Front managed to create an animated CG version of the 105-page script of Panic Room for David Fincher's latest Columbia Tri-Star movie. It was used as the primary animation tool to pre-visualize the entire set, camera positioning and movements, as well as character moves, in order to facilitate the actual shoot and production of the film. It was a creative tool for Fincher and it provided an enormous amount of detailed technical and pre-production information for every aspect of making the movie – from art direction to music scoring to set construction.
Ron Frankel, PLF’s previsualization supervisor explains:
“David Fincher’s goal was to produce a coherent 3-D animated and edited cut of the film before the shooting even began. The scenes we created with XSI included the set, characters, key props and set dressing. We were able to animate the scenes with XSI to indicate both character blocking and camera movement, giving a sense of how individual shots would look before they were filmed and how they would be edited together into a sequence. The 3-D scenes were highly interactive, allowing film director David Fincher to refine actor blocking, timing, camera position and other factors, and immediately see the results. The animation was also used to derive a wealth of information relative to set construction, camera equipment needs, set dressing and so forth, making it an invaluable asset to the production crew. This saved time and money that would otherwise have been spent on set configuration, camera placement and rehearsals. ”
Motion control can be used to match camera array shots. Camera array shots are also known as frozen moment, time-slicing or bullet-time shots (made famous in The Matrix). Because the camera array represents a moving camera path, the same path can be defined as a motion control move. This allows all of the other effects that are possible with motion control to be combined with frozen moments. For example, a live action pass filmed with motion control allows for the insertion of a moving person into a frozen scene.
Motion control can also be used to get into and out of frozen moment shots seamlessly. A camera move can begin with a motion control move and switch at some point to the camera array. The motion control system moves the motion picture camera's position from a start position to the first position of the camera array, at which point the camera array is triggered. In post-production a straight cut joins the two shots. Dayton Taylor's company, Digital Air (http://www.virtualcamera.com), has produced a special Arri-mount lens with a mirror in front of it that allows a motion picture camera's point of view to connect with his Timetrack camera without the two cameras crashing into each other at the transition point.
Because frozen moment shots often use interpolation to create in-between frames, the number of cameras in the array does not necessarily equal the number of frames in the final shot. Interpolation is used to overcome camera spacing limitations. Frames can be interpolated that appear to have been recorded from camera positions between the actual cameras. When integrating camera array shots with motion control, scenes can be designed that take this into account. For example, even a small (ten-camera) array can produce a significant spatial perspective shift. If the ten-camera array shot is interpolated to sixty frames then the corresponding match move motion control shot should record all sixty frames, not just the ten frames that match the camera array's individual camera positions. This is particularly useful because it allows interpolation to be used only on the elements which require it, and motion control can be used to produce the footage of the other elements in the shot such as background and moving elements.
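The frame-count arithmetic above can be sketched as follows. Real view interpolation synthesises in-between images, but for the matching motion control pass what matters is the in-between camera positions; `interpolate_positions` is a hypothetical helper that expands the ten physical camera positions to the sixty virtual positions the match-move pass would record.

```python
def interpolate_positions(cams, n_frames):
    """Linearly interpolate a short list of (x, y, z) camera positions
    (one per physical camera in the array) up to n_frames positions,
    so a matching motion control pass can record every in-between
    frame, not just the frames at the physical camera positions."""
    out = []
    for i in range(n_frames):
        t = i * (len(cams) - 1) / (n_frames - 1)  # fractional position along the array
        j = min(int(t), len(cams) - 2)            # segment index
        f = t - j                                 # blend factor within the segment
        a, b = cams[j], cams[j + 1]
        out.append(tuple(a[k] + (b[k] - a[k]) * f for k in range(3)))
    return out

# Ten cameras spaced 1m apart on a line, expanded to a 60-frame move:
array_cams = [(float(i), 0.0, 1.5) for i in range(10)]
match_move = interpolate_positions(array_cams, 60)
```

A real array follows a curved path and the interpolation may be non-linear, but the principle is the same: the motion control path is sampled at the final frame count, not the camera count.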
The photos below show Digital Air’s camera array being used with the Milo motion control rig, for a Schweppes commercial. Director Jon Hollis from Smoke and Mirrors, London, used Timetrack cameras for product shots and a Milo for matched moves of the background, which included the commercial’s main character, a talking leopard:
Specific Music Video Effects – Audio Timecode Triggering
A similar in-studio effect is achieved in the recent award-winning Outkast video Hey-Ya. Another example is where very fast camera moves are required and the speed of the timecode is actually slowed down during shooting. When the footage is played back, the actors appear to be singing in real time but the camera motions around them are extremely fast.
*** Don’t Forget the Moco Forum ***
Remember everyone is welcome to become a free member of the online motion control forum. It’s a great place to ask questions about motion control and how to get things done on set or off. We want to see this service get used as much as possible so join today. http://www.mocoforum.com
Do you know anyone else who should be kept regularly informed about the industry? Let us know; we would be happy to send them our newsletters or our Showreel DVD, Motion Control Explained (2006). Email firstname.lastname@example.org to request one. If you would like more information about CGI, remote heads, cranes, dollies, accessories or any other filming equipment, please let us know at email@example.com
Mark Roberts Motion Control Ltd.
Oscar Winning Techniques for the Golden Compass
In the last newsletter we told you about the success of The Golden Compass with its Oscar-winning special effects. As part of creating Iorek the polar bear, Framestore CFC commissioned Ian Menzies from Motion Control Cameras to build a mechanical structure which would move like a walking or galloping bear when mounted by a rider. This consisted of a 6-degree-of-freedom motion base platform and several additional axes of motion, all controlled by Flair motion control software. The moves were programmed and pre-visualized in 3D to match the bear, then transferred to Flair to be duplicated exactly with the actors, so that their motion would match that of the computer-animated bear.
Here is some footage from the pre-visualization showing only the mechanical skeleton without the bear flesh, as well as a shot of a test shoot with the Cyclops rig and the motion base in the foreground.
*** S3 3D Stereoscopic Animation Unit ***
It is now 12 months since the S3 3D Stereoscopic Stepper/Slider unit was released, and in that time it has gone from strength to strength, being used in everything from commercials to feature films and TV series, in all corners of the globe. Being compatible with Macs and PCs, all major animation and motion control systems, and all standard DSLRs has made it very popular, and its affordability has meant anyone can get their hands on one.
Since its first release there have been numerous software upgrades which have added new features and performance improvements. The unit itself has also been upgraded mechanically, with a stronger motor and strengthened chassis, and is capable of carrying large weights even at an angle (for example 4kg at a 30-degree incline or 3kg at 45 degrees). It can even lift almost 2kg on an incline, all from the humble USB port. It has been tested with Mac OS and Windows XP/Vista/7, both 32-bit and 64-bit.
Soon the release of the new S4 unit will offer customers a 3D Stereoscopic unit with a convergence option (more info to follow in later updates).
Aardman Animations, currently working on their next feature film, had this to say: “At Aardman Animations we have been using the S3 stereoscopic stepper unit for 3D animation on our latest feature film, currently still in production. MRMC have been very helpful adapting the unit to suit our needs, such as adding a step and direction output so we can utilise different motion control equipment and 3D trackers with the units. We have been using up to 30 units in our studios on different shots, controlling them via the USB serial communications by Stop Motion Pro 7 creating a complete stereo solution taking both frames and moving the camera with a single button push. So far the S3 has greatly increased the speed of shooting and helped create accurate 3D images adequate for theatre projection.” – Morgan Roe, Production Technical Engineer
Bjoern Gottwald used his S3 to create his Bachelor's thesis at the University of Applied Sciences in Augsburg, Germany. We were really impressed when he proudly sent us footage of his “trailer”, called Embers, set in a post-apocalyptic world where humans are outnumbered by machines… Well worth seeing; click here.
For more information on the 3D Stereoscopic Unit click here.
Syndicate Milo – just add water
Syndicate Entertainment AB – a VFX/moco facility based in Stockholm, Sweden – recently produced a promotional film for the release of Autodesk REVIT, a new architectural software. The production was driven from an animatic that was done in collaboration with the creative team from the agency. The challenge was to create a film that would show the workflow through the different program modules in REVIT in a way that would be appealing to architects, engineers and everyone else in the construction chain. The film was part of the internet release promotion of the REVIT program where water was set to be the main character as a symbol for the software’s dynamics and ease of use.
Instead of using computer generated water the Syndicate team decided at an early stage to try to shoot real water pouring through small sets designed like the interface of the software. So when the animatic was made all 3D-models in the sketch film were scaled to desired size and then sent to a set designer who in turn used the 3D-models to print them in a 3D printer.
For motion control, Syndicate used Mark Roberts' Milo Motion Control rig, controlled using Syndicate's proprietary 3ds Max Milo plugin, which not only exports the camera moves from 3D to Flair but also controls all the axes of the rig. This is a great tool for scene-to-scene transfers, scaling shots, rig-flips, etc.
For this project, crucial issues such as narrow passages were easily detected at design time, and each set could be scaled to “fit the Milo”. One example of such a scene is the end scene with the dam, built and scaled just big enough to let the Guiness head fit with a few centimetres' margin while still fitting seamlessly into a long series of continuous moves at different scales.
S4 Stereoscopic 3D Stepper with Convergence
Shown here in a left frame and right frame position with a change of convergence
Mark Roberts Motion Control have taken the ever-popular 3D Stereoscopic stepper to new heights and are pleased to announce the release of the new S4. The S4 is an ideal unit for anyone shooting 3D stereoscopic animation, automatically moving and triggering the camera to capture the left- and right-eye images. The S4 is the larger brother of the S3 released last year and includes an optional Convergence module. Features of the S4 include a stronger motor, gearbox and chassis for larger camera packages in any orientation; built-in matte bar/focus rod holders; USB communication to animation software; and control of both interocular distance (the distance between the left and right eye) and convergence (the angle of view between the left and right eye). The Convergence module clips on, allowing the S4 to be used with or without it.
If you would like to have more information please email firstname.lastname@example.org
Front on shot of the S4
*** Samsung 3D TV commercial, A New Dimension ***
Rogue Films approached Ian Menzies with the project of shooting a commercial for Samsung's first LED 3D TV. The aim was to create a 3D feel to a 2D image. The Director, Sam Brown, and DP, Damien Morisot, wanted a handheld look to all of the camera moves shot with the motion control rig. After several meetings and tests at MRMC, the theory and method were proven.
The location for the shoot was Buenos Aires, Argentina. The commercial was shot on film with an Arri 435. Three days were spent on location shooting TV monitors with blue screens that had been installed around the city, in parks, bus stops, outside shops, on sidewalks, etc. Three days were then spent in the studio shooting the content for the TVs.
Ian Menzies said, “We took the moves shot on location and repeated these for the TV content. In some cases the move had to be scaled, for example the popcorn shot. After shooting a long track shot at the location, we then shot the popcorn for the content of the TV. The popcorn was in a small box, so we scaled the move down 12 times to make it look large on screen and then shot it fast at 125 FPS to slow it down on screen”
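The arithmetic in that example can be sketched as follows. This is an illustration only: `scale_move` and `slowdown` are hypothetical helpers, not Flair functions, and standard 25 FPS playback is assumed. The move coordinates are multiplied by the scale factor, and shooting at 125 FPS for 25 FPS playback slows the action five-fold.

```python
def scale_move(path, factor):
    """Scale each (x, y, z) position of a programmed move by `factor`,
    e.g. 1/12 to shrink a location track onto a small popcorn box."""
    return [tuple(c * factor for c in p) for p in path]

def slowdown(shoot_fps, playback_fps):
    """How many times slower the action appears when footage shot at
    shoot_fps is played back at playback_fps."""
    return shoot_fps / playback_fps

# A 12m location track becomes a 1m tabletop track:
location_move = [(0.0, 0.0, 0.0), (12.0, 0.0, 0.0)]
tabletop_move = scale_move(location_move, 1.0 / 12.0)

# Shot at 125 FPS, played back at an assumed 25 FPS:
factor = slowdown(125, 25)  # action appears 5x slower
```

Scaling the move down makes the small subject fill the frame like the full-size scene, and overcranking then slows its motion so it reads as large on screen.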
Once composited, the resulting footage meant that as you watch the TV and move around it, the content on the TV moves in relation to you, giving a sense of relative perspective and creating a 3D feel.
The motion control rig for the job was the Talos, quick to set up for the location shooting and ideal for the required studio shooting. MRMC Mimic Pan Bars were used by the DP to control the pan and tilt of the camera and give the shots the handheld feel they wanted to achieve.
*** S3 3D Stereoscopic Animation Unit ***
The new S3 Stereoscopic Stepper Unit is now in full production and is proving to be a very popular solution for animators requiring stereoscopic (3D) stop-frame animation. With the success of 3D films in the last 12 months, the requirement for new shoots to be in 3D is growing dramatically. Features that were planned for 2D production are gearing up to be shot in 3D instead, and the requirement for stereoscopic filming equipment is growing exponentially.

Mark Roberts Motion Control is helping to fill the requirements of creatives by producing a range of products for filming in 3D. As well as adding new stereoscopic features in the newest version of Flair (our motion control software) to allow interocular distance and convergence to be programmed on the motion control rigs, the new S3 Stereoscopic unit is proving that professional stereoscopic animation is within anyone's grasp, from educational establishments to full-fledged feature film productions.

For those who have not yet seen the S3 on our website or seen some of the demos, please visit our soon-to-expand 3D website (www.filming3D.com) to acquaint yourself with this ingenious unit. It allows DSLR cameras to be used easily for 3D animation by giving the user full control over the left and right eye interocular distance. The S3 can interface to all the popular animation programs, has drivers for PCs and Macs, and is designed to offer precision and portability at an incredibly low price. In the coming months more products will be coming out, including larger units for heavier camera packages. If you have a 3D filming requirement, please feel free to contact us to see what equipment we can offer you.
Picture courtesy of StopMotion Pro, showing the S3 at Annecy 2010 being used with StopMotion Pro to give instant stereoscopic preview.