Struggling to explain the difference between virtual production, virtual reality and virtual studios? Or too embarrassed to ask what separates Mo-Cap from Mo-Sys? You’re likely not alone. The rise of virtual production has led to a raft of terms and acronyms that can confuse even the most experienced professionals.
Virtual production is a broad term for anything that uses computer-generated 3D images in real-time during filming. It encompasses green screen-based virtual studios as well as LED volume stages (see below). Virtual productions use camera tracking technology and game engines to let programme makers mix live footage with 3D images on set as they shoot.
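To make the idea concrete, here is a minimal, hypothetical Python sketch of why camera tracking matters: the tracked camera pose is used to project a virtual 3D element into the live frame, so the graphic moves through shot exactly as a real object would. A simple pinhole camera model is assumed; real systems also handle lens distortion, zoom and timecode sync.

```python
import numpy as np

# Hypothetical sketch: use the tracked camera pose to project a virtual
# 3D element into the live frame. A simple pinhole model is assumed;
# real systems also handle lens distortion, zoom and timecode sync.

def project(point_world, cam_pos, cam_rot, focal_px, centre_px):
    """Project a world-space point into pixel coordinates."""
    p_cam = cam_rot.T @ (point_world - cam_pos)   # world -> camera space
    u = focal_px * p_cam[0] / p_cam[2] + centre_px[0]
    v = focal_px * p_cam[1] / p_cam[2] + centre_px[1]
    return u, v

graphic = np.array([0.0, 1.5, 5.0])   # virtual element, 5 m in front of origin
level = np.eye(3)                     # camera level, looking down +z

# As the tracked camera dollies right, the graphic drifts left in frame,
# exactly as a real object on the set would.
for cam_x in (0.0, 0.5, 1.0):
    u, _ = project(graphic, np.array([cam_x, 1.5, 0.0]), level, 1000.0, (960, 540))
    print(f"camera at x={cam_x:.1f} m -> graphic at pixel u={u:.0f}")
```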
Virtual productions that use LED volumes have grown in popularity in recent years, with The Mandalorian the most famous example. Actors perform on a physical set in a studio in front of a giant array of LED screens which display real-world or computer-generated environments. The set-up is not unlike the projected backgrounds filmmakers have been using since the silent era. The difference is that it's smart. Harnessing a real-time engine, it responds to the movement of the on-set camera by adjusting the perspective, lighting and other elements within the panels. The technology allows filmmakers to capture effects in-camera and in real-time, and lends itself best to single-camera shoots such as film, drama and commercials.
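As an illustration of the 'smart background' idea, the hypothetical sketch below works out which region of the LED wall the camera sees by projecting its field of view onto the wall plane; the engine re-renders that region (the 'inner frustum') every frame as the camera moves. The geometry and field-of-view figures are made-up illustration values, and the camera is assumed to face the wall square-on.

```python
import numpy as np

# Hypothetical sketch: which region of the LED wall does the camera see?
# Rays through the image edges are intersected with the wall plane; the
# engine renders correct-perspective content into that region (the
# 'inner frustum') every frame. Figures are illustrative, and the
# camera is assumed to face the wall square-on.

def wall_region(cam_pos, wall_z, h_fov_deg, aspect):
    """Return the width and height of the wall area seen by the camera."""
    half_w = np.tan(np.radians(h_fov_deg) / 2)   # horizontal ray slope
    half_h = half_w / aspect                     # vertical ray slope
    dist = wall_z - cam_pos[2]                   # camera-to-wall distance
    return 2 * half_w * dist, 2 * half_h * dist

# As the camera tracks toward the wall, the viewed region shrinks and
# the engine re-renders the panels so parallax stays correct in-camera.
for z in (0.0, 2.0, 4.0):
    w, h = wall_region(np.array([0.0, 1.7, z]), wall_z=6.0, h_fov_deg=60, aspect=16 / 9)
    print(f"camera {6.0 - z:.0f} m from wall -> sees {w:.1f} m x {h:.1f} m of panels")
```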
A virtual studio is a multicamera studio that uses game engines, camera tracking technology and green screens to place presenters and audiences in realistic 3D sets that can be manipulated in real-time. Often used for live news and sports programmes, virtual studios allow productions to create more imaginative sets, save money on transportation and storage, are more environmentally friendly, and are quicker to set up than physical sets. dock10 employs its virtual studio technology to create virtual sets for BBC Bitesize Daily, Match of the Day and the FIA GT World Championships. Despite the set being virtual, you can still shoot from multiple camera angles and superimpose additional images.
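The green screen side of a virtual studio rests on chroma keying. Below is a toy numpy sketch of the principle – build a matte wherever green dominates a pixel, then substitute the rendered virtual set – not a reflection of how production keyers actually work; broadcast keyers also handle spill, soft edges, hair detail and motion blur.

```python
import numpy as np

# Toy chroma key, assuming 8-bit RGB frames held in numpy arrays: build
# a matte wherever green dominates a pixel, then substitute the rendered
# virtual set. Real broadcast keyers also handle spill, soft edges,
# hair detail and motion blur.

def chroma_key(fg, virtual_set, threshold=40):
    r, g, b = (fg[..., i].astype(int) for i in range(3))
    matte = (g - np.maximum(r, b)) > threshold    # True where the screen is green
    out = fg.copy()
    out[matte] = virtual_set[matte]               # swap green pixels for the set
    return out

fg = np.zeros((4, 4, 3), dtype=np.uint8)
fg[...] = (20, 200, 30)                           # green screen
fg[1, 1] = (180, 140, 120)                        # one 'presenter' pixel
backdrop = np.zeros((4, 4, 3), dtype=np.uint8)
backdrop[...] = (60, 60, 90)                      # rendered virtual set
print(chroma_key(fg, backdrop)[1, 1])             # presenter pixel survives the key
```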
A hybrid set is a cross between a traditional and a virtual set: it keeps many of the physical elements that feature in a traditional set, such as desks, chairs and staging, but is enhanced with a virtual backdrop. This allows, say, entertainment shows to make their physical sets seem much bigger, giving productions more bang for their buck. In essence, a hybrid set is a digital set extension.
Volumetric capture is a technology that converts a person, object or place into 3D digital data and reproduces it as a high-quality 3D image. Many precisely synchronised cameras surround the subject and capture it from every angle. The recorded video is processed through reconstruction software, which produces a 3D avatar of the performer – a solid, moving model that can be viewed from all angles. Sky Sports has used the technology to enhance its golf and cricket coverage.
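One classic reconstruction approach is shape-from-silhouette (the 'visual hull'): keep only the parts of space that fall inside the subject's outline in every camera view. The sketch below is a deliberately simplified toy – three idealised orthographic cameras and a spherical subject – rather than how any particular product works.

```python
import numpy as np

# Toy shape-from-silhouette (visual hull) sketch, assuming three
# idealised orthographic cameras looking down the x, y and z axes.
# Real volumetric rigs use dozens of calibrated perspective cameras
# and far more sophisticated reconstruction software.

N = 64
centres = (np.arange(N) - N / 2 + 0.5) / (N / 2)   # voxel centres in [-1, 1]
x, y, z = np.meshgrid(centres, centres, centres, indexing="ij")

def disc(u, v, radius=0.8):
    """Silhouette of a sphere as seen by an orthographic camera."""
    return u**2 + v**2 <= radius**2

# Keep a voxel only if it projects inside the silhouette in every view.
# In practice the masks come from keying the performer out of each feed.
hull = disc(y, z) & disc(x, z) & disc(x, y)
print(f"voxels kept: {hull.sum()} of {N**3}")
# Meshing and texturing the surviving voxels yields the final 3D model.
```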
Mo-Cap stands for motion capture: the art of capturing the physical movement of a person and translating it into the action of a computer-generated 3D character on screen, giving it greater realism. There are two main types of motion capture technology. Inertial motion capture tracks the positional data of a performer using motion sensors attached to a capture suit. Optical motion capture uses cameras to track reflective markers on a performer's capture suit; it is sometimes referred to as an 'outside-in' tracking solution because the cameras are placed outside the performance area, tracking an object inside it.
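However the data is captured, the end result is a stream of joint rotations that re-pose a digital skeleton each frame. As a toy illustration, the sketch below poses a two-joint 2D arm from shoulder and elbow angles; real rigs solve dozens of joints in 3D, with sensor-fusion filtering on top.

```python
import math

# Toy sketch: pose a two-joint 2D arm from shoulder and elbow angles,
# the kind of joint data a capture suit reports each frame. Real rigs
# solve dozens of joints in 3D, with sensor-fusion filtering on top.

def arm_pose(shoulder_deg, elbow_deg, upper_len=0.30, fore_len=0.25):
    """Return the elbow and wrist positions for the given joint angles."""
    a1 = math.radians(shoulder_deg)
    a2 = a1 + math.radians(elbow_deg)     # elbow angle is relative to the upper arm
    elbow = (upper_len * math.cos(a1), upper_len * math.sin(a1))
    wrist = (elbow[0] + fore_len * math.cos(a2), elbow[1] + fore_len * math.sin(a2))
    return elbow, wrist

# Each mo-cap frame supplies fresh angles and the skeleton is re-posed,
# so the performer's movement drives the CG character in real-time.
for frame, (sh, el) in enumerate([(0, 0), (30, 20), (60, 45)]):
    _, wrist = arm_pose(sh, el)
    print(f"frame {frame}: wrist at ({wrist[0]:.2f}, {wrist[1]:.2f})")
```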
Facial capture refers to capturing the physical movements of a person's face and translating them into a digital model. A head-mounted camera records facial expressions and/or dialogue to capture the full intricacies of a performance for a digital character, from blinks to smiles and frowns.
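One common way (though not the only one) to turn tracked facial landmarks into a character's expression is blendshape weights: measured distances between landmarks are mapped onto 0-1 sliders such as 'blink' or 'smile'. The sketch below uses made-up landmark values purely for illustration.

```python
# Hypothetical sketch: map a tracked face-landmark distance onto a 0-1
# blendshape weight for a digital character. The landmark choice and the
# neutral/extreme distances are made-up illustration values.

def blend_weight(distance, neutral, extreme):
    """Linearly map a measured distance onto a clamped 0-1 weight."""
    t = (distance - neutral) / (extreme - neutral)
    return max(0.0, min(1.0, t))

# e.g. the gap between upper- and lower-eyelid landmarks this frame
eyelid_gap = 0.2                                   # tracked value, arbitrary units
blink = blend_weight(eyelid_gap, neutral=1.0, extreme=0.0)
print(f"blink weight: {blink:.2f}")                # 0.80 -> eyelids mostly closed
```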
Virtual reality (VR) is a computer-generated three-dimensional environment that a user can freely explore, most often via a headset such as the Oculus Quest 2 or HTC Vive. Largely created with gaming software, VR transports the user to another location and often allows them to interact with it.
Augmented reality (AR) in television enhances productions by adding 3D computer-generated images, usually to the foreground of a virtual or physical set. It could be a scoreboard, a Premier League football table, or an image of an actor or politician. AR can be used in any studio setting, not just a green screen studio. The CGI elements of AR don't interact with their environment but simply enrich it.
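Under the hood, layering an AR graphic onto the camera feed is the standard 'over' compositing operation: the graphic's alpha matte decides, per pixel, how much of the CG element versus the live picture shows through. A minimal numpy sketch with illustrative values:

```python
import numpy as np

# The standard 'over' operation: out = alpha * graphic + (1 - alpha) * video.
# The graphic's alpha matte is 1 where the scoreboard is solid and 0 where
# the live picture should show through. Values are illustrative.

def over(graphic, alpha, video):
    """Composite a CG overlay onto video using its alpha matte."""
    a = alpha[..., None]                          # broadcast alpha over RGB
    return a * graphic + (1 - a) * video

video = np.full((2, 2, 3), 100.0)                 # live camera frame
graphic = np.full((2, 2, 3), 250.0)               # AR scoreboard render
alpha = np.array([[1.0, 0.5], [0.0, 0.0]])        # solid, soft edge, transparent
print(over(graphic, alpha, video)[:, :, 0])       # 250 / 175 / 100 / 100
```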
Unreal Engine is a powerful game engine, developed by Epic Games, that has established itself as a widely used 3D creation tool delivering real-time photorealistic graphics. Initially developed for the games industry, it has since been adopted by many others; in TV, film and advertising it is now widely used to build the 3D worlds of virtual productions.
Reality Engine is a real-time, node-based compositing platform built on Unreal Engine, developed by Zero Density. Combined with Zero Density's proprietary keying technology and control tools, it enables the broadcast industry to control the virtual elements of a production. Virtual elements are designed natively in Unreal Engine and can be controlled, customised and automated from a single hub.
Mo-Sys is a UK-based company that manufactures advanced camera robotics and virtual production technologies. Its product range spans remote heads (cameras that can be controlled from a distance), motion control, broadcast robotics, and camera tracking for AR, VR and virtual production. Not to be confused with Mo-Cap – see above.
StarTracker is a camera tracking system from Mo-Sys. The system looks for 'stars' – reflective stickers applied to the studio ceiling – allowing it to report the position and orientation of the studio camera in real-time to the rendering engine. Multiple cameras in the studio can track from the same star map, and once calibrated the system is fully automatic. StarTracker is sometimes referred to as an 'inside-out' tracking solution: the sensor camera is mounted inside the capture area, looking out for the reflective stickers beyond it.
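The geometry behind this kind of inside-out tracking is the classic perspective-n-point problem: given the known 3D positions of the ceiling stickers (from calibration) and where they appear in the sensor camera's image, the camera's pose can be recovered. The sketch below demonstrates the principle with OpenCV's solvePnP and made-up star positions and intrinsics; it is not Mo-Sys's actual algorithm.

```python
import numpy as np
import cv2

# Sketch of the inside-out principle using OpenCV's solvePnP: given the
# known 3D positions of ceiling 'stars' (from calibration) and where they
# appear in the sensor camera's image, recover the camera's pose. Star
# layout, intrinsics and the test pose are made-up illustration values.

stars_3d = np.array([[0, 0, 3], [1, 0, 3], [0, 1, 3], [1, 1, 3],
                     [0.5, 0.5, 3]], dtype=np.float64)    # metres, on the ceiling
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

# Synthesise 2D detections from a known pose, then recover that pose.
true_rvec = np.zeros(3)                    # sensor camera pointing at the ceiling
true_tvec = np.array([-0.5, -0.5, 0.0])
stars_2d, _ = cv2.projectPoints(stars_3d, true_rvec, true_tvec, K, None)

ok, rvec, tvec = cv2.solvePnP(stars_3d, stars_2d, K, None)
print("recovered translation:", tvec.ravel())   # ~[-0.5, -0.5, 0.0]
```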
CG (computer graphics) and CGI (computer-generated imagery) are two abbreviations used interchangeably in the TV industry which essentially mean the same thing: images created with the aid of a computer. The images could be anything from an augmented reality scorecard to a virtual set backdrop.
A virtual set designer (VSD) is a new-ish role in broadcast production: someone whose job is to design virtual sets. VSDs are familiar with Unreal Engine and Zero Density, and many come from the games industry, where they learnt to create computer graphics in Unreal Engine and are now applying those skills to the broadcast sector.
A virtual set operator (VSO) is another new-ish role in the broadcast sector. The VSO sits in the studio's lighting gallery and is responsible for monitoring the quality of the computer-generated pictures during a live production on a virtual set. They check the keying and masking, and work alongside the lighting director to make sure that any physical elements or presenting talent fit seamlessly alongside the virtual or augmented elements.
Ray tracing is a rendering technique for creating realistic, dynamic lighting in real-time in virtual environments. It simulates how light behaves in the real world to produce realistic and immersive graphics.
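At its core, ray tracing fires rays from the camera, finds what they hit, and shades the hit point according to the light. The minimal sketch below traces a single ray against a sphere and applies Lambert shading; production ray tracers fire millions of rays with shadows, reflections and multiple bounces, typically on GPUs.

```python
import math

# Minimal ray tracing sketch: intersect one camera ray with a sphere and
# shade the hit by how directly the surface faces the light (Lambert's
# law). Production ray tracers fire millions of rays with shadows,
# reflections and multiple bounces, typically on GPUs.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def trace(origin, direction, centre, radius, light):
    oc = [o - c for o, c in zip(origin, centre)]
    b = 2 * dot(direction, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c                  # quadratic 'a' is 1 for a unit direction
    if disc < 0:
        return 0.0                        # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2        # distance to the nearest hit
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, centre)]
    to_light = [l - h for l, h in zip(light, hit)]
    length = math.sqrt(dot(to_light, to_light))
    return max(0.0, dot(normal, [v / length for v in to_light]))

# One ray straight down the lens axis toward a sphere 3 m away.
print(f"brightness: {trace((0, 0, 0), (0, 0, -1), (0, 0, -3), 1.0, (2, 2, 0)):.2f}")
```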
Xsens is a company that manufactures motion capture products. Its wearable sensors register the movements of a performer, and the tracking data is then used to drive digital characters. dock10 uses Xsens' mo-cap sensors to create the AR robot character 'Clogs' for BBC Bitesize Daily.
MVN Animate is Xsens' motion capture software, which exports all the real-time tracking data from its sensors to a user's 3D package. It offers real-time 3D animation, graphs, data streaming and video.