VP Terminology
Important Terms
Virtual Production Volume (aka “The Volume”) takes its name from the lighting term for a volume of light. The volume is typically housed in a sound stage and is enclosed by LED panels and processors and a camera tracking system, all driven by clusters of high-performance computers, to display photorealistic CG environments for filmmakers. The volume is the latest innovation in filmmaking: it lets filmmakers shoot their projects against any imaginable environment without leaving the comfort of a sound stage and without the tedious, daunting task of removing green screen in post production.
Rendering of the volume at Stage 15
Virtual Art Department (VAD)
The VAD is a department that works directly with the Production Designer to create environments, set pieces, props, and vehicles for the production. The VAD team builds these sets the same way a traditional art department would, using real-world scale, precise measurements, and detailing; the difference is that they build them inside a game engine rather than physically. The VAD works closely with the traditional art department because the real and virtual sets have to live and blend harmoniously with each other on the stage.
In Camera Visual Effects (ICVFX)
ICVFX is a term that covers the techniques used to capture virtual content directly into the camera without the use of green screen or post-production compositing. The term covers a range of methodologies, such as:
3D ICVFX - This typically refers to game engine content (assets and environments built for Unreal Engine) being rendered on a display (typically an LED video wall) in real-time.
2.5D ICVFX - This refers to 2D playback that has been digitally altered to give the illusion of 3D depth through parallax.
2D ICVFX - This refers to video or image playback, usually with no illusion of parallax. Driving (traveling) shots are most often achieved in this manner.
Frustum
A frustum is the field of view of a camera. In ICVFX terms, the Inner Frustum is the portion of the digital content the production camera sees; the Outer Frustum is everything outside the camera’s view. For consistent, realistic illumination on an LED volume, the outer frustum keeps the rest of the environment static while the inner frustum moves in time with the camera. To account for latency between camera movement and real-time rendering, there is typically a buffer zone outside the inner frustum.
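To make the inner/outer split concrete, here is a minimal Python sketch of the geometry, assuming a flat wall and a purely horizontal field of view; the 10% buffer value is an illustrative assumption, not a fixed ASVP setting.

```python
import math

def inner_frustum_width(distance_to_wall_m, horizontal_fov_deg, buffer_pct=10.0):
    """Approximate width of the inner frustum on a flat LED wall.

    distance_to_wall_m: camera-to-wall distance along the view axis.
    horizontal_fov_deg: the lens's horizontal field of view.
    buffer_pct: illustrative overscan added so latency between camera
    movement and the real-time render does not expose the outer frustum.
    """
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    visible_width = 2.0 * distance_to_wall_m * math.tan(half_fov)
    return visible_width * (1.0 + buffer_pct / 100.0)

# A camera 4 m from the wall with a 60-degree horizontal FOV needs an
# inner frustum roughly 5.1 m wide once the 10% buffer is included.
print(round(inner_frustum_width(4.0, 60.0), 2))
```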
Camera tracking
The process of tracking the camera’s position in physical space in order to coordinate its movements with content displayed by the real-time engine.
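As a rough illustration of how tracking data is used, the Python sketch below copies a tracked pose onto a virtual camera once per frame; get_tracked_pose and set_virtual_camera_pose are hypothetical placeholders, not calls from any specific tracking or engine API.

```python
import time

def get_tracked_pose():
    """Placeholder for querying the tracking system; returns the physical
    camera's position (x, y, z) and rotation (pitch, yaw, roll)."""
    return (0.0, 1.6, 4.0), (0.0, 0.0, 0.0)

def set_virtual_camera_pose(position, rotation):
    """Placeholder for pushing the same pose to the real-time engine."""
    pass

def run_tracking(frame_rate_hz=24, num_frames=240):
    """Every rendered frame, mirror the physical camera's pose onto the
    virtual camera so the inner frustum moves in step with it."""
    frame_time = 1.0 / frame_rate_hz
    for _ in range(num_frames):
        position, rotation = get_tracked_pose()
        set_virtual_camera_pose(position, rotation)
        time.sleep(frame_time)
```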
Parallax
The perceived shift in an object's location when seen from various angles. The effect is easy to see by fixing on one point and moving the camera in a straight line: as the camera moves, each background object shifts position independently of the others.
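A short, purely illustrative calculation shows why parallax separates near and far objects; the distances below are made up for the example.

```python
import math

def apparent_shift_deg(object_distance_m, lateral_camera_move_m):
    """Angle by which an object appears to shift when the camera slides
    sideways while continuing to look in the original direction."""
    return math.degrees(math.atan2(lateral_camera_move_m, object_distance_m))

# For a 1 m sideways camera move, a prop 2 m away appears to swing about
# 26.6 degrees, while a mountain 2 km away barely moves (about 0.03 degrees).
# That difference in apparent motion is parallax.
print(round(apparent_shift_deg(2.0, 1.0), 1))
print(round(apparent_shift_deg(2000.0, 1.0), 2))
```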
Now that we’ve defined virtual production and its supporting terms, we can get into the steps that can aid both producers and filmmakers in creating a successful show. The first thing to discuss is the five tenets of virtual production: guiding principles to keep in mind throughout the pre-production and production phases of the filmmaking process.
VP Glossary
Active Marker: A physical reference for tracking purposes with a unique ID, such as a strobing LED bulb.
Cabinet: An LED panel's cabinet serves as its structural framework. Typically, this frame is made of steel, carbon fiber, or aluminum.
Cluster: A group of computers working together as one to display the pixels on the Volume. ASVP has 2 clusters to minimize production downtime.
Control Box: Main computer that our Control Team UE Engineer uses to control Unreal Engine.
Control Team: The ASVP crew in charge of the Volume operation. This includes engineers (UE, LED, Video, Mocap), artists, Virtual Production Supervisors and production staff.
Data Card: This circuit board works with the sending card to transmit data to the LEDs. Multiplexing and frequencies are determined by these two cards, and both have a significant effect on how a cinema camera sees an LED wall.
DCC: Digital Content Creation software, such as Maya, Houdini, and Blender, used to create 3D assets.
Engine Asset: A full three-dimensional digital object created in a game engine. This might be a single asset, such as a prop, or an entire virtual world.
Frustum: A virtual rectangular window through which the physical camera looks into the virtual world. Within this window is the interactive virtual environment, rendered with proper parallax relative to the physical camera. The terms ‘Inner’ and ‘Outer’ Frustum refer to the areas inside and outside this window.
ICVFX: In Camera Visual Effects. The technique used to capture virtual environments directly in camera without having to use green screen and VFX compositing.
LED Module: A light-emitting diode (LED) module is a compact circuit that combines a number of LEDs with the other electronic components required by the panel to generate images and videos. A series of magnets holds the modules to the frame, which is also known as the cabinet. These magnets enable quick maintenance and replacement during production.
Each module has two multi-pin data input ports, one at the top and one at the bottom. This makes the module universal, but when a module is attached, the arrows visible on the rear always point in the same direction (typically up).
LED Panel: An LED panel is a display made up of several components that uses light-emitting diodes to create an image. It is composed of a cabinet, modules, light-emitting diodes, LED masks, bus/data cards, receiving cards, and drivers.
LED Processor: An LED video processor (also known as a picture processor, image converter, video controller, or video format converter) is a high-performance image processing and control device designed specifically for full-colour LED displays. It can change the resolution format and colour space, scale images, and combine video image processing with high-definition signal processing to meet the particular requirements of a full-colour LED screen. It can receive and process a variety of video and graphics signals simultaneously and display them on full-colour LED screens. The main functions of the video processor are:
1. Image zooming. The LED video processor can scale the image and output it at any size; whether the resolution is increased or reduced, the full image can be displayed. (A sketch of this scaling math follows the list.)
2. Image quality improvement. A high-quality LED video processor can use advanced image-processing algorithms to enhance detail and improve image quality.
3. Signal switching. Managing multiple signal connections at once and switching flexibly between them.
4. Multi-screen processing. In special situations, an LED screen with multi-screen processing functions can flexibly meet such display requirements.
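The sketch below illustrates only the scaling arithmetic from point 1: fitting an arbitrary source resolution onto a wall’s pixel canvas without distortion. The resolutions are hypothetical, and this is not how any particular processor is implemented.

```python
def fit_to_wall(src_w, src_h, wall_w, wall_h):
    """Scale a source image to fill as much of the LED wall's pixel canvas
    as possible without distorting it (letterboxing or pillarboxing the rest)."""
    scale = min(wall_w / src_w, wall_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Centre the scaled image on the wall.
    offset_x, offset_y = (wall_w - out_w) // 2, (wall_h - out_h) // 2
    return out_w, out_h, offset_x, offset_y

# A 1920x1080 plate on a hypothetical 7680x2160 wall scales to 3840x2160
# and sits centred with a 1920-pixel margin on each side.
print(fit_to_wall(1920, 1080, 7680, 2160))
```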
Lidar/Laser Scans: Technique utilizing LiDAR (Light Detection and Ranging) or Laser to scan practical sets or props to turn them into 3D models to be used in the Volume.
Light Baking: Light Baking is a method used in Unreal Engine to bake physically accurate lighting into all the assets in an environment. This technique improves the frame rate of the digital environment, as the software no longer needs to calculate the light rays in real time. Baking needs to be redone if a major modification to the virtual environment is requested on the day, and can take between 15 and 30 minutes.
Mine: A device that sits on top of the production camera to track its positional and rotational values in real time. This data is then streamed to the virtual camera to mimic the movement of the physical camera. (Also sometimes referred to as: Sputnik or Crown)
Motive: The motion capture software used to track the camera’s movement.
nDisplay: The technology inside UE which renders out the virtual environment and sends it to the Volume.
OCIO: Open Color IO. A complete color management solution geared towards motion picture production with an emphasis on visual effects and computer animation.
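OCIO ships with a Python binding (PyOpenColorIO), so a minimal example of a colour-space conversion looks roughly like the sketch below; the config path and colour-space names are assumptions that depend entirely on the show’s OCIO configuration.

```python
import PyOpenColorIO as OCIO

# The path and colour-space names are placeholders; substitute the ones
# defined in the production's own OCIO config.
config = OCIO.Config.CreateFromFile("/path/to/config.ocio")
processor = config.getProcessor("ACEScg", "Output - sRGB")
cpu = processor.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]   # scene-linear mid grey
print(cpu.applyRGB(pixel))   # the same colour expressed in the display space
```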
Parallax: The perceived shift in an object's location when seen from various angles.
Perforce: The software that is used as a repository and version control system for virtual assets and projects. It can be configured to run in the cloud and on location simultaneously to allow collaboration with vendors worldwide.
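As an illustration of how an artist might pull the latest assets, here is a small sketch using the official P4Python binding; the server address, user name, and depot path are placeholders.

```python
from P4 import P4, P4Exception

p4 = P4()
p4.port = "ssl:perforce.example.com:1666"   # placeholder server address
p4.user = "vad_artist"                      # placeholder user name

try:
    p4.connect()
    # Pull the latest revisions of the virtual art department's assets.
    p4.run_sync("//depot/vad/...")
except P4Exception:
    for error in p4.errors:
        print(error)
finally:
    if p4.connected():
        p4.disconnect()
```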
Photogrammetry: The technique used to combine hundreds of photographs of an object or location and turn them into a 3D asset to be used in a Virtual Environment.
Plate Asset: A pre-recorded video created for playback on an LED wall. This can be either a single captured video, or a full 360-degree stitched video sphere.
Puck: Similar to the Mine but much smaller. It is attached to practical props on set to track positional and rotational data. The motion of these objects can then be replicated in the Virtual Environments. Useful for small handheld props such as a flashlight, or for tracking the position and rotation of physical lighting elements.
Receiving Card: The receiving card is the control mechanism for the LED panel. Receiving cards assemble, decipher, and send data from the LED processor to the array of light-emitting diodes that produces the image. Each LED panel has a mount for a receiving card on the back. This data travels through the data card, often known as the "driver." Real-time processing necessitates quick transfers and synchronization between all of the screen's elements and the processor.
Simulcam: Real-time compositing of previs quality environments used during physical production to give the director an idea of what the show will look like after VFX is inserted.
Switchboard: Part of UE. Used to control all the computers in the cluster from the Control Box.
Take Recorder: Part of UE. Used to record the camera’s movement for each take so it can be used in VFX as necessary.
Unreal Engine (UE): A video game engine that allows for real-time rendering of 3D assets to LED walls and simulcam solutions.
VAD: Virtual Art Department. Similar to a traditional art department but creates all the virtual content rather than the physical.
Virtual Camera: A system that uses motion tracking and a screen to create a ‘camera’ that can be used to scout locations or plan camera moves. Often built with an iPad but can also be a larger shoulder rig or tripod mounted system.
Volume Control Team: The team of artists and engineers operating the equipment that drives a smart stage or any space used for virtual production. Areas of responsibility include content distribution, image manipulation, camera tracking, recording, and creative visualization of data. Also known as volume operations and mission control.
VR Scouting: A system to scout a digital location or environment immersively using a virtual reality headset.