Production Details
Creative Triggers
When determining what projects are right for virtual production, the first step is breaking down the script. Targeting scenes for virtual production usually helps in narrowing down methodologies. Before we discuss those methodologies, let's dive into the ways we target scenes, and what we call "Creative Triggers."
Creative Triggers are elements identified from the script that make it clear a scene or location is a potential candidate for virtual production work. Vehicle process, interior set extension, and exterior set extension are some of the most easily detectable creative triggers. These represent a first attempt at targeting which portion of the creative content will benefit most from this process.
VEHICLE PROCESS
INTERIOR SET EXTENSION
EXTERIOR SET EXTENSION
Production Hurdles (And how to get over them)
While there are many advantages to virtual production, there are several issues to look out for on set. We discuss these below.
THE BIG ONE - MOIRE
When two intricate patterns overlap at an angle, the picture artifact known as moire appears. Digital camera image sensors feature extremely finely spaced pixels; when filming an LED display, the sensor's pixel grid will not line up with the pattern of the LEDs, resulting in moire.
Here we can clearly see the pixel structure of the LED board clashing with the pixel structure of the image. With the problem defined, we will examine the current options to combat it, in terms of both technological and procedural changes.
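To make the mechanism concrete, the short Python sketch below (an illustration using made-up pitch values, not measurements of any real panel or sensor) shows how two nearly matched fine patterns interfere at the difference of their spatial frequencies, which reads on screen as broad moire bands.

# Illustrative only: pitch values are arbitrary, not taken from real hardware.
led_pitch = 1.00      # spacing of the LED emitter pattern (arbitrary units)
sensor_pitch = 1.08   # spacing of the sensor photosites, slightly mismatched

# Two overlaid patterns with nearly equal spacing interfere at the difference
# of their spatial frequencies, producing light/dark moire bands whose period
# is much larger than either underlying pattern.
beat_period = 1.0 / abs(1.0 / led_pitch - 1.0 / sensor_pitch)
print(f"Moire band period: about {beat_period:.1f} units, versus an LED pitch of {led_pitch}")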
SOLUTIONS
COME ON, FOCUS!
As this paper has stated, one challenge with virtual production is that the LED panels cannot be brought into crisp focus without introducing artifacts. The standard industry practice is to place talent far enough away from the LED panels that the camera can maintain a crisp focus on the subject while allowing a softer focus on the panels to reduce moire.
Focus and camera movement combine in a precise and deliberate balance. When the camera is operated with controlled accuracy and at slower speeds, a deeper depth of field can be used. Moving the camera with a dolly, Bolt, or Technocrane, for instance, can aid in this.
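As a rough illustration of that balance, the sketch below uses the standard thin-lens depth-of-field approximation to check whether an LED wall sits outside the in-focus range. The focal length, aperture, circle of confusion, and distances are assumed example values, not recommendations.

def dof_limits(focal_mm, f_number, subject_m, coc_mm=0.025):
    """Approximate near/far limits of acceptable focus, in meters."""
    s = subject_m * 1000.0                                     # subject distance in mm
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * s / (hyperfocal + (s - focal_mm))
    far = float("inf") if s >= hyperfocal else hyperfocal * s / (hyperfocal - (s - focal_mm))
    return near / 1000.0, far / 1000.0

# Assumed example setup: 50mm lens at f/2.8, talent 3 m from camera, wall 6 m from camera.
near, far = dof_limits(focal_mm=50, f_number=2.8, subject_m=3.0)
wall_distance_m = 6.0
print(f"Acceptable focus from {near:.2f} m to {far:.2f} m")
print("LED wall falls outside the depth of field (reads soft)" if wall_distance_m > far
      else "LED wall risks coming into sharp focus (moire risk)")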
ANAMORPHIC OR BUST!
Virtual production is one area where we encourage the use of anamorphic lenses, typically because these lenses have a softer look. When attempting to create the most instantly recognizable cinematic style, directors and cinematographers frequently choose anamorphics. Their look is distinctive, offering a broader field of view for background elements and the surrounding environment while keeping a longer focal length for foreground objects or subjects. With their capacity to optically "squeeze" the image, they also offer an additional advantage in virtual production.
SHUTTER ANGLING
A helpful term for the relationship between shutter speed and frame rate is "shutter angle." The phrase comes from rotating shutters, which exposed each frame by spinning a disc with an angled opening, letting light in once per revolution. As the angle increases, the shutter speed becomes slower, up to a maximum of 360°, where the exposure time can be as long as a full frame's duration. Conversely, by reducing the angle, the shutter speed can be set arbitrarily fast.
Although modern cameras may not adjust shutter speed in this mechanical way, the phrase "shutter angle" has endured as a straightforward and common way to describe how motion blur appears in video. A wider shutter angle is selected if the subject should be blurred across a larger portion of the frame-to-frame displacement, and vice versa. When the camera is moving, changing the shutter speed may reduce the moire that the LED panels produce. Testing should be done on the day of production to find out whether a straightforward shutter angle modification will help remove undesirable moire.
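A minimal sketch of that relationship, using illustrative values: the exposure time per frame is the shutter angle's fraction of a full 360° rotation divided by the frame rate.

def shutter_speed_seconds(frame_rate_fps, shutter_angle_deg):
    """Exposure time per frame implied by a given shutter angle."""
    return (shutter_angle_deg / 360.0) / frame_rate_fps

# At 24 fps: 360° exposes for the full frame (1/24 s), 180° for half of it (1/48 s), and so on.
for angle in (360, 180, 90, 45):
    t = shutter_speed_seconds(24, angle)
    print(f"{angle:3d}° at 24 fps -> 1/{round(1 / t)} s")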
PASS ME THE - OPTICAL LOW PASS FILTER (OLPF)
Another option is using an OLPF, sometimes referred to as an anti-aliasing or moire filter. Many digital cameras come with a built-in filter that is placed in front of the image sensor. The inclusion of an OLPF with all of their modern digital cameras has become standard procedure for the more renowned movie camera manufacturers like Arri and RED.
The OLPF's objectives, however, aren't always achieved, depending on the camera.
For instance, because the OLPF is designed for adjacent pixels, cameras that take images via pixel skipping are far more prone to aliasing. This method is frequently used with smartphones and DSLR cameras that include an option for low-resolution video, in part because these devices have less memory and processing power.
SHINE BRIGHT LIKE A PANEL! (LED BRIGHTNESS)
It's rare that this method is used to combat moire; however, in some cases changing the brightness of the LED panels can help mitigate the issue.
WE GOT BARS! (MULTIPLEXING - AKA SCAN LINE ISSUE)
The scan line issue is another problem that frequently arises, and it is most often noticed when tilting the camera up or down. To understand how to circumvent this problem, we must first learn how to calculate multiplexing.
Instead of updating the entire display at once, LED panels illuminate only small groups of diodes at a time, firing on and off too quickly for the human eye to see. The scan rate is the number of groups (each comprised of rows of LEDs) that the panel's refresh is broken into, and the multiplexing value describes how many full panel refreshes occur within each camera frame.
The panel refresh rate, scan rate, and frame rate that production is shooting at are the three crucial factors that must be known in order to compute the multiplexing of an LED panel.
For instance, consider the older ROE Black Pearl 2 panel used on season one of The Mandalorian. The screen in question had a 1920 Hz refresh rate and a scan rate of 11, and production shot at 24 fps. Using the following method, we can now calculate multiplexing to determine how many times the panel refreshes in its entirety during each camera frame.
FORMULA: (REFRESH RATE / SCAN RATE) / FRAME RATE = MULTIPLEXING
OLDER BP2 = (1920 / 11) / 24 = 174.545 / 24 = 7.272
NEWER BP2 = (3840 / 11) / 24 = 349.09 / 24 = 14.545
This means that while the camera is capturing frames at 24 per second, the panel completes barely more than 7 full refreshes within each frame. Genlocking frequently resolves this problem, but not always. Firmware updates may be necessary to enable the LED panel's receiving card to fire in a different pattern under certain conditions.
The next step is to utilize newer panel technology. There is a noticeable difference between the older BP2 and the Black Pearl 2 Version 2 (BP2V2) or the ICON panel: both of the newer screens have a 7680 Hz refresh rate and a scan rate of 8. Since production is shooting at the same 24 fps, we can apply the same method.
BP2V2/ICON = (7680 / 8) / 24 = 960 / 24 = 40
As seen, the entire panel now refreshes 40 times within each camera frame. This leaves a very slim possibility of detecting the scan lines, because the full panel refresh is much faster than the camera's frame rate.
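The calculation above can be summarized in a short sketch; the panel figures are the ones quoted in this section, and the frame rate is the same 24 fps.

def multiplexing(refresh_hz, scan_rate, frame_rate_fps):
    """Full-panel refreshes that occur within a single camera frame."""
    full_refreshes_per_second = refresh_hz / scan_rate
    return full_refreshes_per_second / frame_rate_fps

# Refresh rate (Hz) and scan rate for the panels discussed above, shot at 24 fps.
for name, refresh, scan in [("Older BP2", 1920, 11), ("Newer BP2", 3840, 11), ("BP2V2/ICON", 7680, 8)]:
    print(f"{name}: {multiplexing(refresh, scan, 24):.3f} full refreshes per frame")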
DISCOVERING SCAN RATE
This process is fairly straightforward. To calculate how many rows of LEDs refresh at once, we need to know the pixel count and the scan rate (how many groups the panel's refresh is broken into).
The BP2V2, a 2.84 mm pixel pitch panel, is 176 pixels tall by 176 pixels wide with a scan of 8, so the math looks like this:
176 x 176 = 30,976
30,976 / 8 = 3872
3872 / 176 = 22
22 / 2 = 11 rows of pixels per group
The ICON, a 2.6 mm pixel pitch panel, is 192 pixels tall by 192 pixels wide.
192 x 192 = 36,864
36,864 / 8 = 4,608
4,608 / 192 = 24
24 / 2 = 12 rows of pixels per group
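The same arithmetic can be mirrored in a small sketch. The panel dimensions are those quoted above, and the final halving simply follows the worked examples.

def rows_per_group(width_px, height_px, scan):
    """Rows of LEDs refreshed together, following the worked examples above."""
    pixels_per_group = (width_px * height_px) / scan   # pixels refreshed as one group
    rows = pixels_per_group / width_px                 # full rows in that group
    return rows / 2                                    # halved, as in the worked examples

print(rows_per_group(176, 176, 8))   # BP2V2 -> 11.0 rows per group
print(rows_per_group(192, 192, 8))   # ICON  -> 12.0 rows per group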
WHO DIED? (DEAD OR BURNED OUT PIXELS)
During production, there will be times when pixels burn out or stop functioning. This is fairly common and also very easy to fix. In the event that this happens, the module should be removed and replaced with a new one. Repairs to the dead pixel can either be made on site by an LED technician skilled in re-soldering LEDs, or the module can be sent out for repair. Since shipping modules for off-site repair should be done in batches rather than individually, it is important to have a sufficient supply of additional modules if your show chooses this method.
Production Metrics
Productions utilizing an LED Volume like the one at Stage 15 will find many efficiencies:
Shoot Days: Production could consolidate total days of principal photography due to the following:
- Fewer travel days to different locations
- Fewer company moves
- Less practical lighting setup time
- Fewer practical set builds, and less setup time due to smaller sets
- More thorough location scouting, since it can be done virtually
Crew: Head count and workday will change in the following ways:
- The ASVP crew (AKA the Control Team) consists of about 8 members. This team runs alongside, and is in constant communication with, the traditional film crew.
- Certain members of the Control Team will require a 1-hour pre-call daily to set up the stage, load the systems, and warm up the panels.
- Many crews scale down when shooting on the volume. The departments seeing the biggest reductions are Grip and Lighting, due to fewer practical lights being required.
- Typically a smaller on-set VFX crew is required, due to the ICVFX methodology.
Carbon Footprint: Virtual Production has significant upsides in reducing the carbon footprint of a particular show. Here are a few of the statistics: