
My First Mixed Reality Shoot - Recorded Live at ARRI Studio London, 7 July 2021.

Eric Hasso's reaction.

The discussion covers using LED walls as the screen technology for mixed reality shoots.

Eric Hasso, an expert in using projectors for photo-real mixed reality shoots, wants to bring that technology into the discussion.

 

Join Greig Fraser ASC, ACS, Martin Ruhe ASC, Magdalena Gorka PSC, DNEG’s Paul Franklin and Robert Payton in this must-watch discussion about their early experiences shooting in mixed reality studios, together with Eric Hasso's (Projection FX Producer) responses to the panel discussion.

 

In italics are quotes from the panel discussion.

In bold are Eric’s responses to the quotes.

 

“None of the LEDs that filmmakers are using are being made for filmmakers.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=1020

 

Greig Fraser ASC, ACS, Cinematographer – The Mandalorian

 

  • As Greig is pointing out, the LEDs of today are not being built for filmmakers. What that means is that the color rendition, the light output, the contrast, and the viewing angles are limiting the creative process. 

  • The Barco 3-chip projectors we work with at Igelkott Studios are perfect for filmmakers, because ever since the digital revolution projectors have been pushed to reproduce images captured and delivered by filmmakers at the highest possible quality. 

  • Now, in 2021, we are able to display and capture high-quality imagery in camera when we project the world behind the actors and the set.

 

“The path needs to stay exactly the same as it has been for the last 100 years. There has to be a script, there has to be a discussion with the designer, the post supervisor, the director and the DP about how we are going to shoot something. Are we going to shoot in a volume? Is the design conducive to shooting in a volume?”

Link with time - https://youtu.be/-T8B2LcMLNo?t=1127 

 

Greig Fraser ASC, ACS, Cinematographer – The Mandalorian

 

  • Agreed.

“Screens have moiré, if you focus on them they have moiré.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=1221

 

Greig Fraser ASC, ACS, Cinematographer – The Mandalorian

 

  • First, moiré is an artifact that destroys the image you are capturing, and no, that is only true for technologies like LED. With projection we achieve an image with no separated pixels, because the pixels shift in the system we use: they move fast enough to resolve the image at 3840x2160, giving the DoP the freedom to put the focus on the screen/wall.

 

Projected image of Manhattan. Zoomed in, you start to resolve the structure of the wall before you resolve any pixels.


“Do you have any valuable lessons about synching the camera with the wall?”

Link with time - https://youtu.be/-T8B2LcMLNo?t=1394

 

Zoe Mutter, editor of British Cinematographer magazine

 

  • It’s very important to sync; if you don’t, you will experience ghosting and/or tearing artifacts.

  • The computer’s video/graphics card and the camera need to be in sync, and only the ARRI Mini LF has a sync-in port (Sensor Genlock, as of Dec 13, 2021) that can offset the delay in the signal chain by +/- a frame (20 ms). 

  • The offset capability is important because the latency of the projector or camera may cause the shutter pulse timing to be wrong even though they both cue from the same generator (a sketch of the offset arithmetic follows below). 
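A minimal sketch of that offset arithmetic, assuming a hypothetical 50 fps shooting rate (20 ms frame period, matching the offset range above) and a 7 ms chain delay; these are illustrative values, not measurements from the panel:

    # Minimal sketch, not a real sync tool: camera and display both follow the
    # genlock pulse, but processing delay in the chain shifts the picture relative
    # to that pulse. The sync-in offset compensates the sub-frame part of the
    # delay; whole frames of delay cannot be hidden this way.
    def required_offset_ms(chain_delay_ms: float, frame_ms: float) -> float:
        """Sub-frame offset (within +/- half a frame) that realigns shutter and picture."""
        offset = chain_delay_ms % frame_ms
        return offset if offset <= frame_ms / 2 else offset - frame_ms

    frame_ms = 1000.0 / 50                      # 20 ms frame period at 50 fps (assumed rate)
    print(required_offset_ms(7.0, frame_ms))    # 7.0 -> shift the shutter by 7 ms
    print(required_offset_ms(27.0, frame_ms))   # 7.0 -> the extra whole frame remains as latency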

 

“It takes a little while for the computer to figure out where the camera is in the room, what the image looks like and where to place it. This induces a delay of 6–7 frames.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=1565 

 

Paul Franklin, Creative Director of DNEG, VFX Supervisor and Director.

 

  • There are three stages that cause delay: tracking, rendering and displaying.

  • Tracking. Tracking systems use fast IR cameras; the complexity of the setup and the network speed add delay. Let's say tracking adds one frame of delay.

  • Rendering. To render frames without drops the computer has a time budget: at 24 fps that budget is about 41.7 ms per frame (1000/24). Rendering adds one frame of delay.

  • Displaying. To display one whole image and not a broken one, the media server, the PWM processing and the LED screen need to work in sync to create one cohesive picture. To be able to do this, everything is genlocked, and every step that receives the genlock signal has a frame buffer. Every frame buffer adds one frame of delay. 

  • The LED we have in our studio, which we only use as a light source, has 2–3 frames of delay. Adding the complexity of a huge screen with multiple processing points, you will add 3–5 frames of delay after the image is rendered in Unreal Engine (a rough tally is sketched after this list).

  • What can we do to reduce the latency? Change the display technology! Projectors can cover the entire wall, and they don’t need frame buffers. The image is displayed on the wall a few milliseconds after it has been received, adding no noticeable latency after the image is sent from the computer. This allows the camera operator to work naturally with the creative tools, with less than two frames of latency.
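A rough tally of those contributions, as a minimal sketch (the frame counts are the illustrative figures from the bullets above, not measurements):

    # Sum the latency contributions, expressed in frames at the shooting frame rate.
    # At 24 fps one frame is 1000 / 24, about 41.7 ms.
    def total_latency(tracking_frames, render_frames, display_frames, fps=24):
        frames = tracking_frames + render_frames + display_frames
        return frames, frames * 1000.0 / fps

    # LED wall: ~1 frame tracking + 1 frame rendering + 3-5 frames of display buffering
    print(total_latency(1, 1, 4))   # (6, 250.0) -> roughly 6 frames, 250 ms
    # Projection: the wall shows the image a few ms after it leaves the computer,
    # so essentially only tracking and rendering remain
    print(total_latency(1, 1, 0))   # (2, 83.3...) -> around two frames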


“If it doesn’t work it doesn’t look good, and if it doesn’t look good it will give Virtual Production a bad name. If it’s a big enough event, it could be negative for Virtual Production’s reputation.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=1686 

 

Greig Fraser ASC, ACS, Cinematographer – The Mandalorian

 

  • 100% agreed. Digital Virtual Production is new, there are few standards, and it needs good publicity to continue to evolve.

 

“To run the LED at a very low level when we shoot night scenes it gets a bit complicated. The LED wall projects light, and if you run it at very low percentages you might get weird effects.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=2101 

 

Martin Ruhe ASC, Cinematographer, The Midnight Sky.

 

  • Martin is touching on an aspect of LED technology that is very technical.

  • An LED wall emits red, green and blue light from diodes. To achieve the full color range of a 16 bpp (bits per pixel) LED, you need to use the full latitude of each pixel’s brightness. That latitude is 32 levels of red, 64 of green and 32 of blue.

  • When you use the entire brightness latitude you can achieve 65,536 colors in each pixel group. But if you cut the brightness by 50% because it is too bright, you only have 8,192 left.

  • Now 50% brightness is still too much for projecting shadows, and when the shadows go below 25% brightness you have at most 1,024 colors left to render the scene (the arithmetic is sketched after this list).

  • That causes weird effects, because “true color” is considered to start at 16,777,216 colors (24 bpp).

  • With projectors it's another story: the projectors we use do not crush the colors when they lower the brightness, because they have additional contrast elements, making night scenes possible. 

  • Also the internal color processors in the Barcos work with 68 billion colors (12 bpc).
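The arithmetic behind those numbers, as a minimal sketch assuming the 5-6-5 bit split described above:

    import math

    # 16 bpp split as 5-6-5 bits: 32 levels of red, 64 of green, 32 of blue.
    LEVELS = (32, 64, 32)

    def colors_at_brightness(fraction):
        """Colors still available when only the lowest `fraction` of each channel is used."""
        return math.prod(max(1, int(levels * fraction)) for levels in LEVELS)

    for fraction in (1.0, 0.5, 0.25):
        print(f"{fraction:.0%} brightness: {colors_at_brightness(fraction):,} colors")
    # 100% -> 65,536   50% -> 8,192   25% -> 1,024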

 

About LED walls: “When you’re lighting the set, you get a lot of contamination and it tends to milk them out and you lose contrast.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=2944 

 

Paul Franklin, Creative Director of DNEG, VFX Supervisor and Director.

 

  • What Paul is talking about is also an issue with projected backgrounds. LED manufacturers experiment with different materials to reduce the problem of light bouncing off the panels. 

  • At Igelkott Studios, we work with a special high-contrast paint that keeps the shadows and dark areas in the picture below the ambient light level, rather than hovering above the threshold where they would read as too bright to be believable. That, and working closely with the electrical department, gives us great results.

 

“You need to consider that building an environment (3D) takes 11–20 weeks depending on the asset you are building. It’s a lot of steps and it takes about 5 weeks for us to light it.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=3077 

 

Magdalena Gorka PSC, Cinematographer, Star Trek: Strange New Worlds

 

  • Creating 3D environments takes a lot of time.

  • At Igelkott Studios we are specialists in photo-based backgrounds, and our workflow takes two weeks from shooting the location to going into the studio and using it.

 

“All of the camera manufacturers and lens manufacturers are accepting that this is a technology that is here to stay.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=3724 

 

Robert Payton, Director and Cinematographer

 

  • It’s important to learn how to make the best of it, and ensuring that the manufacturers believe in Virtual Production is key.

 

“Shooting slow motion is not there yet because the traditional shutter rates don’t line up with the wall.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=3913 

 

Magdalena Gorka PSC, Cinematographer, Star Trek: Strange New Worlds

 

  • About slow motion:
    We did a successful test this summer (2021) where we shot at 120 fps against a background that played back at 120 fps. So it works.

  • What you need to do to make it work is – a lot.

  • The heart of the problem is that the refresh rate (Hz) of the screen and the frame rate of the camera are not always what the menus show. You need to figure out the native Hz of the display and the camera, use only those native rates, and test the equipment against each other until you find a match (see the sketch after this list). 

  • The next problem is drift, as existing genlock systems do not do 120 fps. They max out at 30 fps or 60i, and at 120 fps the camera and screen will fall in and out of sync quickly.
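A minimal sketch of the kind of compatibility check this involves; the 119.88 Hz figure below stands in for a “menu” rate that is not the true native rate, and is an illustration rather than a number from our tests:

    # A capture rate only stays clean when the display refresh is an integer
    # multiple of it, so each exposure spans whole refresh cycles; even a tiny
    # mismatch makes the relative phase drift until sync is visibly lost.
    def is_multiple(camera_fps, display_hz, tol=1e-6):
        ratio = display_hz / camera_fps
        return abs(ratio - round(ratio)) < tol

    def seconds_until_one_frame_slip(camera_fps, display_hz):
        """Time for the accumulated mismatch to reach one full cycle of drift."""
        return float("inf") if display_hz == camera_fps else 1.0 / abs(display_hz - camera_fps)

    print(is_multiple(120.0, 120.0))                     # True  - locked
    print(is_multiple(120.0, 119.88))                    # False - a "120" that is not 120
    print(seconds_until_one_frame_slip(120.0, 119.88))   # ~8.3 s until a full frame of drift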


 

“Once we have panels that are really good off axis, suddenly our limitations about putting actors closer to the wall are less limited so we can make volumes that are NOT curvature shaped. The only reason they’re curved is because everywhere you need to look with the camera needs to be as front-on to the panels as possible.”

Link with time - https://youtu.be/-T8B2LcMLNo?t=4294 

 

Greig Fraser ASC, ACS, Cinematographer – The Mandalorian

 

  • Don't use panels with diodes if you want a great viewing angle.
    The diodes are physical objects, and viewing them at an angle means the nearer ones block the light from the diodes further away, causing a color shift.

  • With a projected image the light is evenly scattered in all directions.

  • When we build the projection walls, we use plaster and a special projection paint. This paint creates better contrast, and it spreads the light and color in all directions, creating a huge viewing angle while also reducing any hotspots that can stem from using projectors.

 

/ Eric Hasso - Projection FX Producer
