
Inside the Virtual Production of Crossfire: Sierra Squad with ARwall

🌐 3D monitor with no glasses, the future of XR DR design, and more!

Welcome to VP Land - your daily update for all things virtual production & AI in video.

In this issue:

🎮 Filming inside a video game with ARwall and Soapbox
🖥️ Acer’s new 3D monitor
🧑🏻 Make ChatGPT more personalized

And more! Let’s dive in.

PRINCIPAL PHOTOGRAPHY

Behind the Scenes with Crossfire: Sierra Squad

Crossfire: Sierra Squad is a new game developed by Smilegate Entertainment for the Sony PlayStation VR2 headset.

To capture the unique feel of the virtual reality gameplay, the creative agency behind the project (Ayzenberg/Space.camp) turned to virtual production to create a spot that places live actors inside the in-game world.

Filmed at Soapbox Films in Burbank, California, the trailer was shot against a 30x12ft, 1.5mm pixel pitch, 6K Absen AX Pro LED video wall powered by ARwall's ARFX Pro Plugin for Unreal Engine and ARFX Pro Server System (check out our interview with ARwall at NAB), with an HTC Vive handling camera tracking.
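
For readers curious what "camera tracking" is doing here: a tracker mounted to the physical camera streams a pose every frame, and a fixed calibration offset (from the tracker puck to the lens) maps that pose onto the engine's virtual camera. The sketch below is a generic, plain-Python illustration of that mapping; the function names and numbers are invented for the example and are not part of ARwall's ARFX pipeline.

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def tracker_to_camera(tracker_pos, tracker_quat, offset_pos, offset_quat):
    """Apply a fixed calibration offset (tracker puck -> lens) to a tracked
    pose, returning the virtual camera position and rotation matrix."""
    R_t = quat_to_matrix(tracker_quat)
    R_o = quat_to_matrix(offset_quat)
    cam_pos = np.asarray(tracker_pos) + R_t @ np.asarray(offset_pos)
    cam_rot = R_t @ R_o
    return cam_pos, cam_rot

# Example: one tracked pose per frame, plus a measured puck-to-lens offset.
pos, rot = tracker_to_camera(
    tracker_pos=[0.2, 1.5, -3.0],        # meters, tracking-volume space
    tracker_quat=[1.0, 0.0, 0.0, 0.0],   # identity orientation
    offset_pos=[0.0, -0.1, 0.15],        # puck sits above and behind the lens
    offset_quat=[1.0, 0.0, 0.0, 0.0],
)
print(pos)
print(rot)
```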

A few production highlights:

  • The project timeline was just a few weeks; production itself took only a handful of days.

  • Location scouting was done virtually inside the game's worlds.

  • This was director Jonny Zeller's first experience with virtual production.

  • The team used practical effects, including synced lighting systems, a treadmill, and a turntable, along with dirt explosions and fog.

We got a chance to talk to director Jonny Zeller, DP David Klassen, and ARwall's Rene Amador about their experience working with virtual production to create the spot (below is an excerpt; you can read the full interview here).

What was the prep process like and how was it different from how you tackle a green screen or on-location shoot?

Zeller: One of my biggest pet peeves about green/blue screen is that wrapping on set is only half the game. There’s so much post-work to actually finish the job and that can be terrifying. I love working with VFX so virtual production gives me the best of both worlds. I get to utilize the effects I want without having the unknowns of dealing with it down the line. I also feel like I get better performances when the actors can “feel” the world they’re working in.

Knowing how the talent was going to live in the space helped me design the blocking and camera positions. Then I thought about how to push it further and make the whole piece feel “bigger.”

We used every inch of our 30’ LED wall. ARwall was great about listening to my vision and then bringing their own sense of artistry to help the piece come to life.

Source: ARwall

Klassen: With greenscreen shoots, many creative choices are deferred to the editing phase, meaning certain elements may remain undecided until post-production.

In contrast, working with LED screens necessitates meticulous pre-production planning, leaving no significant aspects open to interpretation on the day of shooting. This shift towards enhanced collaboration fosters the development of highly innovative and imaginative solutions. The increased involvement of the entire team, including the cinematographer, propels the creative process forward, leading to a better final product.

What was the virtual scouting process like inside the game?

Zeller: I've always enjoyed physically scouting locations for their energy and real-time visualization. But virtual scouting offered a new way to explore the diverse settings in the game. I absolutely loved it. I could “fly” around the massive levels from the game and pick the locations with the right aesthetic. Seeing the digital world like that helped shape my shot list and opened up my mind to new ideas that I would not have thought about had I not been able to see them prior to shooting.

During the tech rehearsal day, I could work with DP David Klassen to pick the specific frame, lensing, etc.

We moved the sun around, turned on virtual lights, knocked out walls, and plugged in a few explosions. Now I’m hooked and don’t want to do it any other way.
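
(Tangent for the technically curious: tweaks like moving the sun can be scripted right in the Unreal Editor. The interview doesn't say how ARwall actually drove these changes during the scout, so treat the snippet below purely as an illustrative sketch; the editor Python calls are our assumption, not their workflow.)

```python
# Illustrative only: a "move the sun" tweak via Unreal's editor Python API.
# This is not necessarily how the Crossfire team did it; API usage assumed.
import unreal

# Find the level's directional light (the "sun") among the loaded actors.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.DirectionalLight):
        # Swing the sun to a low, late-afternoon angle for this look.
        actor.set_actor_rotation(
            unreal.Rotator(roll=0.0, pitch=-15.0, yaw=120.0), False
        )
        break
```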

Did you hit any roadblocks or limitations with VP, and how did you work around or overcome them?

Klassen: One prominent challenge we faced was the considerable restriction on camera movement imposed by the wall. Excessive panning or tilting could potentially reveal the edges of the LED wall, disrupting the illusion of the virtual environment. This restriction limited our ability to execute sweeping or rotating shots practically.

To address this issue, we collaborated closely with our skilled Unreal technicians and were able to integrate the camera movements directly into the digital environments. We then complemented those movements with real world lighting changes, such as sweeping them around or dimming them up and down. This approach enabled us to achieve the desired effect, allowing our talent to rotate indefinitely within the virtual space.
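
The workaround Klassen describes boils down to budgeting rotation: keep the physical pan small enough that the wall's edges never enter frame, and let the virtual environment supply the rest of the apparent motion. Here's a toy sketch of that split; the numbers and the helper function are purely illustrative, not their actual Unreal setup.

```python
def split_pan(desired_yaw_deg, max_physical_pan_deg=10.0):
    """Split a desired camera yaw between the physical camera (limited so the
    LED wall's edges never enter frame) and the virtual environment, which
    counter-rotates to make up the difference."""
    physical = max(-max_physical_pan_deg,
                   min(max_physical_pan_deg, desired_yaw_deg))
    # Rotating the world the opposite way reads on camera as the camera turning.
    virtual_world_yaw = -(desired_yaw_deg - physical)
    return physical, virtual_world_yaw

# A 90-degree "orbit" of the talent: the camera only pans 10 degrees for real;
# the digital environment supplies the remaining 80 degrees of apparent motion.
for target in (0, 30, 60, 90):
    print(target, split_pan(target))
```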

This is just an excerpt from our interview. Read the full interview on our website:

Sponsor Message

Safeguard your video content with Vestigit.

Vestigit’s innovative solutions, including Vestigit AI, Vestigit Sentinel AI, and Vestigit Watermark, provide robust protection against video content piracy and track illegal streams.

Don't let your valuable content fall into the wrong hands. With Vestigit, you can ensure your content remains secure and in control.

Experience the power of Vestigit’s platform firsthand - contact Vestigit now for a free demo.

SECOND UNIT

Acer's new 3D monitor (no glasses required)

Source: Acer

Acer's SpatialLabs View Pro monitor, part of the SpatialLabs View Series, delivers stereoscopic 3D without glasses or a headset. Features such as eye-tracking and AI-accelerated content conversion give users a new level of immersion and versatility.

  • The monitor delivers enhanced image quality and color precision, making it an ideal tool for professionals in fields such as graphic design, photography, and video production.

  • A set of stereo cameras and dual image sensors track the viewer's eye and head movement, allowing the display to adapt to their gaze and perspective.

  • The View Pro supports gaming with a strong sense of depth, though the 3D up-conversion used for compatibility with some games can occasionally reduce clarity.

  • An optical lens bonded to the panel, combined with the eye-tracking system, generates a stereoscopic 3D image that adjusts to the viewer's eye movements (see the sketch after this list for the general idea).

  • The monitor's AI acceleration converts 2D content to stereoscopic 3D in real time, bringing new life to existing media.

  • The SpatialLabs View Pro can effortlessly switch between 2D and stereoscopic 3D modes, providing flexibility for the user's viewing preferences.
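
Acer hasn't published the math behind the View Pro, but head-tracked autostereoscopic displays generally follow the same recipe: derive two eye positions from the tracked head, then render each eye through an off-axis (asymmetric) frustum aimed at the fixed screen plane. The sketch below is a generic illustration of that idea, with all numbers invented for the example.

```python
import numpy as np

def eye_positions(head_pos, ipd=0.063):
    """Left/right eye positions (meters, screen-centered coordinates) from a
    tracked head position, assuming the eyes sit half an IPD to either side."""
    head = np.asarray(head_pos, dtype=float)
    half = np.array([ipd / 2.0, 0.0, 0.0])
    return head - half, head + half

def off_axis_frustum(eye, screen_w, screen_h, near=0.01):
    """Asymmetric frustum bounds (l, r, b, t at the near plane) for an eye
    looking at a screen centered on the origin in the XY plane."""
    ex, ey, ez = eye  # ez is the eye's distance from the screen plane
    l = near * (-screen_w / 2.0 - ex) / ez
    r = near * ( screen_w / 2.0 - ex) / ez
    b = near * (-screen_h / 2.0 - ey) / ez
    t = near * ( screen_h / 2.0 - ey) / ez
    return l, r, b, t

# A viewer about 60 cm from a 15.6-inch panel (~0.34 x 0.19 m), slightly off-center.
left_eye, right_eye = eye_positions([0.05, 0.0, 0.6])
for name, eye in (("left", left_eye), ("right", right_eye)):
    print(name, off_axis_frustum(eye, 0.34, 0.19))
```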

While it offers many advantages, the SpatialLabs View Pro does have a few drawbacks, including limited availability of native stereoscopic 3D content, a high price tag due to its advanced technology, and potential eye strain from the immersive experience.

Regardless, it remains a unique offering in the visual technology market for both professional and personal use.


ABBY SINGER

Thanks for reading VP Land!

Have a link to share or a story idea? Send it here.

Interested in reaching media industry professionals? Advertise with us.
