
Sony's Virtual Production Toolset 3.0 Introduces Corner Shooting for LED Volumes

Sony has unveiled version 3.0 of its Virtual Production Toolset at NAB, featuring enhanced color compensation that eliminates angle restrictions when shooting LED volumes and built-in moiré detection that could transform focus pulling workflows.

The company positions itself as "the only manufacturer with a complete LED solution" that integrates cameras, wall technology, and software tools into a cohesive ecosystem.

Camera-to-Wall Synergy: Sony's solution addresses one of virtual production's persistent challenges—shooting at sharp angles.

The system demonstrates impressive color compensation capabilities with the VENICE 2 camera and VERONA LED panels, allowing directors to shoot directly into 90-degree corners without visible color shifts or seams. This solves a significant limitation in virtual production setups, where off-angle color shifts typically restrict camera positioning.

  • The technology enables smaller virtual production stages while maintaining the illusion of larger environments

  • Sony's workflow compensates in real time for color shifts that would otherwise be visible to the naked eye

  • Wall-agnostic color calibration now works with both Unreal Engine environments and 2D background plates
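Sony has not published how its compensation works, but the underlying idea can be illustrated. The sketch below is a hypothetical approach: per-channel panel gains measured at a few calibration angles are interpolated to the shot angle and divided out of the content before it reaches the wall. The `calib` table, function name, and method are assumptions for illustration, not Sony's implementation.

```python
import numpy as np

def angle_compensate(rgb, angle_deg, calib):
    """Illustrative off-angle color compensation (NOT Sony's method).

    `calib` maps viewing angle (degrees) -> measured per-channel RGB
    gain of the panel at that angle, relative to on-axis. The content
    is divided by the interpolated gain so the camera sees the
    intended color at the given angle.
    """
    angles = np.array(sorted(calib))
    gains = np.array([calib[a] for a in angles])  # shape (k, 3)
    # Interpolate the panel's expected gain at the shot angle
    g = np.array([np.interp(angle_deg, angles, gains[:, c]) for c in range(3)])
    return np.clip(np.asarray(rgb, dtype=float) / g, 0.0, 1.0)
```

In practice a real system would use full 3x3 matrices or LUTs per angle rather than independent channel gains, but the interpolate-and-invert structure is the same.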

Focus on Metadata: Sony has developed a new metadata-based approach to solving common virtual production problems.

The system now records moiré-detection data directly into camera files, providing real-time feedback to focus pullers while also preserving the information for post-production. This addresses a common issue: moiré patterns (interference patterns created when camera sensor pixels interact with LED wall pixels) often go unnoticed on small on-set monitors during production.

  • Built-in moiré detection provides real-time visual alerts (green/yellow/red) to focus pullers

  • The VENICE 2 beta records moiré metadata into X-OCN files for post-production reference

  • Sony's RAW Viewer beta software visualizes moiré occurrence throughout a clip
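Sony hasn't disclosed how its detector works, but a green/yellow/red classification could be sketched along these lines: score a frame by the share of spectral energy in a high-frequency band (where pixel-grid interference tends to concentrate) and compare it against two thresholds. The band limits, thresholds, and heuristic here are illustrative assumptions, not Sony's algorithm.

```python
import numpy as np

def moire_alert(frame: np.ndarray, warn=0.05, alert=0.15) -> str:
    """Classify moiré risk for a grayscale frame (values 0..1).

    Hypothetical heuristic: moiré shows up as concentrated energy in
    a high-frequency band of the 2D spectrum. The band and the
    `warn`/`alert` thresholds are illustrative, not Sony's values.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(frame)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - cy, xx - cx)
    # Energy in an outer frequency band, relative to total (DC excluded)
    band = (radius > 0.25 * min(h, w)) & (radius < 0.45 * min(h, w))
    total = spectrum.sum() - spectrum[cy, cx]
    ratio = spectrum[band].sum() / total if total > 0 else 0.0
    if ratio >= alert:
        return "red"
    if ratio >= warn:
        return "yellow"
    return "green"
```

A flat frame scores "green", while a frame dominated by fine stripes (the classic moiré trigger) scores "red"; per-frame results like these are what would be written into the clip's metadata.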

Virtual-Physical Bridging: The toolset creates unprecedented integration between physical and digital production elements.

Sony has implemented Live Sync, which automatically updates virtual camera parameters in Unreal Engine when physical camera settings change. This eliminates the common workflow bottleneck of manually synchronizing camera changes between the physical and virtual worlds.

  • Camera settings like aspect ratio automatically sync from physical to virtual cameras

  • PTZ cameras can now be genlock-synced and have virtual counterparts in Unreal

  • Integration with Sony's XYN AR/VR toolkit enables glassless 3D visualization for shot planning
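The structure of a Live Sync-style bridge can be sketched as a change-detection loop: poll the physical camera's state, diff it against the last pushed state, and forward only the changed fields to the engine side. The class, field names, and callback shape below are illustrative assumptions; Sony's actual protocol and schema are not public.

```python
from dataclasses import dataclass, asdict

@dataclass
class CameraState:
    """A subset of parameters such a bridge might mirror.
    Field names are illustrative, not Sony's actual schema."""
    aspect_ratio: str = "16:9"
    focal_length_mm: float = 35.0
    iso: int = 800

class LiveSyncBridge:
    """Pushes only changed physical-camera fields to the virtual camera."""

    def __init__(self, apply_to_virtual):
        self._last = {}
        # Callback into the engine side, e.g. setting properties
        # on a CineCameraActor over a remote-control API
        self._apply = apply_to_virtual

    def poll(self, physical: CameraState) -> dict:
        current = asdict(physical)
        changed = {k: v for k, v in current.items() if self._last.get(k) != v}
        if changed:
            self._apply(changed)
            self._last.update(changed)
        return changed
```

Diffing before pushing matters on set: it avoids flooding the engine with redundant updates every poll cycle while still reacting immediately when, say, the aspect ratio changes on the physical body.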

Beyond the Frame: Sony's end-to-end approach points to a future where the boundaries between virtual and physical production continue to blur.

While Sony still works with Unreal Engine rather than developing its own media server, the company has created an integrated ecosystem spanning from asset creation to final output. The color science accuracy between real and virtual cameras is particularly noteworthy, representing what Sony claims is "the first time any manufacturer has done that."

  • The system translates accurate VENICE camera sensor color science to virtual cameras

  • 3D LUTs can be shared between physical and virtual cameras for consistent looks

  • Preview options now include both SDR and HDR visualization

  • Spatial reality display enables glassless 3D visualization for pre-production blocking
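Sharing a 3D LUT between physical and virtual cameras works because a LUT is just a lattice of output colors sampled over the RGB cube; any renderer that interpolates it the same way produces the same look. The function below is a minimal sketch of standard trilinear 3D LUT sampling, not Sony's implementation.

```python
import numpy as np

def apply_lut3d(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply a 3D LUT (shape N x N x N x 3, values 0..1) to RGB pixels
    of shape (..., 3) using trilinear interpolation. A generic sketch
    of how one LUT can drive both physical and virtual outputs."""
    n = lut.shape[0]
    scaled = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.floor(scaled).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = scaled - lo
    out = np.zeros_like(np.asarray(rgb, dtype=float))
    # Blend the 8 surrounding lattice points with trilinear weights
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r = np.where(dr, hi[..., 0], lo[..., 0])
                g = np.where(dg, hi[..., 1], lo[..., 1])
                b = np.where(db, hi[..., 2], lo[..., 2])
                w = (np.where(dr, frac[..., 0], 1 - frac[..., 0])
                     * np.where(dg, frac[..., 1], 1 - frac[..., 1])
                     * np.where(db, frac[..., 2], 1 - frac[..., 2]))
                out += w[..., None] * lut[r, g, b]
    return out
```

Because both the camera's on-set pipeline and the Unreal post-process chain can evaluate the same lattice, grading decisions made against one carry over to the other without a second calibration pass.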
