Zero Density is showcasing its full broadcast graphics stack at NAB 2026, with demonstrations built around Reality 5, the EVO III Pro hardware platform, and the Traxis tracking ecosystem.

The preview centers on three things broadcast teams will want to test on the show floor:

  • Three real-time rendering paths inside one system: Chaos Vantage for ray tracing, NVIDIA Gaussian splatting, and Unreal Engine 5

  • AI-assisted newsroom automation inside Reality Hub, aimed at MCR and on-air graphics operations

  • The Traxis Hub Automatic Lens Profiler and automatic 3D color profiling for XR video walls

The booth is organized around seven demo pods covering Broadcast Graphics, Newsroom & MCR Workflow, On-Air Graphics Operations, Reality 5 Green Screen, Camera & Talent Tracking, Real-Time AI Processing, and AI-Assisted Workflows.

Three Rendering Paths, One Pipeline

Chaos Vantage, Gaussian splatting, and Unreal Engine 5 all run inside Reality 5 at the booth.

The headline engineering story is that Zero Density has decoupled the render engine from the rest of the graphics pipeline. Operators can choose Chaos Vantage when a scene demands physically accurate ray-traced reflections and global illumination, switch to NVIDIA Gaussian splatting for photorealistic captured environments, or stick with Unreal Engine 5 for the game-engine-native workflows the industry already builds around.

That flexibility matters because, until now, broadcast graphics teams have had to pick one engine and build their entire production around it. Pulling three paths into a single template-based system lets the same set, graphics package, and tracked camera setup drive very different looks depending on the show.
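Zero Density hasn't published the internal API, but the decoupling pattern itself is easy to sketch. The class and method names below are hypothetical, chosen only to illustrate how one engine-agnostic template could be handed to interchangeable render backends:

    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class GraphicsTemplate:
        """Engine-agnostic template: describes what to render, not how."""
        scene: str
        camera_track: str
        fields: dict  # on-air text/media fields filled by the operator

    class RenderBackend(ABC):
        """Interface each engine implements; hypothetical, not Zero Density's API."""
        @abstractmethod
        def render_frame(self, template: GraphicsTemplate, frame_no: int) -> bytes: ...

    class UnrealBackend(RenderBackend):
        def render_frame(self, template, frame_no):
            return b"..."  # game-engine-native workflow path

    class VantageBackend(RenderBackend):
        def render_frame(self, template, frame_no):
            return b"..."  # ray-traced reflections and global illumination

    class SplatBackend(RenderBackend):
        def render_frame(self, template, frame_no):
            return b"..."  # Gaussian-splat playback of a captured environment

    def run_show(template: GraphicsTemplate, backend: RenderBackend, frames: int):
        # Same template and tracked-camera data, regardless of engine.
        for n in range(frames):
            backend.render_frame(template, n)

The design choice that matters is that the template carries no engine-specific data, so switching backends never means rebuilding the graphics package.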

We previously covered how Zero Density's unified template approach changed how broadcasters build and reuse graphics packages. The multi-engine story at NAB 2026 is the logical next step: one template, multiple render backends.

Reality 5.7 Under the Hood

The Reality 5.7 release introduced the architectural pieces that make the NAB demos possible:

  • Multi-channel execution architecture: independent graphics channels can be distributed across multiple Reality Engine instances, so a single operator can drive several on-air channels from one control surface

  • NVIDIA DLSS with Ray Reconstruction: real-time ray tracing at 4K output, upscaled from lower-resolution renders while holding 50/60 fps

  • Yadif node for interlaced video: native deinterlacing, so interlaced sources can feed the real-time pipeline

  • Hardware Monitor: now included with the standard SLA

Ray Reconstruction is what makes Chaos Vantage viable as a live broadcast engine. Ray tracing has historically been too expensive to run at broadcast frame rates; DLSS-driven upscaling closes that gap by rendering at a lower internal resolution and reconstructing the final 4K frame with AI.
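The arithmetic behind that trade-off is simple to sketch. The 50% render scale below is an assumption for illustration, not a published Reality 5.7 figure:

    # Illustrative only: the render-scale value is assumed, not a published spec.
    out_w, out_h = 3840, 2160          # 4K UHD output
    render_scale = 0.5                 # e.g. a DLSS performance-style scale

    in_w, in_h = int(out_w * render_scale), int(out_h * render_scale)
    ray_traced_pixels = in_w * in_h    # pixels the ray tracer must actually shade
    output_pixels = out_w * out_h

    print(f"internal render: {in_w}x{in_h} ({ray_traced_pixels:,} px)")
    print(f"reconstructed:   {out_w}x{out_h} ({output_pixels:,} px)")
    print(f"shading cost:    {ray_traced_pixels / output_pixels:.0%} of native 4K")
    # -> 1920x1080 internally: 25% of the shading work per frame,
    #    leaving the headroom needed to hold 50/60 fps

A quarter of the per-frame shading work is the difference between ray tracing as an offline look and ray tracing as a live broadcast path.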

AI-Assisted Newsroom Workflows

Reality Hub is where the automation lives.

The Newsroom & MCR Workflow and AI-Assisted Workflows pods are built around Reality Hub, Zero Density's control layer. The AI assistance targets the repetitive operational tasks that surround on-air graphics: building rundowns, triggering templates, managing graphics channels across shows.
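Reality Hub's actual data model isn't public, but the shape of the work the AI assistance targets is easy to sketch. Every name below is invented for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class TemplateTrigger:
        """One rundown entry: which template fires, on which channel, with what content."""
        template_id: str              # e.g. a lower-third or full-screen template
        channel: str                  # on-air graphics channel to drive
        fields: dict = field(default_factory=dict)

    @dataclass
    class Rundown:
        show: str
        items: list[TemplateTrigger] = field(default_factory=list)

    # The repetitive work AI assistance targets: assembling entries like these
    # from a script, instead of an operator keying in each one by hand.
    six_pm = Rundown(show="6PM News", items=[
        TemplateTrigger("lower_third", "CH1", {"name": "Jane Doe", "title": "Reporter"}),
        TemplateTrigger("full_screen_map", "CH2", {"region": "Downtown"}),
    ])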

The company is also demonstrating automatic 3D color profiling for XR video walls, which addresses one of the most time-consuming parts of LED volume setup. Color matching between the physical camera, the LED wall, and the virtual scene typically requires manual calibration for every shoot. Automating that step removes hours of pre-production work for any show that runs an XR stage.
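Zero Density hasn't detailed its profiling algorithm, so the sketch below shows the generic core of any such tool: measure known color patches through the camera as displayed on the wall, then fit a correction transform. It assumes a simple linear 3x3 model solved by least squares; a production profiler would also handle nonlinearity, white point, and per-tile variation:

    import numpy as np

    # Each row: a patch's RGB as measured by the broadcast camera off the LED
    # wall, and the reference RGB the virtual scene intended. Values made up.
    measured = np.array([[0.82, 0.10, 0.09],
                         [0.12, 0.78, 0.15],
                         [0.08, 0.11, 0.85],
                         [0.50, 0.48, 0.47]])
    reference = np.array([[1.00, 0.00, 0.00],
                          [0.00, 1.00, 0.00],
                          [0.00, 0.00, 1.00],
                          [0.50, 0.50, 0.50]])

    # Least-squares fit of a 3x3 matrix M such that measured @ M ≈ reference.
    M, residuals, rank, _ = np.linalg.lstsq(measured, reference, rcond=None)

    def correct(rgb: np.ndarray) -> np.ndarray:
        """Apply the fitted correction to camera-measured colors."""
        return np.clip(rgb @ M, 0.0, 1.0)

The manual version of this loop, patch by patch and shoot by shoot, is exactly the pre-production time the automated profiler claims back.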

Traxis Hub Automatic Lens Profiler

Lens calibration is the other time sink getting automated.

The Camera & Talent Tracking pod features the Traxis Hub Automatic Lens Profiler. Lens profiling, which characterizes a lens's distortion, field of view, and focus behavior so the tracking system can match virtual elements to the real camera, has historically been a manual, technician-led process.
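The Traxis profiler's internals aren't public, but the manual process it replaces is classic camera calibration. A minimal sketch using OpenCV's standard checkerboard workflow, with a placeholder board size and file path; a broadcast tool would also profile across the full zoom and focus range, which this single-state pass does not:

    import glob
    import cv2
    import numpy as np

    PATTERN = (9, 6)  # inner corners of the checkerboard target (placeholder)

    # 3D positions of the corners in the board's own coordinate frame.
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for path in glob.glob("calib_frames/*.png"):   # placeholder path
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Recover intrinsics (focal length -> field of view) and distortion terms.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)

    fov_h = 2 * np.degrees(np.arctan(gray.shape[1] / (2 * K[0, 0])))
    print(f"reprojection error: {rms:.3f} px, horizontal FOV: {fov_h:.1f} deg")
    print("distortion (k1 k2 p1 p2 k3):", dist.ravel())

Collecting those frames and re-running the fit for every lens swap is the technician-led pass the Traxis Hub tool automates away.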

Automating it means productions can swap lenses during a shoot without stopping for a long calibration pass. For news operations running multiple studios and frequent camera reconfigurations, that kind of automation is the difference between a virtual set being practical daily infrastructure and something reserved for tentpole shows.

What to Watch For on the Floor

The pattern across Zero Density's NAB 2026 showing is automation of the parts of virtual production that used to require specialist labor: lens profiling, color calibration, rendering engine selection, newsroom graphics operations. Our Real-Time AI Video newsletter traced a similar arc across the broader industry, and the Zero Density preview is a concrete example of those ideas landing in shipping broadcast tools.

For broadcasters evaluating LED volumes and virtual studios, the booth is built to demonstrate that ray tracing can run live at broadcast frame rates, that graphics teams can swap render engines without rebuilding template packages, and that XR stage and lens-change setup times have dropped enough to make virtual sets viable for daily shows.
