
Rendering's Next Frontier: Jules Urbach Maps AI's Transformative Path for Film Production at RenderCon

The GPU revolution reshapes Hollywood's future as Jules Urbach and Ari Emanuel reveal how AI, decentralized computing, and digital humans are transforming entertainment production at the inaugural RenderCon 2025.

The first-ever RenderCon conference kicked off yesterday at Nya Studios in Hollywood, bringing together leaders from across media, technology, and art to explore the future of rendering and creative production.

Jules Urbach, founder and CEO of OTOY and creator of the Render Network, delivered the opening keynote alongside media powerhouse Ari Emanuel, Executive Chairman of WME Group and CEO of TKO, reflecting on their 21-year partnership and sharing insights on where entertainment technology is headed.

RenderCon 2025 marks a significant milestone for the Render Network, a decentralized GPU computing platform that connects node operators with artists and developers to monetize idle GPU power for intensive 3D rendering and AI model training.

The one-day conference featured an impressive lineup of speakers including Richard Kerris from NVIDIA, Emad Mostaque of Stability AI, and digital artist Beeple, among other industry luminaries.

From GPU Pioneers to AI Revolutionaries: The 21-Year Journey

"We did it. We did it. I can't believe this is real," Urbach began his keynote, reflecting on the two-decade journey that led to this moment.

The presentation revealed how Urbach and Emanuel's partnership began in 2004 when GPU technology was in its infancy and NVIDIA was worth just $3.7 billion—roughly 1/1000th of its current valuation.

Emanuel recalled his first encounter with Urbach twenty-one years earlier, when Urbach was demonstrating a visualization of a Star Trek ship navigating through a web.

That early bet on GPU-powered rendering has been vindicated over the years. Urbach recalled that many industry experts were initially skeptical, including veterans of early Pixar who insisted that ray tracing could never run on a GPU, a capability that has long since been achieved.

Asset Ownership in the AI Era: Hollywood's New Frontier

One of the most compelling insights from the keynote centered on how studios and talent can benefit from digital asset ownership and reuse. Emanuel highlighted a vision they've been promoting for years: "If you use the LightStage, you should capture all these assets and then you could continually use them... It would bring down costs... You then could use it for the movie and then simultaneously build the game around it."

This approach allows assets to be deployed across multiple productions and platforms, potentially transforming Hollywood's cost structure. Urbach noted that through OTOY's LightStage technology, they've scanned numerous actors multiple times, including Dwayne "The Rock" Johnson and Robert Downey Jr., creating the foundation for what's now becoming essential in the AI era.

Everybody also realizes they don't know how to control their destiny, but they know that they have to. And especially for actors. Owning their image and being able to control their image and their estates controlling their image into the future, I think, is crucial.

Ari Emanuel, Executive Chairman of WME Group and CEO of TKO

Decentralized Computing: The Alternative to Big Tech Data Centers

Urbach presented a compelling case for decentralized GPU computing as an alternative to massive centralized data centers. The Render Network was designed to leverage consumer GPUs distributed globally, creating what Emanuel described as "your version of Starlink for GPUs."

This approach offers several advantages:

  • Lower-end GPUs can collectively handle tasks previously requiring high-end hardware (illustrated in the sketch below)

  • Distributed computing reduces reliance on centralized infrastructure

  • Recent advancements allow AI models to run effectively on consumer-grade hardware

  • The network can potentially build "$500 billion worth of data center" capacity without the centralized investment

Urbach explained how recent developments in AI are making this approach more viable: "The whole premise of what we're doing... is that, yeah, we can run on these lower-end GPUs also made by NVIDIA mostly, but also Apple, and distribute all this compute power."
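To make the first advantage above concrete, here is a minimal Python sketch of tile-based work splitting, the general idea behind spreading one render job across many modest GPUs. The node names, throughput numbers, and proportional-assignment scheme are illustrative assumptions for this article, not the Render Network's actual scheduling protocol.

```python
# Hypothetical sketch of distributing one render job across many modest GPUs.
# Node names, teraflop figures, and the tile size are illustrative assumptions,
# not the Render Network's real protocol.
from dataclasses import dataclass

@dataclass
class RenderNode:
    name: str
    teraflops: float  # advertised throughput of this node's GPU

def split_into_tiles(width, height, tile=256):
    """Divide a frame into independent tiles that any node can render."""
    return [(x, y, tile) for y in range(0, height, tile)
                         for x in range(0, width, tile)]

def assign_tiles(tiles, nodes):
    """Give each node a share of tiles proportional to its throughput,
    so many low-end GPUs can jointly match one high-end card."""
    total = sum(n.teraflops for n in nodes)
    assignments, i = {}, 0
    for n in nodes:
        count = round(len(tiles) * n.teraflops / total)
        assignments[n.name] = tiles[i:i + count]
        i += count
    assignments[nodes[-1].name].extend(tiles[i:])  # leftovers from rounding
    return assignments

nodes = [RenderNode("laptop-3060", 13.0),
         RenderNode("desktop-4070", 29.0),
         RenderNode("mac-m2", 10.0)]
tiles = split_into_tiles(1920, 1080)
for name, share in assign_tiles(tiles, nodes).items():
    print(f"{name}: {len(share)} tiles")
```

Because every tile is independent, a scheduler like this can treat a fleet of consumer GPUs as one large renderer, which is the economic logic behind the "$500 billion worth of data center" claim.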

The Future of Content Creation: Human-Centered AI

Despite the rapid advancement of AI, both Urbach and Emanuel emphasized that human creativity remains central to their vision. Urbach addressed concerns about AI replacing artists by highlighting how industry pioneers like John Carmack and Tim Sweeney view AI as a tool rather than a threat.

Referencing a recent AI-generated Quake demo that sparked controversy, Urbach noted:

You have a fan of Quake saying, this is absolutely disgusting. And Carmack replied saying, no, it's not. I created the game. I think this is amazing. It's a step forward. These are tools that we can all use to build something even more interesting.

Jules Urbach, Founder and CEO of OTOY

The keynote demonstrated this human-centered approach through the Unification project for the Star Trek archive, which used digital prosthetics and rendering technology to create new scenes with William Shatner as Captain Kirk. Rather than replacing actors, the technology enhanced human performances while preserving the artistic vision.

Source: Star Trek Unification

Beyond the Screen: Holographic Media and Spatial Computing

Looking toward the next frontier, Urbach presented his vision for holographic media experiences that extend beyond traditional screens. The Star Trek archive project demonstrated how high-fidelity 3D assets can be experienced in spatial computing environments like Apple Vision Pro.

This approach creates several new possibilities:

  • Immersive experiences that blend digital content with physical spaces

  • Light field rendering that creates digital holograms viewable from multiple angles (a simplified sketch follows this list)

  • Interactive exploration of film sets and environments previously only seen on screen

  • Preservation of cultural and entertainment legacies in interactive formats
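For readers unfamiliar with the term, the sketch below shows the simplest form of light field rendering: interpolating a novel viewpoint from a grid of captured images. The grid layout, array shapes, and blend-only approach are simplifying assumptions; production light field pipelines, including OTOY's, reproject rays per pixel and are far more sophisticated.

```python
# Simplified, hypothetical light-field view interpolation: given a 2D grid of
# photographs taken from slightly different positions, synthesize an
# in-between viewpoint by blending the four nearest captures. Real pipelines
# also reproject rays using depth; this blend-only version just shows why
# multiple viewing angles become navigable.
import numpy as np

def novel_view(grid: np.ndarray, u: float, v: float) -> np.ndarray:
    """grid has shape (rows, cols, H, W, 3): one image per camera position.
    (u, v) is the desired viewpoint in grid coordinates, e.g. (1.3, 0.7)."""
    rows, cols = grid.shape[:2]
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    u1, v1 = min(u0 + 1, rows - 1), min(v0 + 1, cols - 1)
    fu, fv = u - u0, v - v0
    # Bilinear weights over the four surrounding cameras.
    return ((1 - fu) * (1 - fv) * grid[u0, v0]
            + (1 - fu) * fv * grid[u0, v1]
            + fu * (1 - fv) * grid[u1, v0]
            + fu * fv * grid[u1, v1])

# Toy usage: a 4x4 camera grid of 8x8 random "images".
cameras = np.random.rand(4, 4, 8, 8, 3)
frame = novel_view(cameras, 1.3, 0.7)  # viewpoint between cameras
print(frame.shape)  # (8, 8, 3)
```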

"If you want to experience what it would be like to see things in your space with the highest fidelity possible today, that's the device to do it on," Urbach said of the Vision Pro, while acknowledging the challenges of building the right pipeline for this emerging medium.

The Regulatory Horizon: IP Rights in the AI Age

The keynote also touched on the regulatory challenges facing the industry. Emanuel revealed ongoing discussions with White House officials about balancing technology innovation with intellectual property protection: "The White House is thinking about... how they balance between technology companies that are doing AI and Hollywood... and the royalty structure."

This regulatory framework will be crucial as AI continues to transform content creation, with Emanuel suggesting that a system is needed "by which everybody can benefit from it."

The Director's Cut: What Comes Next

The conversations Urbach and Emanuel initiated at RenderCon 2025 set the stage for deeper explorations of how rendering technology, AI, and decentralized computing will reshape entertainment production. Their 21-year journey from early GPU experiments to today's AI revolution offers a valuable perspective on navigating the rapid changes ahead.

For film production professionals, the key takeaway is clear: the tools and infrastructure for creating immersive, photorealistic content are becoming more accessible while simultaneously growing more powerful. The future belongs not to those who fear AI disruption, but to those who harness these technologies while keeping human creativity at the center of the process.
