At Amazon MGM Studios' Stage 15 in Culver City—home to North America's largest virtual production LED wall—director Jon Erwin stood before an audience of filmmakers and AI startup founders on October 13, 2025, and issued a challenge: "I dare you to figure out which shots are AI."

The occasion was the Culver Cup, AWS's Gen AI Film Showcase, where Erwin and Chris del Conte, Amazon MGM Studios' global head of VFX, revealed production details that illustrated how rapidly AI integration has scaled in professional filmmaking. When House of David Season 2 premiered on Amazon's Wonder Project streaming service in October 2025, audiences watched 100,000 warriors charging into battle, horses galloping directly at the camera, and an entire episode unfolding within a single epic conflict.

What they couldn't easily detect: 253 of those shots were generated using AI—a massive jump from Season 1's 73 AI-generated shots, achieved in less than a year as the technology and workflows matured.

The production developed what Erwin calls a "hybrid" approach that seamlessly blends AI-generated content with live-action photography and traditional VFX, making individual techniques nearly impossible to distinguish.

Building Fidelity Through The Hybrid Workflow

The jump from 73 shots to 253 wasn't just about using more AI—it required solving fundamental production challenges that prevented AI-generated content from holding up at broadcast quality.

Season one, we were kind of swimming. You guys were coming up with solutions for shots that might never have been attempted traditionally. Season two, you actually planned for Gen AI to be the workflow.

Chris del Conte, Global Head of VFX, Amazon MGM Studios

Del Conte collaborated with Erwin's team starting in January 2025, bringing together his experience building Amazon's AWS-powered virtual production infrastructure with Erwin's vision for AI-integrated storytelling.

That planning included developing what Erwin describes as three critical components: education, process, and implementation. The team trained daily on generative tools, built repeatable workflows around them, and integrated them as an "acceleration layer" within commercial production schedules.

The technical breakthrough came through style transfers—applying the show's visual aesthetic and shooting style directly onto AI-generated assets. This allowed the production to match AI content to live-action photography so closely that, as del Conte noted, "the stability in the workflow" made AI a planned production technique rather than a backup solution.

The Battle That Proved The Concept

Season 2's opening episode provided the stress test. The entire episode takes place during a massive battle featuring armies of 100,000 on each side—a scope that would typically require prohibitive budgets or significant compromises.

During the Culver Cup screening, Erwin narrated shot-by-shot which elements were AI-generated:

  • Close-ups of horses charging past the lens: AI-generated from show assets, for action too dangerous to film practically

  • Wide battle shots: practical extras blended with AI-generated crowds

  • Smoke and fire effects: notoriously difficult for AI, but improving rapidly

Erwin called out elements as shots flashed across Stage 15's massive screen, demonstrating how thoroughly integrated the techniques had become. Many shots layered traditional VFX on top of AI-generated bases, or vice versa, creating a fusion that played to each technique's strengths.

The production also tackled virtual production asset creation—generating over 150 shots' worth of environments for LED wall backgrounds. According to del Conte, traditional Unreal Engine environment builds typically require 10-12 weeks and cost between $15,000 and $200,000. The team discovered they could build structural bones in Unreal within a week, then use AI to add photorealism through style transfers, dramatically compressing both timeline and budget.
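The compression del Conte describes can be put in rough numbers. A back-of-envelope sketch using the figures quoted above; the one-week duration for the AI style-transfer pass is an assumption for illustration, since the article only gives a timeline for the structural Unreal build:

```python
# Schedule math for one LED-wall environment, using the article's figures.
TRADITIONAL_WEEKS_MIN, TRADITIONAL_WEEKS_MAX = 10, 12  # full Unreal build
STRUCTURAL_BUILD_WEEKS = 1   # "structural bones" in Unreal (quoted)
AI_STYLE_PASS_WEEKS = 1      # assumed placeholder, not a quoted figure

hybrid_weeks = STRUCTURAL_BUILD_WEEKS + AI_STYLE_PASS_WEEKS
speedup_min = TRADITIONAL_WEEKS_MIN / hybrid_weeks
speedup_max = TRADITIONAL_WEEKS_MAX / hybrid_weeks
print(f"~{speedup_min:.0f}x to {speedup_max:.0f}x schedule compression")
```

Even with a generous allowance for the AI pass, the hybrid path lands at a small fraction of the traditional build time.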

The normal blockers—the cost and the time it takes to build an environment—are going to fall away very quickly.

Chris del Conte

From Cave Paintings To Cinema-Ready

The trajectory Erwin demonstrated at the Culver Cup illustrated how rapidly the technology evolved. In June 2024, early test generations looked rough—useful perhaps for distant background elements on LED walls, but nowhere near final quality. By September 2024, the team began successfully upscaling AI content to 4K HDR for Season 1's Goliath origin sequence, a mythological backstory that played to AI's then-limitations with slow-motion, ethereal imagery.

Season 1 used AI selectively for 73 shots, primarily in sequences where the technology's characteristics fit the creative intent. Season 2 planned AI integration from the start, allowing the production to generate what Erwin calls "new principal photography"—a digital second unit running parallel to live-action shooting.

For every VFX shot in the show, we're generating 20 times that.

Jon Erwin

The production generates batches of AI content and gives it to editorial to sift through, similar to traditional footage. Only shots that make the cut get upscaled to final quality and integrated with additional VFX work as needed.
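The triage flow described above—overshoot on generations, let editorial select, finish only the keepers—can be sketched in miniature. Every function name and the selection step here are hypothetical placeholders, not the production's actual tools; the keep ratio mirrors Erwin's "20 times" figure:

```python
from dataclasses import dataclass

@dataclass
class Shot:
    shot_id: str
    resolution: str = "1080p"  # raw generations arrive below final quality
    selected: bool = False

def generate_batch(prefix: str, count: int) -> list[Shot]:
    # Stand-in for a generative pass producing many candidate shots.
    return [Shot(f"{prefix}_{i:03d}") for i in range(count)]

def editorial_review(shots: list[Shot], keep_ratio: float) -> list[Shot]:
    # Editorial sifts generations like dailies; keeping the first N
    # is a placeholder for human selection.
    for shot in shots[: int(len(shots) * keep_ratio)]:
        shot.selected = True
    return [s for s in shots if s.selected]

def upscale(shot: Shot) -> Shot:
    # Only selected shots get the costly 4K HDR finishing pass.
    shot.resolution = "4K HDR"
    return shot

# Roughly 20 generations per finished shot, per Erwin's quote.
batch = generate_batch("battle_ep201", 100)
finals = [upscale(s) for s in editorial_review(batch, keep_ratio=1 / 20)]
print(len(finals), finals[0].resolution)  # 5 "4K HDR"
```

The point of the structure is economic: generation is cheap, finishing is not, so the expensive step runs only after selection.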

Tools, Training, And The Daily Grind

Erwin's path to AI integration began with curiosity about tools his production designer was using during Season 1 filming in Greece. Within 30 minutes of sitting down to learn the basics, he felt the same excitement he had felt holding his first camera as a young filmmaker.

I called every lawyer at Amazon and just bludgeoned them until they said yes.

Jon Erwin, on getting approval for AI-generated shots

The legal and technical challenges required proving visual chain of title (similar to script copyrights) and developing reliable upscaling pipelines to 4K HDR.

The production stacks multiple AI tools together—Erwin mentioned Midjourney, Runway, Kling, Magnific, and Topaz in various contexts—combined with traditional VFX tools like Adobe After Effects and Unreal Engine. No single tool provides broadcast-ready results alone, but stacking them creatively achieves what Erwin calls "superpowers."
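The "stacking" idea—no single tool is broadcast-ready, but a chain of them can be—can be sketched as composable pipeline stages. The stage names below echo the tools the article mentions, but each body is a placeholder; nothing here calls real Midjourney, Runway, or Topaz APIs:

```python
from typing import Callable

Frame = dict  # placeholder for an image/clip payload

def tag(stage_name: str) -> Callable[[Frame], Frame]:
    # Factory for placeholder stages that just record their pass.
    def stage(frame: Frame) -> Frame:
        return {**frame, "stages": frame["stages"] + [stage_name]}
    return stage

concept_still = tag("concept")      # e.g. a Midjourney-style still pass
animate = tag("animated")           # e.g. a Runway/Kling-style motion pass
enhance_detail = tag("detailed")    # e.g. a Magnific-style detail pass
upscale_4k = tag("4k_hdr")          # e.g. a Topaz-style upscale pass
composite_vfx = tag("composited")   # e.g. an After Effects finishing pass

def stack(*stages: Callable[[Frame], Frame]) -> Callable[[Frame], Frame]:
    # The value is in the chain, not any single stage.
    def run(frame: Frame) -> Frame:
        for stage in stages:
            frame = stage(frame)
        return frame
    return run

pipeline = stack(concept_still, animate, enhance_detail,
                 upscale_4k, composite_vfx)
result = pipeline({"shot": "goliath_007", "stages": []})
print(result["stages"])
```

Treating each tool as a swappable stage is also what lets a team re-order or replace steps as the underlying models improve month to month.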

What I've learned is that the amount of time you put in matters. A lot of people wish to win, very few have a wish to prepare to win.

Jon Erwin, quoting Bear Bryant

The discoveries that went into Season 2 came from the team training together daily, approaching AI the way stunt performers approach their craft: drilling constantly before bringing those skills to a production.

The Democratization Promise

For Erwin, whose career began as a freelance camera operator at age 15 (he neglected to mention his age to ESPN at a University of Alabama football game), the parallels to digital camera democratization are obvious. The RED ONE camera and similar tools allowed filmmakers outside traditional production centers to compete at professional levels.

My last name's not Nolan or Favreau. But I can't wait to see [AI performance capture] being democratized to where emerging filmmakers can access scope and scale and their imagination is really the only limitation.

Jon Erwin

He envisions filmmakers shooting in rehearsal spaces with iPhones, piloting digital assets that provide cinema-scale production value—essentially making films before greenlight to de-risk productions and arrive on set fully prepared.

Del Conte shares the optimism about expanded access:

Shows that may have a limited budget, and these days every show has a limited budget, can now make that work and get more content on screen.

Chris del Conte

The production employed 600 people on "House of David" at a budget Erwin describes as "pretty night owl" compared to what others thought possible for the show's scope. AI didn't replace crew—it enabled the production to say yes to more ambitious storytelling.

The Bleeding Edge Continues

The Culver Cup showcase—moderated by Nikao Yang from AWS Startups—served as both a technical demonstration and a signal to the AI startup community in attendance. Following Erwin and del Conte's presentation, a second panel featuring founders from Luma AI, Krea, and Hedra explored what comes next for generative tools in production.

Both seasons of House of David are now streaming on Wonder Project, the faith-based and inspirational content service that launched October 5, 2025, on Amazon Prime Video for $8.99/month. The service was co-founded by Erwin and former YouTube and Netflix executive Kelly Merryman Hoogstraten.

Erwin acknowledges the road ahead remains uncertain:

Based off last year, I don't even know if there's any way to predict where the world's going to be in 18 to 24 months in our industry. But I can guarantee you it'll be exciting.

Jon Erwin

For filmmakers considering AI integration, his advice centers on replacing fear with curiosity: "Learn everything you can. I don't have time not to." And for Amazon MGM Studios, the willingness to let a production "bleed on the bleeding edge" provided the test case proving AI can integrate into commercial production at scale—253 shots at a time.
