In this episode of Denoised, hosts Addy Ghani and Joey Daoud unpack three significant developments in media technology: Epic Games' latest announcements at Unreal Fest, Danny Boyle's iPhone-based filming techniques for 28 Years Later, and Natasha Lyonne's clarification about her AI hybrid film project. The episode offers practical insights for creators navigating a rapidly evolving creative technology landscape.
Unreal Fest Showcases Key Updates in Engine Capabilities
Unreal Fest, Epic Games' annual developer festival, delivered a substantial collection of updates that signal where the company is steering the future of interactive media. The event, which moved to Orlando this year, follows Epic's significant legal victory against Apple over app store fees, a win that has already begun reshaping how developers can monetize their applications.
This year's fest put significant emphasis on MetaHuman technology, which has been evolving since Epic acquired 3Lateral several years ago. The latest updates bring MetaHuman creation directly into the Unreal Engine interface, eliminating the previous cloud-based workflow that required downloading gigabyte-sized assets after creation.
"Now you can just build the characters in Unreal," explained Addy. "And on top of that, the character customization is a lot easier."
Key advancements for MetaHuman include:
Direct in-engine character creation (no more cloud download workflow)
Parametric permutations for more diverse body shapes
Updated outfit and clothing system with marketplace integration
Integration with the creator economy via the Fab marketplace
The hosts noted how Epic seems to be converging with Roblox's vision of user-generated content, but approaching it from a higher quality perspective. "The two companies are coming at it from two different angles but converging in the same goal, which is like a metaverse for all that's visually breathtaking," observed Addy.
Unreal Engine 5.6 Technical Updates
Unreal Engine 5.6 includes several technical improvements focused on animation tools and performance optimization:
Hardware-based ray tracing improvements to Lumen for better frame rates
Motion Trail animation tool for easier, more visual animation workflows
Sculpt morph targets for better facial animation and muscle deformation
Capture Manager for ingesting motion capture data, effectively replacing MotionBuilder in that role
For virtual production professionals, these updates signal a continued expansion of Unreal's capabilities into territory traditionally occupied by specialized software like MotionBuilder and Maya. The Capture Manager feature allows motion capture data from systems like Vicon to flow directly into Unreal Engine without requiring intermediate software.
"5.6 is certainly a feature film tool, a AAA game tool—it's got all of that," noted Addy. "And it's free [for qualifying developers]."
These improvements create a more unified workflow for creators, reducing the need to switch between multiple software packages during production.
28 Years Later Takes iPhone Filmmaking to New Heights
Danny Boyle's upcoming sequel 28 Years Later is pushing the boundaries of mobile filmmaking, using iPhone 15s as the primary cameras. A recently released behind-the-scenes photo revealed a fascinating technical setup: a curved platform with approximately 20 iPhone rigs all pointed at a zombie actor in the woods.
Boyle describes this as a "Poor Man's Bullet Time" rig, referencing the famous effect from The Matrix, which originally required a complex array of still cameras firing in precise sequence. The iPhone approach offers significant advantages:
Digital recording allows synchronization across all cameras
Flexibility to choose the exact moment for the bullet time effect
Mobility that wouldn't be possible with traditional cinema cameras
Joey noted the historical context: "Going way back through The Matrix, that was shot on film. They had to get still cameras and trigger them in order and sync, and basically each camera was one frame."
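To make the idea Joey describes concrete, here is a minimal Python sketch: because every clip is recorded digitally and in sync, a single instant can be pulled from each camera after the fact and sequenced in camera order to produce the freeze-frame sweep. The filenames, camera count, and OpenCV-based approach below are illustrative assumptions, not the production's actual pipeline.

```python
# Minimal sketch of the bullet-time concept: each synchronized camera
# contributes one frame at the chosen instant, and ordering those frames
# by camera position creates the orbiting freeze-frame.
# Assumes hypothetical clips cam_00.mp4 ... cam_19.mp4 from the rig.
import cv2

NUM_CAMERAS = 20          # roughly the size of the rig described
FREEZE_TIME_SEC = 4.2     # the moment chosen for the sweep, picked in post

frames = []
for i in range(NUM_CAMERAS):
    cap = cv2.VideoCapture(f"cam_{i:02d}.mp4")
    fps = cap.get(cv2.CAP_PROP_FPS)
    # Seek every synchronized clip to the same instant in time.
    cap.set(cv2.CAP_PROP_POS_FRAMES, int(round(FREEZE_TIME_SEC * fps)))
    ok, frame = cap.read()
    if ok:
        frames.append(frame)
    cap.release()

# Writing the frames out in camera order yields the bullet-time sequence.
for i, frame in enumerate(frames):
    cv2.imwrite(f"bullettime_{i:02d}.png", frame)
```

Because the exact moment is chosen after the shoot, the filmmakers can scrub through the synchronized takes and freeze on whichever instant works best, which is the flexibility the hosts highlight.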
While the iPhone's small sensor presents some technical limitations regarding depth of field, the hosts discussed how this approach creates unique creative opportunities. The compact form factor enables camera positions and movements that would be impractical with traditional cinema cameras.
"You can't mount 20 cameras on a handheld rig and have your grips run and follow a zombie and do a fake bullet time in the middle of the woods. You can only do this with an iPhone," Joey explained.
The production also reportedly uses the Panasonic EVA1 cinema camera for certain shots, continuing Boyle's tradition of embracing accessible technology after shooting the original 28 Days Later on the Canon XL1.
Natasha Lyonne Clarifies AI Approach for Directorial Debut
Natasha Lyonne responded to public criticism regarding her upcoming directorial debut, which was previously reported as an "AI hybrid film." In a Variety interview, she expressed frustration over misinterpretations: "It's comedic that people misunderstand headlines so readily because of our bizarre culture of not having reading comprehension."
Lyonne clarified that the AI elements in her film will be used primarily for set extensions and background enhancements—similar to how Darren Aronofsky has approached AI integration in his recent work.
The hosts put this approach in context with fully synthetic AI filmmaking projects like Echo Hunter, which they noted is "probably the best looking consistent AI-generated film" currently available. However, they emphasized that most audiences ultimately care more about story quality than production methods:
"I think at the end of the day, the common theme is like, is it a good story? Are people gonna get into it? Most—99% of people are really not gonna care how it was made," observed Joey.
Addy concurred: "When the average moviegoer goes to the theater, buys popcorn, sits down... it better be good. Are you entertained? Did I get my money's worth?"
Conclusion
This episode of Denoised highlights the continuing integration of accessible technology into professional media production. From Epic's tools democratizing high-end digital human creation to innovative uses of consumer devices for cinema, these developments are reshaping what's possible for creators at all levels.
The discussions reveal how the lines between gaming, filmmaking, and interactive media continue to blur, creating both challenges and opportunities for media professionals. Whether you're a virtual production specialist, an independent filmmaker, or a studio executive, staying informed about these technological shifts is increasingly essential.