In the latest episode of Denoised, hosts Addy Ghani and Joey Daoud dive into three significant developments affecting media and entertainment professionals. Adobe introduces a sophisticated computational photography app, major filmmakers return to traditional film for blockbuster productions, and advances in AR technology show promising steps toward practical applications. Here's our breakdown of the key discussions from this information-packed episode.
Adobe's Project Indigo Elevates Mobile Photography
Adobe has entered the computational photography space with Project Indigo, a new iPhone camera app that aims to bridge the gap between smartphone photography and professional camera systems. When the shutter is pressed, the app captures a burst of frames and merges them using computational processing and AI to produce enhanced images with a more natural look.
Joey explained that Indigo addresses several limitations of smartphone photography:
It reduces the typical "smartphone camera look" through AI processing
It saves images as DNG files for advanced editing in Lightroom
It incorporates AI-powered denoising for improved low-light performance
It enhances zoom quality beyond what the physical camera lens supports
It includes a built-in "Remove Reflections" feature from Lightroom
Addy noted that this represents Adobe's strategic expansion beyond its current Firefly generative AI tools into computational photography, an area where Google and Apple have been investing heavily. The hosts discussed how this approach uses software to overcome the physical limitations of small smartphone sensors, which are inherently constrained in color performance, signal-to-noise ratio, and dynamic range.
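The episode doesn't detail Indigo's internals, but the core burst-merge idea is easy to illustrate: averaging N aligned frames cuts random sensor noise by roughly the square root of N, which is where much of the low-light and dynamic-range gain comes from. A minimal sketch of that principle (illustrative only, not Adobe's pipeline):

```python
# Minimal sketch of burst merging, the core idea behind computational
# photography apps like Project Indigo (illustrative only, not Adobe's code).
# Averaging N aligned frames reduces random sensor noise by roughly sqrt(N).
import numpy as np

rng = np.random.default_rng(0)

scene = rng.uniform(0.2, 0.8, size=(256, 256))   # stand-in for the true scene
n_frames, noise_sigma = 16, 0.05

# Simulate a burst: the same scene plus independent sensor noise per frame.
burst = scene + rng.normal(0.0, noise_sigma, size=(n_frames, 256, 256))

merged = burst.mean(axis=0)                      # naive merge (real apps align first)

print("single-frame noise:", np.std(burst[0] - scene))   # ~0.050
print("merged noise:      ", np.std(merged - scene))     # ~0.050 / sqrt(16) ≈ 0.0125
```

A real pipeline also aligns frames, rejects ghosting from motion, and tone-maps the merged result, but the signal-to-noise improvement rests on the same averaging principle.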
The conversation touched on how this continues the broader trend of separating image acquisition from processing, similar to how Blackmagic's workflow separates camera operation from post-production enhancement in Resolve.
Key takeaways:
Project Indigo gives users more control and professional options in mobile photography
Adobe is positioning this as part of their creative ecosystem that connects to Lightroom
The app represents Adobe's response to changing market trends as more photographers rely on mobile devices
Film Makes a Comeback with Jurassic World
The hosts discussed Kodak's recent announcement that the upcoming Jurassic World film will be shot on traditional film stock. Kodak released a behind-the-scenes video featuring director Gareth Edwards and other production team members explaining their choice to use film rather than digital cameras.
The conversation highlighted several interesting points about this decision:
The production team cited the immediate visual quality of film dailies compared to digital footage that requires grading
VFX supervisors discussed the challenges of integrating digital elements with film plates, which carry natural characteristics like chromatic aberration, vignetting, and film grain (a matching step sketched after this list)
The hosts noted the irony that Gareth Edwards previously shot The Creator on the consumer-level Sony FX3 camera, and is now working with high-end Panavision film cameras
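That plate-matching point is easy to see in a toy example: a pristine CG element looks pasted onto a film plate unless the plate's own imperfections are reintroduced. The sketch below is illustrative only, using a synthetic render rather than a real scan, and is not a production pipeline:

```python
# Toy plate-matching sketch: degrade a clean CG render so it shares the
# film plate's character (vignette, film grain, slight chromatic aberration).
# Illustrative only; real VFX pipelines measure these from scanned plates.
import numpy as np

h, w = 480, 640
cg = np.full((h, w, 3), 0.7)                 # stand-in for a clean CG element

# Vignette: darken toward the corners with a radial falloff.
yy, xx = np.mgrid[0:h, 0:w]
r = np.hypot((xx - w / 2) / (w / 2), (yy - h / 2) / (h / 2))
vignette = np.clip(1.0 - 0.35 * r**2, 0.0, 1.0)
cg *= vignette[..., None]

# Chromatic aberration: shift the red and blue channels a couple of pixels apart.
cg[..., 0] = np.roll(cg[..., 0], 2, axis=1)  # red shifted right
cg[..., 2] = np.roll(cg[..., 2], -2, axis=1) # blue shifted left

# Film grain: random noise added on top of the shaded element.
rng = np.random.default_rng(1)
grain = rng.normal(0.0, 0.02, size=(h, w, 1))
cg = np.clip(cg + grain, 0.0, 1.0)

print("matched element value range:", cg.min().round(3), "-", cg.max().round(3))
```

In practice these parameters are measured from the scanned plate and the lens data rather than picked by hand.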
Addy raised important questions about the practical challenges of shooting on film for a VFX-heavy production like Jurassic World:
The need to scan film stock into digital formats for VFX pipelines
Additional costs associated with film labs and scanning processes
The time considerations for these additional workflow steps
The hosts also discussed the broader trend of film's resurgence, noting that Leica is introducing a new 35mm film stock for still photography, and other companies are entering the film production space.
Key takeaways:
Major studios are willing to invest in film for tentpole franchises despite the additional costs
Film choice may be partly motivated by nostalgia for the original Jurassic Park (shot on film)
As Addy pointed out, "shooting on film is not going to save you from a bad movie"
AR Advances at AWE: Snap's Spectacles Lead the Way
Joey shared his first-hand experience at the Augmented World Expo (AWE) in Long Beach, where Snap's Spectacles AR glasses were prominently featured. The current Spectacles are aimed at developers, but Snap announced a consumer version coming next year.
Joey's hands-on experience revealed several impressive aspects of the technology:
The glasses are self-contained with no external cables, making them comfortable to wear
They can map surfaces and project digital content onto the real world
Hand tracking allows for intuitive interaction through pinches, points, and palm gestures
The tracking is responsive, with digital elements staying properly anchored in the physical world
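The episode doesn't go into Snap's tracking stack, but the "stays anchored" behavior Joey describes comes down to storing content in world coordinates and re-expressing it in view space through the current head pose every frame. A minimal sketch of that idea (hypothetical math, not Spectacles' actual API):

```python
# Minimal world-anchoring sketch: content stored in world space stays put
# as the head pose changes, because each frame we re-express it in view space.
# Hypothetical math for illustration; not Snap's actual tracking API.
import numpy as np

def pose_matrix(yaw_rad: float, position: np.ndarray) -> np.ndarray:
    """4x4 head pose (world-from-head): a yaw rotation plus a translation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    m = np.eye(4)
    m[:3, :3] = [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    m[:3, 3] = position
    return m

anchor_world = np.array([0.0, 1.5, -2.0, 1.0])   # content pinned 2 m ahead in the room

for step, (yaw, pos) in enumerate([(0.0, [0.0, 1.6, 0.0]),
                                   (0.3, [0.2, 1.6, 0.0]),
                                   (-0.2, [0.5, 1.6, 0.1])]):
    world_from_head = pose_matrix(yaw, np.array(pos))
    head_from_world = np.linalg.inv(world_from_head)
    anchor_in_view = head_from_world @ anchor_world
    print(f"frame {step}: anchor in view space = {anchor_in_view[:3].round(3)}")
```

The anchor's world position never changes; only its view-space coordinates do, which is why the content appears fixed in the room as the wearer moves.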
However, Joey noted limitations, such as a narrower field of view than immersive VR headsets, which means turning your head to see content that would otherwise sit in your peripheral vision. The demos he experienced were primarily games rather than productivity applications.
Addy provided technical context about the challenges facing AR hardware development:
Most AR glasses are powered by Qualcomm Snapdragon chips
Miniaturization remains a significant hurdle
Developers must balance weight constraints, battery limitations, and performance expectations
The conversation expanded to include RP1's "metaverse browser" for the spatial internet, which aims to map the physical world one-to-one with a virtual metaverse. This could enable AR applications like in-store navigation or finding products in retail environments when paired with AR glasses.
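The one-to-one mapping RP1 describes ultimately means the physical and virtual worlds share a coordinate system. As a rough illustration of that idea, the sketch below converts GPS coordinates into a local east/north frame so an AR marker could be placed a known number of meters from a reference point; the coordinates and the simple equirectangular approximation are hypothetical, not RP1's system:

```python
# Rough illustration of a shared physical/virtual coordinate frame:
# convert latitude/longitude into local east/north meters around an origin.
# Equirectangular approximation with hypothetical coordinates; not RP1's system.
import math

EARTH_RADIUS_M = 6_371_000.0

def to_local_meters(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Approximate east/north offset (in meters) of a point from the origin."""
    lat0 = math.radians(origin_lat_deg)
    east = math.radians(lon_deg - origin_lon_deg) * EARTH_RADIUS_M * math.cos(lat0)
    north = math.radians(lat_deg - origin_lat_deg) * EARTH_RADIUS_M
    return east, north

store_entrance = (33.7701, -118.1937)    # hypothetical origin (Long Beach area)
product_shelf = (33.77012, -118.19365)   # hypothetical anchor a few meters away

east, north = to_local_meters(*product_shelf, *store_entrance)
print(f"place AR marker {east:.1f} m east, {north:.1f} m north of the entrance")
```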
Key takeaways:
Snap is advancing AR technology with comfortable, self-contained glasses
Hardware limitations remain a challenge for widespread adoption
Practical applications like navigation could drive future consumer interest in AR
Conclusion
The latest developments in computational photography, traditional filmmaking, and augmented reality represent different approaches to advancing visual storytelling and media production. Adobe's Project Indigo shows how software can compensate for hardware limitations, while the return of major studios to film reminds us of the enduring qualities of analog technologies. Meanwhile, advances in AR glasses point toward a future where digital and physical worlds blend more seamlessly.
For media and entertainment professionals, these trends highlight the importance of understanding both cutting-edge technologies and traditional techniques. The industry continues to evolve in multiple directions simultaneously, offering creators more tools and approaches to achieve their vision.