Lightcraft Technology has unveiled significant updates to its Jetset system that bring advanced virtual production capabilities to mobile devices and create new workflows for both live action and animation production. The latest iteration introduces Gaussian splat integration, enabling filmmakers to load entire 3D environments directly on an iPhone while offering closed-loop camera tracking and real-time compositing without external workstations.
Gaussian splats technology is fundamentally changing how filmmakers approach location work by compressing complex 3D environments into phone-compatible formats.
Working with the scanning company XGRIDS, Lightcraft can now convert scanned real-world locations into lightweight 3D environments that run directly on mobile devices (a brief sketch of what such a scene contains follows below).
Director Roberto Schaefer used this workflow to capture locations in Rome, creating a virtual production environment that enables precise shot planning before arriving on location.
The technology bridges the gap between expensive studio-based virtual production and accessible pre-production tools, allowing filmmakers to "pre-shoot" their projects.
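To give a sense of why splat-based scenes are light enough for a phone, the sketch below inspects a Gaussian splat file and estimates its footprint. It assumes the widely used 3D Gaussian Splatting .ply layout and the open-source plyfile library; the actual formats and tooling used by Lightcraft and XGRIDS are not documented here, and the file name is hypothetical.

```python
# Minimal sketch: inspect a Gaussian splat scene before sending it to a mobile device.
# Assumes the standard 3D Gaussian Splatting .ply layout (position, spherical-harmonic
# color coefficients, opacity, per-axis scales, rotation quaternion) -- not Lightcraft's
# or XGRIDS' proprietary formats.
from plyfile import PlyData  # pip install plyfile


def splat_summary(path: str) -> dict:
    vertex = PlyData.read(path)["vertex"]
    names = vertex.data.dtype.names                 # per-splat attributes stored in the file
    bytes_per_splat = sum(vertex.data.dtype[n].itemsize for n in names)
    return {
        "splats": vertex.count,                     # one "vertex" record per Gaussian
        "attributes": len(names),
        "approx_size_mb": vertex.count * bytes_per_splat / 1e6,
    }


if __name__ == "__main__":
    print(splat_summary("rome_location.ply"))       # hypothetical scan exported from a location
```

Each record is just a position, a covariance (scales plus rotation), an opacity, and color coefficients, which is why a full scanned location can compress into something a mobile GPU can draw in real time.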
Lightcraft has coined the term "story hacking" for a workflow that fundamentally changes animation development by bringing virtual production into the process much earlier.
Director Harald Zwart used Jetset to block and shoot his entire animated feature Viqueens with live actors in just 10 days, before committing to costly animation.
The workflow combines vocal performances, mime actors for physical movement, and basic 3D environments to create a complete preview version of the film.
Integration with Wonder Studio's AI animation system turns the captured performances into precise camera data and performance references for animators, reportedly cutting two years off the traditional animation pipeline.
The approach allows filmmakers to test, edit, and refine story elements at an early stage, when changes are still inexpensive.
The new Jetset Cine feature connects professional cinema cameras with the iPhone tracking system for higher quality real-time compositing.
Live feeds from cameras such as RED can now be processed through the Accsoon SeeMo system and composited directly on the iPhone.
Timecode synchronization ensures seamless integration with post-production workflows (the sketch below illustrates the underlying frame math).
Integration with the Aximmetry broadcast compositing software enables 4K real-time rendering for final-pixel quality.
Independent filmmakers are already using these tools to create professional-looking, VFX-heavy productions with minimal crew and equipment.
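As an illustration of what timecode synchronization buys in post, the following sketch shows the basic frame arithmetic used to line up an on-set composite with the camera original. It assumes non-drop-frame SMPTE timecode at an integer frame rate; the Timecode class, the sample values, and the 24 fps rate are illustrative, not Jetset's actual implementation.

```python
# Minimal sketch of timecode arithmetic for aligning an iPhone composite with the
# cinema camera's original files in post. Assumes non-drop-frame SMPTE timecode.
from dataclasses import dataclass


@dataclass
class Timecode:
    hours: int
    minutes: int
    seconds: int
    frames: int
    fps: int = 24

    def to_frames(self) -> int:
        # Total frame count since 00:00:00:00 at this frame rate.
        return ((self.hours * 60 + self.minutes) * 60 + self.seconds) * self.fps + self.frames

    @classmethod
    def from_frames(cls, total: int, fps: int = 24) -> "Timecode":
        seconds, frames = divmod(total, fps)
        minutes, seconds = divmod(seconds, 60)
        hours, minutes = divmod(minutes, 60)
        return cls(hours, minutes, seconds, frames, fps)


# Hypothetical values: the offset between the camera original and the phone's
# composite for the same take, expressed in frames.
camera_start = Timecode(10, 15, 30, 12)
composite_start = Timecode(10, 15, 31, 0)
print(composite_start.to_frames() - camera_start.to_frames())  # 12 frames at 24 fps
```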
This technology represents more than a new set of filmmaking tools: it is reshaping who can create sophisticated visual content and how production processes function.
The cost barrier for professional-looking VFX is falling dramatically, allowing indie creators to compete with studio-level visual sophistication.
"Full stack creatives" who understand both the technical and the storytelling sides stand to benefit most from these integrated tools.
Traditional production roles are blurring as the technology enables smaller teams to handle work that previously required separate departments.
The next generation of filmmakers will grow up with these capabilities as their baseline, potentially leading to entirely new visual languages and storytelling approaches.
As Lightcraft founder Eliot Mack notes, "The key thing is that visual effects has historically been crazy heavy. Being able to do this means if it's lightweight, people make amazing stuff out of it."