An AI-generated actress signing with talent agencies makes for an unsettling headline. In this episode of Denoised, hosts Addy and Joey break down three stories that matter to filmmakers and VFX teams: the buzz around AI performer Tilly Norwood, the messy realities behind Runway’s marquee partnership with Lionsgate, and practical examples of VFX artists blending AI tools with Nuke to solve real production problems.
Why Tilly Norwood set off industry alarm bells
Tilly Norwood, an AI-born influencer developed by a U.K.-based team, recently made headlines after its creators said they were in talks with talent agencies. The reaction wasn’t purely about novelty — it cut to the industry’s core anxiety: where does synthetic media sit alongside actors represented by unions, and what rights or protections should apply?
Key details and context:
The project appears to be produced by Particle6 and is pitching typical AI-driven services: avatars, AI VFX, and "drama reconstruction." The team is small but actively hiring for roles like an AI creative producer and researcher-developer.
Tilly’s public presence is modest by influencer standards — roughly 16,000 Instagram followers — but the creators marketed her as an “AI actress,” which triggered stronger reactions than if they had used “AI influencer” or “AI character.”
The conversation in Denoised focuses on intent versus execution. The creators are explicitly labeling Tilly as an art project — “She is not a replacement for a human being, but a creative work, a piece of art.” That framing echoes an established defense: AI as a new tool in the storyteller’s toolbox. But producers and creatives in the room raised practical counterpoints:
Authenticity and audience attachment often stem from human presence: live events, spontaneous reactions, and real-life vulnerability create trust and loyalty that synthetic characters struggle to replicate.
Fully synthetic personas can succeed in limited commercial campaigns or short-lived hype cycles, but long-term viability depends on narrative depth and consistent character-building — not just photoreal faces on auto-generated posts.
Investors may pour money into synthetic IP quickly, fueling short-lived booms (the Bored Ape parallel was raised) whose speculative value evaporates if the brand lacks meaningful engagement.
For filmmakers and brand managers, the takeaway is practical: synthetic characters will continue to proliferate, but differentiation will come from strong storytelling and a clear product-market fit. If a synthetic character is aimed at commerce, its creators must operate with transparent intent, strong design, and a sustainable audience plan beyond initial investor hype.
Runway x Lionsgate: a reminder that tech enthusiasm meets legal and creative complexity
When Runway announced a partnership with Lionsgate to leverage the studio’s catalog for custom AI models, the press release read like a production team’s fantasy: repurpose franchises, generate storyboards and trailers, or even re-spin an IP into a different genre (yes, someone actually suggested turning a John Wick–like franchise into a PG-13 anime in hours). The episode, though, flagged several reasons the plan is harder to execute than to announce:
Data sufficiency and variety: A studio library that spans genres is not a single “look” you can easily train a model on. Lionsgate’s catalog is diverse; it doesn’t exhibit a consistent visual language in the way an auteur studio might.
Rights complexity: Every film has unique contracts covering actors, composers, directors, and ancillary rights. Ingesting a film’s footage for training raises questions about whether the studio or the talent has authority to permit that use — a thorny legal challenge.
Tooling and specialization: A single vendor’s model won’t cover every use case. Production workflows often require multiple specialized models or open-source workflows that expose every step, such as the ComfyUI pipelines used by independent VFX specialists (a minimal sketch follows this list).
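To make “expose every step” concrete, here is a minimal sketch of submitting a text-to-image graph to a locally running ComfyUI instance through its standard /prompt HTTP endpoint. The checkpoint name, prompts, and sampler settings are illustrative assumptions, not anything disclosed by either company:

```python
import json
import urllib.request

# A complete minimal text-to-image graph. Every step (checkpoint, prompts,
# latent, sampler, decode, save) is an explicit, inspectable node.
graph = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "studio_style_v1.safetensors"}},  # assumed name
    "2": {"class_type": "CLIPTextEncode",  # positive prompt
          "inputs": {"text": "night exterior, rain, neon signage", "clip": ["1", 1]}},
    "3": {"class_type": "CLIPTextEncode",  # negative prompt
          "inputs": {"text": "blurry, low detail", "clip": ["1", 1]}},
    "4": {"class_type": "EmptyLatentImage",
          "inputs": {"width": 1024, "height": 576, "batch_size": 1}},
    "5": {"class_type": "KSampler",
          "inputs": {"model": ["1", 0], "positive": ["2", 0], "negative": ["3", 0],
                     "latent_image": ["4", 0], "seed": 42, "steps": 20, "cfg": 7.0,
                     "sampler_name": "euler", "scheduler": "normal", "denoise": 1.0}},
    "6": {"class_type": "VAEDecode",
          "inputs": {"samples": ["5", 0], "vae": ["1", 2]}},
    "7": {"class_type": "SaveImage",
          "inputs": {"images": ["6", 0], "filename_prefix": "lookdev"}},
}

# ComfyUI listens on port 8188 by default.
req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": graph}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(req).read().decode())
```

Because every node and connection is explicit, a VFX team can inspect, version, and swap any stage of the pipeline, which is exactly the observability a closed vendor model doesn’t offer.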
Technically, high-quality image and video generation is generally powered by a foundation model trained on billions of images. From there, smaller, targeted fine-tunes (often called LoRAs, for low-rank adaptation) can steer the model toward a specific style using only a few dozen to a few hundred reference images. But that workflow assumes access to consistent, high-quality reference data and stakeholders willing to tolerate iteration.
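As a rough sketch of that fine-tuning step, here is how a LoRA adapter might be attached to a pretrained diffusion model’s UNet with Hugging Face diffusers and peft. The checkpoint, rank, learning rate, and the stand-in dataloader are illustrative assumptions, not anyone’s actual pipeline:

```python
import torch
from diffusers import StableDiffusionPipeline, DDPMScheduler
from peft import LoraConfig, get_peft_model

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Wrap only the UNet's attention projections in low-rank adapters;
# the billions-of-images foundation weights stay frozen.
lora_cfg = LoraConfig(r=8, lora_alpha=16,
                      target_modules=["to_q", "to_k", "to_v", "to_out.0"])
unet = get_peft_model(pipe.unet, lora_cfg)

scheduler = DDPMScheduler.from_pretrained("runwayml/stable-diffusion-v1-5",
                                          subfolder="scheduler")
optimizer = torch.optim.AdamW(unet.parameters(), lr=1e-4)

# Dummy stand-in for a real dataloader of pre-encoded style references:
# (latents from the pipeline's VAE, embeddings from its text encoder).
style_dataloader = [(torch.randn(1, 4, 64, 64), torch.randn(1, 77, 768))] * 8

for latents, text_embeds in style_dataloader:
    noise = torch.randn_like(latents)
    t = torch.randint(0, scheduler.config.num_train_timesteps, (latents.shape[0],))
    noisy = scheduler.add_noise(latents, noise, t)
    # Standard denoising objective: predict the noise that was added.
    pred = unet(noisy, t, encoder_hidden_states=text_embeds).sample
    loss = torch.nn.functional.mse_loss(pred, noise)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The scale mismatch is the point: the frozen foundation carries the general prior, while the trainable LoRA weights number only in the millions, which is why a few dozen consistent reference images can be enough to steer style.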
What studios should consider:
Start narrow: use AI to accelerate specific, bounded tasks — previs, concept art, poster mockups, or shot-specific VFX insertions — rather than attempting whole-feature generation.
Maintain transparent rights practices: clarify what footage is used for training and secure explicit permissions from talent and other rights holders.
Adopt hybrid workflows: combine a reliable vendor (or vendors) with in-house VFX knowledge so studio creative teams retain control and observability of the pipeline.
Runway’s tooling is powerful and pioneering — it’s paving paths for AI-assisted VFX — but the partnership shows how nontechnical constraints (contracts, creative expectations, and internal organization) can limit execution. For production leads, the smarter strategy is incremental adoption and clear guardrails rather than all-in proclamations.
Real VFX artists using AI + Nuke: practical demos and workflow lessons
Where the conversation gets hopeful is in hands-on experiments by experienced VFX supervisors like Freddy Chávez Olmos. He tackled three classic VFX problems — relighting and head rotation, replacing a stand-in with an animal, and de-aging a performer — using a hybrid stack: Wan 2.2-Animate, Nano Banana, Beeble, and Nuke.
Why these examples matter to working teams:
AI is practical when it’s focused: Instead of attempting to generate an entire film, these workflows targeted specific VFX tasks and integrated outputs into traditional compositing pipelines.
Nuke is still doing the heavy lifting: Compositing finesse — feathering, shadow matching, edge control, and color integration — remains essential. AI provides outputs that Nuke refines and stitches into a final plate.
Hybrid outputs can be high-quality: De-aging and animal replacements created this way can look convincing when lighting, shadow, and camera perspective are matched, and the examples shared hold up well enough for many broadcast or streaming use cases.
Typical hybrid workflow (as observed; a minimal Nuke Python sketch follows the list):
Capture reference motion and plates (actor performances, rigs, or mocap).
Use an AI generator for appearance generation — a young face, a photoreal animal, or a relit head pass.
Composite the AI-generated layers in Nuke to match the lens, grain, shadows, and reflections of the original plate.
Iterate: tweak inputs (lighting references, more frames) for better frame-to-frame consistency and hand off to color/finish.
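As a sketch of step 3, the wiring could look like this in Nuke’s Python API; the file paths and node settings are placeholders, not Chávez Olmos’s actual setup:

```python
import nuke

# Original plate and the AI-generated element (e.g., a de-aged face pass).
plate = nuke.nodes.Read(file="shots/sh010/plate.####.exr")
ai_layer = nuke.nodes.Read(file="shots/sh010/ai_face.####.exr")

# Match the AI render's exposure and color to the plate before merging.
grade = nuke.nodes.Grade()
grade.setInput(0, ai_layer)

# Regrain the clean synthetic element so it matches the plate's sensor noise.
grain = nuke.nodes.Grain()
grain.setInput(0, grade)

# Merge the treated element over the plate
# (input 0 = B/background, input 1 = A/foreground).
merge = nuke.nodes.Merge2(operation="over")
merge.setInput(0, plate)
merge.setInput(1, grain)

write = nuke.nodes.Write(file="shots/sh010/comp/sh010_comp.####.exr")
write.setInput(0, merge)
```

The grade, edge treatment, and regrain stages are where the compositing finesse described above actually happens; an AI layer rarely drops into a plate clean.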
These case studies point to two practical conclusions for production teams:
If a department wants immediate cost and time gains, identify narrow VFX tasks that are routine, high-volume, and well-bounded. AI can rapidly reduce manual labor there.
Invest in skilled compositors and VFX supervisors who understand both the technical and aesthetic trade-offs. AI outputs need human oversight to pass as “real” on screen.
What this all means for creators, studios, and VFX houses
The three stories together paint a pragmatic landscape: synthetic personas will proliferate, studio-vendor partnerships will collide with legal and creative reality, and skilled artists are already inventing hybrid workflows that leverage AI where it helps the most.
Actionable guidance for professionals:
For producers and showrunners: Start small with AI. Prioritize tasks where the ROI is measurable — previs, poster iterations, rotoscoping, background plate fixes — and maintain strict legal clearance processes.
For VFX supervisors and compositors: Learn one or two AI generation tools and keep sharpening Nuke skills. Compositing will remain the quality control gate.
For talent managers and agents: Treat synthetic personas as IP and brand projects, not direct substitutes for physical actors. Contracts, disclosure, and ethical guardrails matter.
For studios and executives: Don’t oversell the “one-click movie” narrative. Invest in pilot programs, measure outcomes, and build an observability layer around any AI tool used in production.
Final thought
The latest wave of AI tools is not a replacement for the rigors of filmmaking; it’s another set of instruments in a well-stocked kit. The companies that win will be the ones that thoughtfully combine creative judgment, legal clarity, and technical craftsmanship. When AI is used to solve precise creative problems — and when experienced artists own the final decisions — these tools will accelerate workflows and expand creative possibilities without abandoning the values that make cinematic storytelling compelling.