Background animators become lead animators. First-year lawyers become partners. Production assistants become producers. But if AI eliminates those entry-level jobs, where do the professionals come from?
That question anchored a panel at the American Film Market titled “Visualizing the Future of Commercial Content: What AI Means for the Industry.” Studio executives, producers, and technology leaders addressed not just what AI can do, but what the industry loses when it does it.
The panel was moderated by Nelson Granados, executive director of the Institute for Entertainment, Media, and Sports at Pepperdine University. Panelists included Scott Martin (Aspen IP Consulting, former deputy general counsel at Paramount), Darren Frankel (head of film and television at Adobe), Lori McCreary (CEO of Revelations Entertainment), and Ted Schilowitz (futurist at Cinemmersion Inc.).

Key takeaways from the discussion:
Rogue AI poses bigger threats than production AI: Deepfakes and unauthorized use of talent likenesses come from untraceable sources with no one to negotiate with, unlike legitimate production uses where there's "someone to sit across the table from."
Cost transformation is exponential: Crowd scenes drop from $500,000 for four days to $3,000. Aerial shots fall from $2,000 per hour to hundreds of dollars. The question is whether savings benefit producers or undermine the industry's viability.
The training ladder disappears: AI eliminates background animation, first-year lawyer work, and production assistant tasks that traditionally created the pathway to expertise. Where do lead animators come from if no one does background animation anymore?
Producers face a copyright trap: Pure AI output cannot be copyrighted without "significant human creative intervention." But AI companies only indemnify unaltered output. Producers lose either copyright protection or legal coverage.
59 lawsuits target training, not output: Nearly all pending cases focus on unauthorized use of copyrighted material for AI training. The Anthropic settlement at 1% of company valuation signals systemic undervaluation of content.
Draw the line at ethics, not AI: The industry should separate ethical uses from unregulated "smash-and-grab" operations, not AI from non-AI. Without protecting intellectual property, the industry risks commoditization.
Rogue AI vs Legitimate Uses
Scott Martin, who spent 33 years at Paramount and worked on the three-person team negotiating AI issues with unions during the strikes, divided AI concerns into two categories.
"I think about two buckets," Martin said. "One bucket is what I think of as rogue AI: deepfakes, fake porn, fake political speech. And then the other bucket is legitimate, whether it's licensed, whether it's infringing or not, but things that are mainstream."
The real threat to talent is from rogue AI. It's the misuse of their image. And it's the hardest thing to go after because there's no 'they' there. It's somebody in their basement. It's somebody in Russia or Eastern Europe.
With legitimate production uses, there's someone to negotiate with. Martin detailed non-generative uses already in production: color correction, blur effects, audio track cleanup, visual cleanup, preservation, and restoration. In distribution, AI assists with rating classification, localization (determining what changes are needed for different markets), content labeling, asset management, and dubbing.
Generative uses include de-aging and aging performers, voice replication for deceased actors (James Earl Jones and Michael Caine have both made such deals), stunt double integration, and combining live action with CGI.
The Cost Transformation
Ted Schilowitz, who served as futurist at both Paramount and 20th Century Fox, detailed the exponential cost reductions AI enables.
"If you have a budget set for a movie and you have four giant crowd scenes," Schilowitz said, traditional costs were significant. "You have to bring all these extras in. You've got to wardrobe them up." Computer-generated synthetic crowds helped but remained expensive and exotic.
AI can plug that in and say, “I need to make a Civil War scene, and I used to have to spend $500,000 for four days to do that. Now I'm going to spend $3,000 to do that.” Exponentially different in cost.
Aerial shots followed a similar pattern. Real helicopters with pilots and operators cost $2,000 per hour. Drones reduced that by 78%. Now AI can generate aerial shots for hundreds of dollars or less. "I need an aerial shot of the Greek islands, or I need a shot for a murder mystery where I have to fly the camera over this wooded area and see this old rusty pickup truck," Schilowitz explained. "That would be real cost. Now I can plug that into an AI."
He emphasized both sides of the equation: "That is potentially to the benefit of the producers, but also to the risk of the actual viability of our business at a high level."
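As a back-of-envelope check, the figures Schilowitz quoted work out as follows. The percentages below are derived from the panel's numbers, not stated on the panel, and the AI aerial-shot figure ("hundreds of dollars") is his ballpark rather than a quoted rate:

```python
# Rough arithmetic on the cost figures quoted on the panel.
crowd_traditional = 500_000   # four days of extras, per Schilowitz
crowd_ai = 3_000              # his AI-generated estimate

helicopter_per_hour = 2_000                         # crewed helicopter rate
drone_per_hour = helicopter_per_hour * (1 - 0.78)   # 78% reduction cited

crowd_cut = 1 - crowd_ai / crowd_traditional
print(f"Crowd scene cost reduction: {crowd_cut:.1%}")   # 99.4%
print(f"Drone rate: ${drone_per_hour:,.0f}/hour")       # $440/hour
```

Even before AI enters the picture, the drone transition alone cut aerial costs by more than three quarters; the AI figures compress the remainder by another order of magnitude.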
How Do We Train the Next Generation?
Darren Frankel, head of film and television at Adobe, raised what he called the training ladder problem.
What AI is really good at is the more basic work. If you think about animation, you can use AI to do the background animation, but you're always going to need the lead animator. Well, where does the lead animator come from? They come from learning by doing the background animation.
The pattern extends across the industry. First-year lawyers develop expertise by summarizing depositions, work that AI tools now perform better than humans. Junior animators learn by working on backgrounds. Production assistants gain skills through entry-level tasks.
"If you don't have somebody doing that work, where does the third-year lawyer come from?" Frankel asked. "Where does the level of sophistication and nuance come from when we've taken away that learning curve and that training curve and just asked for an efficiency model, not a training and mentorship model?"
Lori McCreary, CEO of Revelations Entertainment and Morgan Freeman's business partner for 30 years, agreed. "I can't imagine not having had hours and hours sitting in an editing room watching the editor make changes. That's how we learn how things work and how we get better at what we do."
Frankel noted that the film loader position evolved into DIT (digital imaging technician), but questioned what happens when AI eliminates even those transitional roles. "Our industry is built on internships, working your way up. But if you take away the first two rungs of the ladder, how do you work your way up?"
Ted Schilowitz offered a different perspective. A creator in their basement can now accomplish everything that previously required raising money, hiring crews, and managing logistics by "prompting it into a computer and seeing what comes out the other side and then manipulating that and tweaking that."
"There's so many voices out there that might not otherwise have that chance," Schilowitz said. "I think we need to make room for that as well as figure out how to transition our industry."
The Producer's Copyright Conundrum
Lori McCreary detailed a legal problem that affects every producer using generative AI. US copyright law requires "significant human creative intervention" for protection. The term "significant" remains undefined.
Meanwhile, AI companies will indemnify output from legal challenges, but only if it's not altered.
The producer is in a bit of a bind where if we change it, we're not going to be indemnified by the AI company. And if we don't change it, we cannot copyright it and therefore put it into a commercial product.
The burden extends beyond the output. "As a producer, I never used to have to care if a writer sent me a script, how they wrote it," McCreary said. "Now I have to ask what tools they used. And if they used AI, I have to know what that AI was trained on, do they own the output of that AI, and where even the data for that AI company is stored."
Data storage location determines which laws apply, regardless of where the software is used or where the company is headquartered. McCreary called it "a digital chain of title, not just for the script, but for every little piece of creative content in a film."
The Producers Guild published a guide titled "Fine Print of AI: 10 Questions to Ask Yourself if You're Using AI in a Commercial Product," available on its website.
The Legal Reality: 59 Lawsuits
Scott Martin detailed the litigation wave. As of the panel, 59 lawsuits were pending against AI platforms in the US. Only about 10 focus on output being infringing. The rest target input: the unauthorized use of copyrighted material for training.
The recent Anthropic settlement illustrates the valuation problem. The $1.5 billion settlement represents only 1% of Anthropic's valuation. Per-book compensation netted approximately $3,000.
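The implied scale of those figures, worked out as simple arithmetic from the numbers cited on the panel (these derived totals are back-of-envelope, not stated by Martin):

```python
# Back-of-envelope arithmetic on the Anthropic settlement figures.
settlement = 1_500_000_000   # $1.5 billion settlement
per_book = 3_000             # approximate per-book compensation cited

implied_valuation = settlement / 0.01   # settlement described as ~1% of valuation
books_covered = settlement / per_book   # implied number of works covered

print(f"Implied company valuation: ${implied_valuation / 1e9:.0f}B")  # $150B
print(f"Works covered at $3,000 each: {books_covered:,.0f}")          # 500,000
```

In other words, the settlement math implies roughly half a million works valued, collectively, at about one percent of the company built partly on them, which is the undervaluation Martin objects to.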
Martin said the message to rights holders is that "the thing that they most need, the most valuable thing, all of your content, is worth 1% of the valuation." Many rights holders are opting out of the class action to sue independently, even knowing they may not recover more than the $750 minimum statutory damages, treating the opt-out as a protest.
The Anthropic case hinged on the sourcing of books from a pirated website. Martin called the judge's ruling, which found that using copyrighted material for training is fair use, "a disaster."
Martin emphasized global considerations. "The EU regulates and the US litigates. There are only a very small handful of cases in Europe, but they're regulating and legislating all of this. That can have an impact when you've got your product out there and you're trying to distribute it globally."
Regarding copyright registration, producers must now disclose AI elements, similar to disclaiming third-party works or public domain material. "You need to know whether you're going to have to disclose it, whether it's going to be protectable," Martin said. "You have to know whether it is infringing or original to them."
Draw the Line at Ethics, Not Technology
Darren Frankel argued that the industry focuses on the wrong division.
Too many people are drawing the line in the wrong place. They're drawing the line between AI and not AI. The place to draw the line is between ethical and unregulated.
The industry is built on intellectual property. Without protection, Frankel warned, "none of us will have jobs. There will be no industry. It will be commoditized." He cited the newspaper industry's commoditization 25 years ago as a cautionary example.
Unregulated AI platforms operate as "smash-and-grab" operations that don't pay for what they take. "These tools are not inherently good or evil, but they can be used for not good," Frankel said. "Remember, this is an industry that hopefully we all care passionately about. We need to protect it or it won't be here for us."
He warned the audience to watch for a specific phrase in production discussions: "good enough." "There will be a lot of people who go, 'Oh, it lacks this. It lacks that. It's not as good as whatever.' There will be people who are looking at the bottom line who will go, 'Oh, it's good enough.' Keep your ears open for that line."
Lori McCreary reframed the challenge: "Hollywood doesn't need to fight AI. We need to teach AI and the AI companies what integrity is."
What Must Stay Human
The panelists agreed that storytelling must remain human-driven, even as AI handles efficiency and technical tasks.
Darren Frankel put it directly:
There is no art without humanity. You still need humanity in it. AI can create an efficiency. It can solve for lots of problems. But nobody remembers your budget. You remember a good story. And stories without humanity are just computational.
Lori McCreary advocated for "authentic intelligence" instead of artificial intelligence. "If you put the human into AI, into that process, into the workflow, then it becomes authentic."
She noted that after 20 years of working with various digital versions of Morgan Freeman, current AI voice technology still lacks something essential. "You cannot direct an AI voice. You can hit knobs, but you cannot direct the humanity into a performance or a voice right now."
McCreary emphasized the industry's opportunity during transitions. "When we're in the transition, we can help write the rules, and we don't have to wait for somebody else to write the rules for us."
Ted Schilowitz warned about the direction AI platforms are pushing creators. "What these technology companies and their products are trying to do is force you down a pathway to be lazy, to not work as hard, to choose the path of efficiency, to choose the path of cost savings rather than the path of artistry and leveling up."
Their goal is to try and get you to level down. They want you to be part of the soup. They don't want you to be the most delicious ingredient of the soup.
His advice: "You need to figure out how to harness it before it figures out how to harness you. And it is getting really good at figuring out how to harness you."
What Comes Next
The panelists emphasized education as essential for navigating AI adoption. Scott Martin called for education "both of students, but of everyone about what are these technologies, what are the uses, and not racing to embrace it just because it's the cool new technology."
He cited examples of schools banning cell phones and finding students actually talked to each other more, and professors banning laptops after discovering students were taking dictation rather than processing information.
Lori McCreary urged the industry to act proactively. "We really can't wait for Washington or Silicon Valley to regulate for us. If we wait for that, it's too late." She suggested asking AI companies for attribution: revealing where output comes from and paying for what inspired it.
Darren Frankel noted that trying to hold back technology never works. He pointed to the money being invested: Amazon, Google, Meta, and Microsoft are spending an estimated $364 billion on AI in 2025. "Think of the film and television industry and the size of that. We're not going to hold back that ocean. We have to try to ride that T-Rex or whatever creature it is."
His warning about AI tools: "We do have an AI issue, but I would say the AI problem is really an ownership problem."
McCreary closed with a reminder of what's at stake. "AI can never replace the magic of storytelling. It reminds us that we have to define what makes us truly human. I feel like it can help us make more inclusive and more efficient art. But if we get it wrong, we risk losing the very soul of what it is to be creative. AI won't steal Hollywood's soul unless we let it."


