Hollywood's AI Licensing War: The Billion-Dollar Deals, Strikes, and Lawsuits Reshaping Who Owns What
In the span of twelve months, Hollywood went from treating AI as an existential enemy to signing billion-dollar deals with the companies building it. And somehow, nobody seems happy about it.
Disney invested $1 billion in OpenAI and licensed 200 characters to Sora. Warner Music settled its lawsuit with Suno and signed a licensing deal. Studios are quietly using AI in postproduction while publicly reassuring unions they'll protect jobs. Meanwhile, SAG-AFTRA's contract expires on June 30, 2026, the WGA's on May 1, and everyone involved knows AI will dominate both negotiations.
This isn't just a Hollywood story. The licensing frameworks being built right now — in courtrooms, union halls, and corporate boardrooms — will define how AI content works for everyone. If you're a creator using AI tools in any capacity, what's happening in entertainment is a preview of the rules coming to your industry.
The Disney-OpenAI Deal: A Blueprint, Not Just a Partnership
When Disney announced its $1 billion investment in OpenAI in December 2025, the reaction split cleanly in two. Wall Street saw validation. Hollywood saw betrayal.
The deal's structure is worth understanding because it will likely become the template for every major IP holder negotiating with AI companies. Disney licensed over 200 characters from its Disney, Pixar, Marvel, and Star Wars catalogs to Sora for a three-year term. Sora users can generate short videos featuring those characters — costumes, props, vehicles, environments, animated forms. ChatGPT can generate still images from the same library.
But the deal has sharp boundaries. No human likenesses. No actor faces. No voices. Disney deliberately stays away from the SAG-AFTRA minefield, at least for now. You can generate Darth Vader's helmet, but not the face of any actor who's played the character. Iron Man's suit, but not Robert Downey Jr. This creates a legal distinction that didn't exist before: character IP separated from performer attributes.
The hypocrisy wasn't lost on anyone. Just one day before announcing the OpenAI deal, Disney sent Google a cease-and-desist letter alleging massive copyright infringement from AI training. Disney has also sent similar letters to Meta and Character.AI, and filed litigation against Midjourney. The strategy is now explicit: license to partners who pay, sue those who train without permission.
For smaller AI companies, the implications are brutal. If the price of using recognizable IP legally is a billion-dollar equity deal, then Midjourney, Runway, Pika, and every other startup in the space is operating in a fundamentally different risk category than OpenAI. The era of training on everything and asking forgiveness later appears to be ending — but only for those who can't afford to write the check.
The SAG-AFTRA Showdown
Negotiations between SAG-AFTRA and the studios' bargaining group, the AMPTP, began on February 9, 2026. The current contract doesn't expire until June 30, but the stakes are already clear.
The 2023 strike lasted 118 days and resulted in a contract valued at $1.1 billion that included what were considered strong AI protections at the time. Those protections look incomplete two years later.
The core issue hasn't changed: studios want the flexibility to use AI in production; performers want to ensure AI doesn't replace them or use their likenesses without consent and compensation. But the specifics have evolved dramatically.
The biggest new proposal on the table is what's been called the "Tilly Tax" — a requirement that studios pay the union a royalty whenever they use a fully synthetic (AI-generated) actor in a production. The logic is elegant: if using a fake actor costs the same as (or more than) hiring a real one, studios will almost always choose the human. It's an economic deterrent designed to make AI replacement unattractive, rather than trying to ban it outright.
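The deterrent logic is easy to model. As a minimal sketch — with entirely hypothetical day rates and a hypothetical royalty percentage, since no actual figures have been made public — a union royalty on synthetic performers works by erasing the cost advantage of the AI option:

```python
# Hypothetical illustration of the "Tilly Tax" deterrent logic.
# All dollar figures and the royalty rate are invented for
# demonstration; no real rates have been proposed or disclosed.

def effective_cost(base_cost: float, synthetic: bool, royalty_rate: float) -> float:
    """Total cost of casting a role. A royalty (paid to the union)
    is added only when the performer is fully AI-generated."""
    return base_cost * (1 + royalty_rate) if synthetic else base_cost

human_day_rate = 1_200      # hypothetical human performer day rate
synthetic_day_rate = 300    # hypothetical cost of a synthetic performer
royalty = 3.0               # hypothetical 300% royalty on synthetic use

print(effective_cost(human_day_rate, synthetic=False, royalty_rate=royalty))    # 1200
print(effective_cost(synthetic_day_rate, synthetic=True, royalty_rate=royalty)) # 1200.0
```

Under these made-up numbers, the synthetic performer's apparent 75% discount disappears once the royalty is applied — which is the whole point of a deterrent priced to match, rather than ban, the technology.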
Beyond the Tilly Tax, SAG-AFTRA is pushing for transparency requirements around AI training data, stronger consent frameworks for digital replicas, and protections against the use of performers' likenesses in AI-generated content across platforms — including tools like Sora.
The union has a new president, Sean Astin, and the AMPTP has a new lead negotiator, Greg Hessinger, who spent years on the union side. Both sides are signaling a desire to avoid another prolonged strike. But the distance between their positions on AI may be wider than it was in 2023, because the technology has advanced so much faster than anyone expected.
The Job Losses Are Already Here
The debate about whether AI will displace Hollywood jobs is already academic. It's happening.
Los Angeles County has lost 41,000 film and TV jobs in three years — a quarter of its entertainment workforce. Writing gigs fell 42% from 2023 to 2024. A study commissioned by the Concept Art Association and the Animation Guild surveyed 300 industry leaders and found that three-quarters said AI tools had already contributed to the elimination, reduction, or consolidation of jobs at their companies. The study estimated that roughly 204,000 positions will be adversely affected over the next three years.
The jobs hit hardest aren't the ones that make headlines. It's not A-list actors or showrunners — it's concept artists, 3D modelers, sound editors, compositors, background performers, and entry-level positions across postproduction. The people who built the visual effects in your favorite Marvel movie. The artists who designed the characters before a single frame was animated.
DreamWorks co-founder Jeffrey Katzenberg has publicly stated that AI will replace 90% of jobs on animated films. Whether that number is accurate or aspirational, it reflects how studio executives are thinking about the economics of production.
This is the tension at the heart of Hollywood's AI reckoning: the people making the deals aren't the people losing the jobs. Disney's $1 billion flows to OpenAI, not to the concept artists whose roles are being consolidated. Warner Music's licensing deal with Suno settles a lawsuit, but it doesn't create new work for session musicians.
The Seedance Moment
If Disney-OpenAI represented the "legitimate" path forward for AI content, Seedance 2.0 represented the alternative.
When ByteDance launched Seedance in February 2026, users immediately generated videos of Spider-Man, Darth Vader, and a widely shared clip of Tom Cruise and Brad Pitt fighting over a fictional Jeffrey Epstein plot. The video spread across social media faster than any AI demo in history. It also prompted immediate legal action from Disney, Netflix, Paramount, Sony, and the MPA.
ByteDance promised guardrails. But the damage was already done — the underlying models had already been trained on the content, and no filter can perfectly prevent a neural network from reproducing what it's learned. The incident crystallized the industry's fundamental problem: you can't un-train a model.
For creators, Seedance is a cautionary tale. Even though commercial licenses technically exist on Seedance's Pro plans, the legal exposure from using content generated by a model under active litigation from every major studio is enormous. The platform's technical capabilities are impressive. Its licensing safety is close to zero.
The Music Industry Got There First
While Hollywood debates, the music industry has already been through its version of this fight — and the outcomes offer a roadmap.
Universal, Warner, and Sony all sued Suno and Udio for training on copyrighted music without licenses. Both companies settled. But the settlements went in opposite directions.
Udio pivoted entirely. It became a "walled garden" — a fan engagement platform where users remix licensed music, but nothing leaves the platform. It's no longer a music creation tool in any meaningful sense.
Suno kept its core product intact but agreed to license training data going forward and require paid subscriptions for downloads. The trade-off: Suno users can still create and distribute music commercially, but the ownership language shifted. Suno now technically retains authorship while granting users a perpetual commercial license. You can make money with it, but you don't "own" it the way you used to.
This pattern — settle the lawsuit, sign a licensing deal, change the terms — is almost certainly what we'll see play out in video over the next 12-24 months. The studios suing Runway and the MPA going after Seedance aren't trying to kill AI video. They're establishing the leverage to negotiate licensing on their own terms.
What the Audience Actually Thinks
Here's the piece that gets less attention than the boardroom deals: audiences aren't sold yet.
A post-Super Bowl 2026 survey of Gen Z and Gen Alpha consumers found that AI-themed advertising during the game "missed big time." Respondents reacted negatively to ads with visible AI messaging. Director Daniel Kwan, who helped launch the Creators Coalition on AI, told a Sundance panel that the assumption that AI content is inevitable is wrong — and that filmmakers shouldn't let the tech industry set the terms for their industry.
Guillermo del Toro said he'd rather die than use AI in his films. Rian Johnson called the technology something that's "making everything worse in every single way." At the Marrakech Film Festival, directors Bong Joon Ho and Celine Song both spoke out against AI use in filmmaking.
This matters for licensing because audience sentiment creates market risk. If consumers associate AI-generated content with low quality or ethical compromise, then brands using AI content face reputational exposure that no licensing agreement can solve. The legal right to use something commercially doesn't guarantee that customers will respond well to it.
What This Means for Creators Outside Hollywood
If you're not in the entertainment industry, you might wonder why any of this matters to you. It matters because Hollywood is where AI licensing frameworks get stress-tested at the highest stakes.
The consent frameworks being negotiated by SAG-AFTRA will influence every platform's terms. When the union establishes rules about digital replicas and synthetic performers, those standards will ripple into how ElevenLabs handles voice cloning, how Midjourney handles likeness generation, and how every AI platform thinks about the humans whose work trained their models.
The Disney licensing model will be copied. If character IP can be separated from performer attributes and licensed to AI platforms, expect every major brand, sports league, and media company to explore similar deals. The licensing infrastructure being built right now will extend far beyond entertainment.
The "Tilly Tax" concept could travel. If unions can establish that AI-generated content should cost the same as human-created content, that principle could affect pricing and licensing across creative industries — from stock photography to music libraries to voice-over services.
The copyright question is getting closer to resolution. Every lawsuit, settlement, and legislative proposal brings more clarity to the question of whether AI outputs can be copyrighted and who's liable when AI generates infringing content. What's settled in Hollywood will shape how copyright offices and courts treat AI content globally.
Where It's All Heading
The outlines of the new system are becoming visible:
Licensed training data becomes the norm. The era of training on everything without permission is ending for well-funded companies. Smaller companies may continue to operate in gray areas, but they'll face increasing legal risk.
Tiered commercial rights, tied to plan level and use case. The free-for-all is over. Commercial use requires paid plans, and increasingly, specific use cases (advertising vs. personal vs. editorial) will have different license terms.
Mandatory AI disclosure. Streaming platforms already require it. The EU AI Act mandates it for certain content types. Transparency requirements will only expand.
Human creative input as a legal and market differentiator. Adding substantial human creativity to AI outputs isn't just a copyright strategy — it's becoming a quality signal that audiences, clients, and platforms will reward.
Consolidation. Smaller AI companies that can't afford billion-dollar licensing deals will either get acquired, pivot to niches, or face litigation they can't survive. The market is moving toward a handful of platforms with legitimate IP partnerships.
Staying Informed
The Hollywood licensing landscape changes weekly. New lawsuits, settlements, and deals reshape what's possible and what's legal for creators using AI tools. We track the licensing terms for every major AI platform — images, video, music, and voice — and update our coverage as terms change.
For platform-specific guidance, explore our searchable licensing guide or dive into our detailed posts on Sora licensing, AI video licensing, and commercial use across all AI platforms.