CAA Tells OpenAI: Pay the People Who Built Hollywood Before Your Bots Replace Them

The powerhouse agency pushes regulators and studios to guarantee consent, credit, and compensation as Sora’s viral video tool tests how far creators’ rights can bend before breaking.

The creative community’s most powerful broker stepped into the AI fray today, as Creative Artists Agency (CAA) warned that OpenAI’s newly launched video tool Sora threatens the rights and livelihoods of the artists it represents. Sora, released in the U.S. and Canada in September, lets anyone generate short videos from text prompts—highly polished clips that can mimic familiar characters, styles, and even the cinematic language of specific filmmakers. CAA’s message: unless credit, consent, and compensation are baked in from the jump, Sora could accelerate an extractive economy that treats creative work as raw material and creators as optional.

CAA represents a who’s who across film, TV, music, and sports, so its posture matters. The agency is in active talks with unions and policymakers to secure protections that travel with the content: not just watermarking and provenance labels, but licensing and revenue-share frameworks that ensure downstream use is traceable and payable. OpenAI has said content controls and revenue-sharing options are coming, but CAA highlights the gap between promise and enforcement—especially as studios test AI across pre-vis, localization, and marketing.

The risk isn’t theoretical. In the weeks around Sora’s launch, Hollywood crews cataloged instances of IP lookalikes and unlicensed stylistic borrowing—examples that might pass fuzzy legal tests but fail practical ones: fewer jobs for human craftspeople and less leverage for performers whose likeness can be simulated. CAA also nods to the credit problem. Generative clips absorb decades of technique—production design, lighting, choreography, score temping—yet return a flattened author field. For an ecosystem where credits are currency, a tool that collapses authorship presents a real earnings threat.

CAA isn’t calling for a ban. It’s calling for terms. In practice, that could look like standardized do-not-train registries for individual clients; compulsory attribution fields for works that resemble protected IP; and automated takedown pathways moving at algorithmic speed. There’s urgency to set baseline rates. If AI videos proliferate into advertising, music promos, and social placements—well before premium streaming—CAA wants enforceable floors so budgets don’t race to the bottom. Consider the scale: short-form digital ad spend in North America sits in the tens of billions annually; even a small substitution effect toward AI video could reroute millions away from crews, editors, colorists, and voice actors.

The agency’s framing may resonate with regulators who favor targeted guardrails over blunt bans. Sora will keep evolving; so will Hollywood’s use of it. CAA’s intent is to make sure the future isn’t written without the people who made the movies and shows audiences love—starting with consent, credit, and real money whenever their work or persona is in the loop.
