CAA Tells OpenAI: Pay the People Who Built Hollywood Before Your Bots Replace Them
The powerhouse agency pushes regulators and studios to guarantee consent, credit, and compensation as Sora’s viral video tool tests how far creators’ rights can bend before breaking.
Creative Artists Agency has entered the AI debate with unmistakable force, warning that OpenAI’s video-generation tool Sora threatens the rights and livelihoods of the artists it represents. In a statement shared with clients and industry partners, CAA urged regulators, unions, and studios to adopt enforceable rules around credit, consent, and compensation before generative video becomes a standard part of production and marketing pipelines.
Sora, which launched in the U.S. and Canada in September, allows users to create short, cinematic clips from text prompts—complete with lighting, camera movement, and stylistic mimicry of known filmmakers. The tool’s speed and polish have impressed technologists and alarmed creatives in equal measure. CAA’s concern is clear: without upfront licensing frameworks and attribution systems, generative video risks normalizing an extractive economy that flattens authorship and funnels revenue away from the human talent that built Hollywood’s ecosystem.
Because CAA represents thousands of artists across film, television, music, and sports, its stance carries unusual weight. The agency is in discussions with both unions and policymakers to secure protections that “travel with the work,” including watermarking, provenance metadata, and standardized revenue-sharing protocols for outputs that rely on copyrighted material or recognizable likenesses. OpenAI has promised new content controls and opt-out mechanisms for rightsholders, but CAA’s statement underscores a widening gap between technological progress and contractual enforcement.
The threat is no longer hypothetical. In the weeks surrounding Sora’s launch, creators flagged AI-generated videos that borrowed heavily from existing intellectual property—shots echoing blockbuster franchises or mimicking performers without consent. These incidents may skirt legal definitions of infringement but still undercut professional labor. For actors, cinematographers, and VFX artists, the danger lies in an invisible substitution effect: studios and brands replacing teams of craftspeople with a few well-tuned prompts.
CAA isn’t calling for a ban, but for terms—a defined set of economic and ethical baselines. Those could include do-not-train registries for individual clients, mandatory attribution for AI-generated works resembling protected IP, and automated takedown systems capable of operating at the same speed as the models themselves. The agency is also advocating for minimum pay scales before AI video floods the short-form market. North American digital ad spending exceeds $70 billion annually; even a small pivot toward synthetic production could reroute millions from human editors, voice artists, and colorists.
The agency’s argument aligns with that of regulators exploring targeted oversight rather than blanket restrictions. CAA’s message to lawmakers is pragmatic: innovation isn’t the enemy, but creators must remain participants in the economy their data and likenesses help fuel. “The future of storytelling can’t be written without the people who built it,” one CAA executive said privately.
For now, Sora remains both a breakthrough and a flashpoint—a technology capable of democratizing filmmaking or displacing the very crafts that define it. CAA’s intervention signals that Hollywood’s most influential broker intends to make sure the balance tilts toward consent, credit, and compensation, not replacement.