By Gonçalo Perdigão
Copyright Observer
The cost of making high-quality media is collapsing. The cost of getting anyone to care about it is not. As generative A.I. turns production into a near-commodity, cultural power is shifting from studios and galleries to the platforms that allocate attention and the algorithms that determine who gets paid. The new patrons are not moguls with checkbooks; they are recommendation systems tuned for engagement and brand safety.
Production is cheap; distribution is scarce
Video models now draft storyboards, generate shots and remix audio at consumer scale. Yet the money still follows distribution, not tools. On YouTube, the rules of the Partner Program, set and revised unilaterally, determine whether a creator receives 55 percent of watch-page ad revenue for long-form content and 45 percent for Shorts. Those headline rates are stable, but the platform’s enforcement posture has shifted: as of July 15, YouTube began tightening monetization rules against “inauthentic” or mass-produced A.I. content, a clarification aimed at the surge of spammy, low-effort videos. The message is clear: use A.I. to enhance originality, not to flood the feed.
The enforcement problem is real. “Cheapfake” celebrity clips—static images, synthetic narration and rage-bait scripts—have racked up views while confusing audiences. YouTube has removed channels and now requires disclosure labels for realistic synthetic media, but detection and policing remain uneven at scale.
Platforms are recoding payouts and power
Spotify’s 2024 royalty overhaul illustrates how platform rule-sets become policy for the creative middle class. Tracks now require at least 1,000 streams in 12 months to pay out; functional “noise” content is throttled; and labels face fees for detected artificial streaming. The goal is to redirect the pool away from bot farms and sub-cent trickles. The effect is a re-concentration of earnings at the head of the curve and a higher bar for the long tail. When platforms change the taps, whole genres feel the drought or the deluge.
TikTok’s détente with Universal Music in May 2024 underscored the same power dynamic in short-form video. After months of public sparring over royalties and A.I. clones, a new licensing deal restored UMG’s catalogue to the app, alongside language about improved remuneration and protections against generative knock-offs. When distribution is the choke point, even the largest rights-holders must negotiate on platform terms.
Data deals: the new studio lots
If attention is one axis of the new patronage, training data is the other. The most lucrative cultural contracts of the past year were not output commissions but input licences. OpenAI’s run of publisher agreements, including the Associated Press (archives), Axel Springer, the Financial Times and a multi-year global deal with News Corp, reportedly worth more than $250 million, signals a market price for premium corpora. A.I. labs are paying for access, and the beneficiaries are large, well-structured repositories of rights, not individual creators.
The legal battles surrounding image training demonstrate the unsettled state of the rules. Getty Images narrowed its U.K. lawsuit against Stability A.I. in June, dropping core copyright claims while pressing trademark-style arguments about reproduced watermarks. The pivot reflects the complexity of proving training-stage infringement across borders, as well as the industry’s search for more predictable routes to compensation.
Regulation is standardizing transparency and shifting risk
Rules are arriving, and they read like operating manuals for platformized culture. The E.U.’s A.I. Act phases in obligations for general-purpose models, with guidance for “systemic-risk” providers by 2025 and a Code of Practice outlining requirements for transparency, copyright diligence and safety. In effect, documenting training data, assessing model risks, publishing technical summaries and preparing for audits are all tasks that privilege firms and partners with strong compliance operations.
In the U.S., the Copyright Office’s multipart A.I. study is moving from theory to guidance. Part 2 (January 2025) addresses whether and when A.I.-assisted outputs can be copyrighted, while the pre-publication of Part 3 (May 2025) examines training and how to reconcile text-and-data mining with compensation. The old studio system set creative norms through collective bargaining; now, regulators and A.I. vendors are co-authoring the manual.
Unions are also imposing guardrails. The WGA’s 2023 deal barred studios from treating A.I.-generated material as “source material” and protected writers from being required to use A.I.; SAG-AFTRA’s agreements introduced consent and compensation for digital replicas, with similar provisions in music. These are not abstractions; they are hard-coded constraints on how platforms and producers can deploy synthetic labour.
Provenance becomes product
As synthetic media scales, provenance is turning into both a feature and a bargaining chip. TikTok has begun automatically labelling A.I. assets imported from tools that support C2PA Content Credentials. YouTube now requires creators to disclose realistic synthetic edits. Meanwhile, device makers are integrating C2PA into the capture pipeline, with Google’s Pixel 10 embedding credentials in its camera output. OpenAI, for its part, adds C2PA metadata to DALL-E images. Attribution is becoming clickable.
The provenance layer will not solve misinformation alone. Metadata can be stripped, and enforcement lags, but it rewires incentives. Platforms can boost authentic, labelled media in feeds, penalize evasions and share “credibility signals” with advertisers. That is algorithmic patronage by another name.
What shifts next
Studios and galleries will increasingly resemble platforms. Owning release windows is no longer enough. Expect investments in first-party audiences, data clean rooms and rights bundles that can be licensed to model providers. Historic advantages, taste and talent pipelines chief among them, must be coupled with distribution levers and data assets. Deals will include not just streaming residuals but “model-weight” royalties and retraining rights, mirroring the structure of today’s publisher licences.
Creators will face algorithmic wage setting. Eligibility thresholds (1,000 Spotify streams), demonetization triggers (unoriginal Shorts), disclosure requirements (synthetic media labels) and fraud-detection fees are becoming the effective tax code of digital culture. The prudent strategy is to diversify revenue across ads, direct fan funding and commerce, and to instrument provenance by default to stay on the right side of both algorithms and regulators.
Policy, too, will reward those who can comply. The E.U. framework, the U.S. copyright study and union clauses collectively nudge the market toward licensed inputs, documented outputs and consent-based replication. Those rules advantage larger catalogues and well-capitalized intermediaries. For independent creators, collective licensing pools and guild-run registries may offer the path to negotiating power.
The arts have seen patronage shift before, from courts to salons to galleries and museums. This time, the median patron is a ranking function. Where culture is made matters less than where it is surfaced, metered and paid. Those who understand the incentives embedded in platform policy, and can prove provenance at the speed of the feed, will capture the surplus. Everyone else will be producing to spec for someone else’s algorithm.