Digital Sharecroppers Cannot Be Saved by Better Landlords
AI content theft is not a side effect of the platform economy — it is the platform economy. The only defence is ownership of the audience relationship itself.
Somewhere right now, someone is asleep in front of a YouTube video they will never finish. The video is four hours long, narrated in a breathy ASMR whisper — engineered not for comprehension but for unconsciousness. An ad plays. The sleeper does not skip it. Another ad plays. The sleeper does not skip that one either. This is the business model.
The video is stolen. Its script was lifted from an eleven-minute video made by a creator who spent weeks researching and recording it. An AI pipeline ingested the script, padded it with filler through a large language model, generated the whispered narration, and uploaded the result to a channel that accrued 650 subscribers in three months — faster than most human creators ever manage. The original creator’s work is now an input in someone else’s revenue stream. The platform takes its cut regardless. This is not a content moderation failure that better enforcement will fix. It is the platform economy working as designed, and the only creators who will survive are the ones who stop farming someone else’s land.
The slop economy is not a side effect
The scale is no longer anecdotal. Research by the video platform Kapwing in late 2025 examined 15,000 of YouTube’s most popular channels and found 278 producing nothing but AI-generated content.1 Those channels have accumulated roughly 63 billion views and an estimated $117 million in annual ad revenue. According to the same research, one in five videos recommended to a fresh YouTube account is entirely AI-generated, and another third qualifies as what the industry calls “brainrot” — generic, algorithmically optimised filler. YouTube made $36.1 billion in ad revenue in 2024, up nearly fifteen percent year on year.2 The slop economy is not a drag on that business. It is a component of it.
YouTube has begun responding. In early 2026, it removed or wiped sixteen AI-driven channels with a combined 35 million subscribers and 4.7 billion views.3 Two channels that mass-produced fake AI movie trailers — splicing generated footage with copyrighted material to bury real trailers in search results — were terminated after a press investigation. YouTube’s CEO, Neal Mohan, announced plans to curb low-quality AI content in his 2026 priorities letter.4 The crackdown is real.
Enforcement cannot outrun the incentive
The crackdown is also structurally inadequate. The same CEO has called generative AI “the biggest game-changer for YouTube since the original revelation that ordinary folk wanted to watch each other’s videos.” YouTube is simultaneously encouraging creators to adopt AI tools, building features for AI-generated likenesses, and positioning artificial intelligence as the platform’s future. The platform needs volume to sell ads against; AI produces volume. The platform needs quality to maintain advertiser confidence; AI degrades quality. These imperatives cannot be reconciled through moderation policy. You cannot enforce your way out of an incentive structure that rewards the behaviour you are trying to prevent.
The sleep-ad scheme is worth understanding because it reveals the system’s logic with particular clarity. Longer YouTube videos carry more ad slots, and each ad pays more when it runs to completion. The ideal viewer, from a revenue standpoint, is one who never skips. A sleeping viewer never skips. ASMR content is designed to induce sleep. Combine this with niche communities whose creators speak in calm, measured tones — video game lore, legal commentary, history — and you get a business optimised for unconscious consumption. The AI does not need to produce good content. It needs to produce long, soporific content that keeps a browser tab open while ads play into the dark. YouTube knows this is happening. In 2024, it began testing a sleep timer that pauses playback after a set period — an implicit admission that its ad system was collecting revenue from unconscious viewers.5
The creator at the centre of this example — who goes by the name Scum Mage and produces lore videos about the game Elden Ring — discovered the theft when a viewer flagged it. His eleven-minute video had been copied beat for beat, padded to four hours, and delivered in the characteristic whisper. He also found his voice had been cloned to narrate stolen content on Bilibili, the Chinese video platform, in Mandarin. Other creators in his niche had been similarly targeted. He found the same pattern applied to a legal commentator whose consistent tone and facial expressions made him easy to clone, and whose topical content translated readily across languages.6 The theft is both domestic — scripts scraped, elongated, re-uploaded within the same platform — and international, with voice cloning and translation enabling cross-border content laundering at industrial speed.
Content theft is not new. Piracy predates the printing press. But scale changes the nature of the problem. A human bootlegger rips one video. An AI pipeline can process an entire channel’s back catalogue in an afternoon, transform the output to evade detection, translate it into a dozen languages, and have it earning ad revenue by evening. The fake-news entrepreneurs of Veles, Macedonia, demonstrated in 2016 what happens when ad-funded distribution meets amoral enterprise at human speed — teenagers built fake American political news sites that earned enough to buy houses, funded by Facebook ad clicks.7 AI collapses the cost of that arbitrage toward zero. India’s top AI slop channel has passed 2.4 billion views, in a country where average annual income is about $2,400.8 The economic incentive to steal, process, and redistribute now overwhelms any moderation regime.
The sharecropper’s bargain
In 2006, the writer Nicholas Carr coined “digital sharecropping” to describe what he saw as Web 2.0’s defining economic feature: “the distribution of production into the hands of the many and the concentration of the economic rewards into the hands of the few.”9 The metaphor drew on the post-Civil War American South, where freed slaves and poor whites worked land they did not own and received a fraction of the value they produced. The owners collected the rent. The sharecroppers were free in theory and dependent in practice.
Nearly twenty years later, the metaphor has sharpened. YouTube has 115 million channels. About three million are monetised. YouTube takes roughly 45 percent of ad revenue before any creator sees a cent.10 The platform owns the distribution, controls the algorithm, sets the revenue split, and can change any of those terms at will. Creators who build audiences on YouTube do not own those audiences — they rent access to them, and the rent is denominated in algorithmic favour that can be withdrawn without notice. This was tolerable when human creators were the only ones producing content. It becomes untenable when the landlord’s machine can do the same work a thousand times faster, and the landlord has no structural reason to prefer human hands.
Direct audience relationships — behind a paywall, an email list, a moderated community — are the only durable foundation for creative work when AI can replicate anything on the open internet. But acknowledging this means telling people that the economy they were promised does not work, and they should build a different one. Most of YouTube’s 69 million creators will never build a subscription audience. Many did everything right within the system they were given. The system was the problem.
The hardest counterargument is simpler: consumers do not care. Television did not get worse because people stopped watching; it got worse because they kept watching. The lowest common denominator has always prevailed in ad-supported media because the economics select for it. YouTube’s algorithm does not optimise for quality. It optimises for engagement, and AI slop is engagement-optimised content with the humanity removed. It is the logical terminus of the ad-supported model, not an aberration from it.
Own the building
The question, then, is not how to fix the creator economy. It is what replaces it.
The answer is ownership — of infrastructure, of the audience relationship, of the building itself. Archive of Our Own, the nonprofit fanfiction platform, now hosts over 16.5 million works with more than 10 million registered users. It runs on volunteer labour and donations. It carries no advertising and has no algorithmic feed. It was incorporated as a nonprofit so it could never be sold. One of its founders noted that a for-profit version would have attracted acquisition offers by 2011; they chose otherwise.11 Tumblr, launched around the same time, sold to Yahoo for $1.1 billion and has been declining since. Wikipedia and the Internet Archive operate on the same principle. These are among the most-used institutions on the internet, and they share a common structure: community governance, nonprofit ownership, and funding that aligns the platform’s incentives with its users’ interests rather than with advertisers’. They thrive because nobody profits from their degradation.
In New York, as gentrification advanced, the small businesses that survived were overwhelmingly the ones that owned their buildings. Tenants — no matter how established or profitable — were priced out when landlords tripled rents or sold to developers. A Williamsburg pharmacy that had served its community for decades had to relocate when rent became unaffordable; the owners bought a building on a side street.12 A family appliance store on the Upper West Side, founded in 1934, survived the blackout riots of 1977 but not Manhattan’s commercial rents.13 Ownership was not a guarantee of survival, but tenancy was nearly a guarantee of eventual displacement.
Creators displaced by AI slop are digital tenants. They built on someone else’s land, with someone else’s tools, under someone else’s rules. This was rational when alternatives were expensive and platforms seemed benign. It is no longer rational. The platforms have made clear — through algorithmic design, revenue splits, and enthusiastic adoption of the AI technology that enables the theft — that their interests and the creators’ interests have diverged.
This does not mean abandoning platforms. They remain powerful discovery mechanisms — the equivalent of a shop window on a busy street. But no serious business treats the window as the shop. You use it to attract attention; you transact inside, on your own terms, on your own property.
Nor is ownership a complete answer. Community platforms are hard to build and depend on intrinsic motivation that not every domain can sustain. Subscriptions work for creators with devoted niche audiences but cannot replace the economics that ad revenue once made possible for a broader tier. There is a role for regulation: AI transparency requirements, copyright enforcement that accounts for machine-speed reproduction, platform liability rules that make profiting from stolen content expensive rather than merely embarrassing when journalists notice. The EU’s Digital Services Act gestures in this direction. The United States has done almost nothing.
But the fundamental shift is conceptual. For two decades, the internet’s dominant assumption was that friction was the enemy — that removing barriers between creator and audience would liberate creative work from the old gatekeepers. Platforms spent billions eliminating friction. It turns out that friction — the paywall, the membership gate, the moderated community, the relationship that cannot be replicated by a bot — is what sustains creative work. What is hardest to copy is what endures. What grows in the open gets harvested.
The ad-supported content model is not dying. It is fulfilling its own logic, which was always to reduce the cost of content toward zero while maximising the surface area for advertising. AI has not corrupted this model. It has perfected it. The creators who built livelihoods on that foundation are discovering what sharecroppers have always discovered: the landlord’s investment in your productivity lasts exactly as long as your labour is cheaper than the alternative. The combine harvester has arrived. It does not care about craft, audiences, or years of work. It cares about yield.
References
1. Kapwing. (2025, November). Report on AI-generated content among YouTube’s top channels.
2. Alphabet Inc. (2025). 2024 Annual Report. YouTube advertising revenue reached $36.1 billion, a 14.6% year-over-year increase.
3. Digital Camera World. (2026, February). “YouTube gets tough on AI slop.” Reporting on 16 channels with 35 million combined subscribers and 4.7 billion views removed or wiped.
4. Mohan, N. (2026). YouTube CEO priorities letter, 2026.
5. Movieguide. (2024, August). “Will You Use YouTube’s Sleep Timer?” Reporting on YouTube’s test of a sleep timer feature to limit ad revenue from unconscious viewers. See also SBS News. (2023, December). “Sleep videos are hugely popular — but YouTube adverts are disrupting people’s dreaming.”
6. Scum Mage. (2026). YouTube video describing AI content theft, voice cloning, and cross-platform redistribution.
7. Katsarova, I. & Necsutu, M. (2020). “The Macedonian Fake News Industry and the 2016 US Election.” PS: Political Science & Politics, Cambridge University Press.
8. Techloy. (2025, December). “AI slop is taking over YouTube, and the algorithm is doing exactly what it was built to do.”
9. Carr, N. (2006, December 19). “Digital sharecropping.” Rough Type. https://www.roughtype.com/?p=634
10. YouTube takes approximately 45% of advertising revenue. Only about 3 million of YouTube’s 115 million channels are monetised under the YouTube Partner Program. (DemandSage, 2026; YouTube official data.)
11. Coppa, F., quoted in “An Archive of Our Own: How AO3 built a nonprofit fanfiction empire.” Organisation for Transformative Works history.
12. Center for New York City Affairs. (2024). “From Bodegas to Boutiques: The Changing Face of Retailing Shows Gentrification’s Effects.”
13. Baker, K. (2018, July). “The Death of a Once Great City.” Harper’s Magazine.