Beyond the Artifact: The Brutal Economics of Liquid Content
Value is migrating away from content, and creating surprising new opportunities
There’s something huge missing in the conversation around AI-mediated information flows and liquid content. I’m not talking about hand-waving over long-term societal impacts and the death of shared reality (though that is actually really important). I’m talking about something much more near-term and pragmatic: unit economics.
The cold reality is that when content becomes infinitely replicable and reformattable at near-zero marginal cost, the economic value of any individual piece approaches zero. I believe this will be the major forcing function driving structural change in the existing news media industry. In an AI-mediated information ecosystem, few existing publications will be able to survive on current business models. Upstarts are already emerging, building businesses around capabilities and infrastructure and changing the competitive environment. News media organizations face a stark choice: radically innovate or risk extinction.
The Commoditization of Content
The shift to a B2A2C flow of information, where content becomes ‘liquid’, has a clear near-term consequence: the commoditization of content.
When the same information can instantly reshape itself across formats and platforms, when ChatGPT can transform a Wall Street Journal investigation into a children’s story or a policy brief in seconds, the economic value of any individual article approaches zero. If users can get the information they need from AI summaries, there is much less reason to visit the original site, view its ads, or maintain subscriptions. The scarcity that once justified paywalls and premium pricing migrates elsewhere as content becomes infinitely liquid. Owning buckets (websites, publications) becomes less economically viable than owning pipes (distribution, verification, sense-making systems).
This commoditization will start to bifurcate the market: a smaller premium market at one end, and a larger commodity market at the other. A classic barbell effect where the middle disappears. At the premium end, offerings compete on brand trust, curation, and direct relationships; think the FT, the NYT, or successful Substack writers. High margins, smaller audiences, competing on uniqueness. At the commodity end, information providers operate at massive scale with razor-thin margins; Bloomberg, Reuters, NotebookLM, and Perplexity are already there, operating as infrastructure more than traditional publishers.
Most current publishers won’t be able to compete with AI on scale and efficiency on the commodity end, and only a few have sufficient brand differentiation to command premium prices at the other. The middle ground, where most newsrooms currently sit, is likely to evaporate. Most publishers will likely end up migrating to the premium end (if they have enough of a differentiated offering) or get commoditized out of existence.
Incrementalism and the Death Spiral
At present, news media trying to adapt to AI are pursuing two main strategies:
Incremental innovation: Applying AI to existing workflows, such as automated transcription, AI-generated first drafts, summarisation, and multi-modal transformation. This is really efficiency-seeking or cost reduction expressed as innovation: making the same artifacts cheaper and faster, or making greater quantities of them.
Content defense: Protecting content ownership rights through bilateral deals with AI companies, seeking compensation for use of materials, collective bargaining, or regulatory advocacy.
While this is all necessary, it is insufficient.
Pushed to its limit, the incremental innovation strategy reveals its paradoxical nature: media organizations end up using AI to create content more efficiently, optimizing it for AI discovery, then watching as AI systems extract the value and commodify that content. If they're lucky, they'll get some compensation. Mostly, they won't.
The FT Strategies report published in November maps out four potential quadrants for publishers, which home in on the extreme ends of the barbell. But over the long term, these positions are unsustainable for the majority of publishers under their current cost and operating structures. Take, for example, the commodity ("embedded distribution") quadrant. The unit economics here are brutal: if production costs drop 80% but revenue per piece drops 90%, a publisher needs 10x the volume just to replace its old revenue, and even then its total costs have doubled. Only massive scale can make those margins work. What's more, creating more content in an already oversupplied market only accelerates commoditization: publishers end up feeding the beast that's eating them while racing it to the bottom on price. At the other end ("direct distribution"), if publishers migrate up the value chain en masse, each competing for direct audience relationships, that end of the market becomes saturated, and those that are insufficiently differentiated are forced out.
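The arithmetic behind that squeeze can be made explicit. Here is a back-of-envelope sketch, where the $1,000 baseline cost and revenue per piece are hypothetical figures chosen purely for illustration:

```python
# Back-of-envelope unit economics for the barbell's commodity end.
# Baseline (hypothetical): one piece costs $1,000 to produce
# and earns $1,000 in revenue.
base_cost = 1000.0
base_revenue = 1000.0

new_cost = base_cost * 0.20        # production cost drops 80%
new_revenue = base_revenue * 0.10  # revenue per piece drops 90%

# Volume needed just to replace the revenue of one old piece:
volume = base_revenue / new_revenue  # 10x

# Total cost of producing that volume -- double the old cost:
total_cost = volume * new_cost

print(f"volume needed: {volume:.0f}x, total cost: ${total_cost:,.0f}")
```

Whatever the baseline numbers, the shape of the result is the same: revenue per piece falls faster than cost per piece, so chasing volume only widens the gap.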
While these strategies might be sensible, and even necessary, for short-term survival, in the long run incrementalism in this changing ecosystem becomes a death spiral. Only organizations with massive scale or premium brand differentiation can survive these economics. Without radical reinvention, the strategy amounts to managed decline.
Changing the Rules
There’s a business truism that says: you can’t cut your way to growth, you have to create new value.
In journalism’s case, this means recognizing that value creation doesn’t come from producing content more efficiently, or making more of it. Real value will need to come from creating entirely new products (which really means new forms of IP disaggregated from content), finding new distribution models, or expanding the total addressable market. Publishers will need growth, not optimization. They need to build something people will pay for that they couldn’t get before, not make the same thing cheaper.
But the paradox is that in an AI-mediated information ecosystem, while journalistic artifacts become commodified, journalistic processes, like truth-seeking, accountability, and sense-making, arguably become more valuable than ever, both economically and socially.
Something I recently experienced validated this anecdotally. At a founders’ breakfast in San Francisco last month, in a room full of AI startups all building products to disrupt various industries, it was evident that they still had “get featured in TechCrunch” or “land a WSJ profile” at the top of their media leverage wish list. These companies, who are architecting the ecosystem that’s supposedly replacing traditional media, still desperately need traditional media’s legitimacy-conferring power to succeed.
So while the artifact—the article—will likely still matter as a signal, or as Ben Thompson puts it, a flag, the real power, the actual value, lies elsewhere: in the ability to confer legitimacy, to determine what matters, to independently seek truth and continuously correct the record.
So what if news media were to let go of the artifact as the product and productize the process instead?
From Artifacts to Capabilities
This isn’t entirely new or theoretical. The market has already demonstrated that there is a way to capture value in the process. Many publishers, such as The Information, already earn revenue through events; Tortoise Media’s ThinkIns essentially sell the editorial meeting process itself as a product where members participate in news judgment; Bellingcat monetizes their OSINT investigation methodology through workshops and tools.
But why not push this further, take an even more radical approach, and rethink the whole journalism value chain from first principles? When software moved from packaged goods to cloud services, companies stopped selling boxes of code and started monetising capabilities. Journalism needs a similar shift, while maintaining its ethical core.
Existing capabilities are already being unbundled. AppliedXL, a startup founded by a former WSJ executive, uses computational journalism algorithms to detect newsworthy events in healthcare and finance data streams; not creating content, but selling the ability to identify what matters before it becomes news. Full Fact and NewsGuard have built verification infrastructure. Even individual journalists are getting in on this: Sophia Smith Galer's Sophiana app packages her years of social video expertise into a tool that helps other journalists transform articles into algorithm-ready videos.
Undermonetized capabilities remain ripe for development. The signal value of journalistic vetting (as in the example above, where startups want media coverage for validation, not traffic) has no business model. Accountability systems lack sustainable revenue. Hunterbrook Media, an experimental hybrid media outlet and activist hedge fund, has taken this logic to its extreme: their hedge fund arm literally trades on their journalism before publication, monetizing investigation as market intelligence. While ethically controversial, it proves journalism's capabilities have massive economic value when unbundled from articles.
But the real opportunity lies in entirely new value functions. As I've explored in previous writing, AI enables capabilities that weren't possible in traditional journalism. The "intimacy dividend", people's apparent willingness to ask AI questions they'd never ask humans, could transform how audiences process complex or emotionally charged news. Imagine products or experiences where users can privately explore their confusion without social embarrassment, or work through cognitive dissonance when confronted with information that challenges their heuristics.
In a different direction, collective intelligence platforms could transform journalism from single investigation to networked sense-making. Many outlets have tried this manually already; Correctiv's CrowdNewsroom has mobilized thousands of readers to contribute to investigations, and ProPublica's Local Reporting Network pools resources across newsrooms. But what if this became automated infrastructure? Prediction markets already aggregate collective knowledge about future events. The capabilities exist and are already semi-systematised, so they could be deployed at scale, turning audiences from consumers into participants in the truth-seeking process.
Ultimately, the opportunity is to stop seeing these capabilities just as byproducts of article production and start experimenting with them as standalone products, services, and infrastructure for an AI-mediated world.
Towards A First Principles Definition
The paradigm shifts driven by AI aren't just transforming journalism; they're affecting the entire information ecosystem and, as a consequence, are likely to collapse the boundaries between all traditional information verticals. Education, research, entertainment, journalism: these professional categories made sense when distribution channels were scarce and expertise was gatekept. But in an AI-mediated ecosystem where any content can be instantly transformed, translated, and recontextualized, these distinctions become increasingly arbitrary.
This isn't to say that professional distinctions served no purpose. Press credentials, source protection laws, and journalistic privileges evolved for good reasons; they created a framework for accountability and protected the public's right to know. These social norms remain critically important. But they also need redefinition now that an algorithm can perform investigative analysis, a citizen with a smartphone can document breaking news, and a data scientist can expose corruption through computational methods.
This means journalism itself needs to be redefined from first principles, based on its own values and the benefit it provides to audiences and society, rather than on who traditionally performed it or what format it took. From the end user's and society's perspective, the value-add was never the artifact itself; it was the verified information, accountability mechanisms, and sense-making capabilities the artifact contained.
It took nearly a decade for the journalism establishment to accept that newsletter writers and podcast hosts might also be “real journalists.” Now, as AI forces an even more radical restructuring of the information landscape, the industry can either spend another decade debating professional boundaries or recognize that journalism is a methodology, not a monopoly. A broader definition also means more models to learn from, more potential allies, and more paths to sustainability.
The irony is that the things journalism does best, building trust, verifying truth, making sense of complexity, are exactly what an AI-saturated information world needs most. But these capabilities have always been bundled into articles, their value buried in the artifact rather than recognized as standalone assets. The industry needs to stop conflating the container with the contents and start finding ways to leverage these intangible assets.
A thought-provoking read. The decline of journalism and the proliferation of synthetic content actually force creators to diversify, sadly often without journalistic integrity, while former journalists lack the business acumen to successfully make the transition. There is no abundance here, but the weakening of democracy and integrity is highly noticeable. Journalism was already in decline before generative AI arrived, but the new internet is one for opportunists and hackers who often scale distribution in nefarious ways. A master's in journalism does not prepare you for this new media landscape.
Wow, a lot to unpack. This might be the most accurate analysis of current market challenges for news orgs I’ve read in a long time.
The monetization of journalism functions is very interesting but I’m skeptical of your examples. I love Sophiana but I don’t really see how this could become a significant revenue driver. I’m also skeptical of AppliedXL’s ability to compete with big AI companies for trend detection in the long run.
What seems more promising to me is to monetize "support" (rather than monetizing the artifact) for news orgs that have a very clear purpose and which focus on the investigative function (which will not be replaced by AI even if it can help in the investigative process).
We already have a very successful blueprint for this in France: Mediapart, a digital publication which broke every major political scandal in France in the last 15 years. They managed to convince 250k+ people to subscribe monthly even though most don't read the website. Monetizing support in investigative journalism can work, but we will still be dependent on AI systems to reach the end user. That worries me in both economic and political terms.