
Compute Curse

LessWrong AI · by Ihor Kendiukhov · April 4, 2026 · 8 min read

Epistemic status: romantic speculation.

The core claim: it occurred to me that compute growth can be rather neatly analogized to natural resource abundance.

Before compute curse, there was resource curse

Countries that discover oil often end up worse off than countries that don't, which is known as the resource curse. The mechanisms are well-understood: a booming resource sector draws capital and labor away from other industries, creates incentives for rent-seeking over productive investment, crowds out human capital development, and corrodes the institutions needed to sustain long-term growth.

I argue that something structurally similar has been happening with compute. The exponential growth of available computation over the past several decades, and, critically, the widespread expectation that this growth would continue, has created a pattern of resource allocation, talent distribution, and research prioritization that mirrors the resource curse in specific and non-metaphorical ways.

Note: this is not a claim that extensive compute growth has been net negative (nor is it a claim that it has been net positive).

Dutch disease comes for ASML

The original Dutch disease mechanism is straightforward: when a booming sector (say, natural gas extraction) generates high returns, it pulls capital and labor out of other sectors (say, manufacturing), causing them to atrophy. The non-booming sectors don't decline because they became less valuable in absolute terms but rather because the booming sector offers relatively better returns, and resources flow accordingly.
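The mechanism above can be caricatured in a few lines of code. This is a deliberately crude sketch of my own: the two-sector setup, the diminishing-returns form, and all the numbers are illustrative assumptions, not anything from the economics literature. A greedy allocator sends each unit of labor to whichever sector pays more at the margin; once the booming sector's returns rise, labor drains out of the other sector even though that sector's productivity never changed.

```python
# Toy two-sector sketch of the Dutch disease mechanism.
# Each sector has diminishing marginal returns: the (n+1)-th worker in a
# sector with scale s contributes s / (n + 1). Labor goes, one unit at a
# time, to whichever sector currently offers the higher marginal return.

def allocate(labor: int, boom_scale: float, other_scale: float) -> tuple[int, int]:
    """Greedily assign `labor` units across a booming and a non-booming sector."""
    boom, other = 0, 0
    for _ in range(labor):
        marginal_boom = boom_scale / (boom + 1)
        marginal_other = other_scale / (other + 1)
        if marginal_boom >= marginal_other:
            boom += 1
        else:
            other += 1
    return boom, other

print(allocate(10, 1.0, 1.0))  # equal returns -> even split: (5, 5)
print(allocate(10, 4.0, 1.0))  # boom returns 4x -> labor migrates: (8, 2)
```

Nothing about the non-booming sector changed between the two runs; only the relative return did, and the allocation followed. That is the whole Dutch disease story in miniature.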

A trivial version of "compute Dutch disease" goes like this: because scaling compute yields such reliable, legible, and fundable returns (train a bigger model, get a better benchmark score, publish the paper, raise the round), it systematically starves research directions that are harder to fund, harder to evaluate, and slower to produce results, even when those directions might be more consequential in the long run.

So, "The Bitter Lesson" can be seen as the Dutch disease of AI research, if we add to it that the fact that scaling works doesn't mean the crowding-out of alternatives is costless. Or, in other words, the fact that scaling works better is rather a fact about our ability to do programming or even, if we go to the very end of this line of reasoning, about our economic and educational institutions, than about computer science in general.

However, I consider it only the most recent and prominent manifestation of a phenomenon that has been happening for decades.

Since at least the late 1990s, the reliable cheapening of compute has made it consistently more profitable to build compute-intensive solutions to problems than to invest in the kind of deep, careful engineering that produces efficient, well-understood systems. When you can always count on next year's hardware being faster and cheaper, the rational business decision is to ship bloated software now and let Moore's Law clean up after you, rather than spending the additional engineering time to make something lean and correct. This created an entire economy of applications, business models, and platform architectures that are, in a meaningful sense, the technological equivalent of an oil-dependent monoculture: they exist not because they represent the best way to solve a problem, but because abundant compute made them the cheapest way to ship a product.
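To make the "let Moore's Law clean up after you" economics concrete, here is a back-of-the-envelope sketch. The two-year halving period is my assumption for illustration; the historical rate of compute cheapening varies by era and by metric.

```python
# Back-of-the-envelope: if the cost per unit of compute halves every
# `halving_period_years`, what fraction of the original cost does the
# same workload incur after a given number of years?

def cost_multiplier(years: float, halving_period_years: float = 2.0) -> float:
    """Remaining fraction of the original cost after `years` of halvings."""
    return 0.5 ** (years / halving_period_years)

for years in (10, 20, 30):
    print(f"after {years} years: {cost_multiplier(years):.2e}x the original cost")
```

Under a two-year halving assumption, a decade buys roughly a 32x cost reduction and two decades roughly 1000x, which is why shipping inefficient software and waiting for hardware can be individually rational even as it degrades the craft.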

The consequences are visible across the entire stack. Web applications that would have run comfortably on a 2005-era machine now require gigabytes of RAM to render what is essentially styled text. Electron-based desktop apps ship an entire browser engine to display a chat window. Backend services that could be handled by a well-designed program running on a single server are instead distributed across sprawling microservice architectures that consume orders of magnitude more compute. Cory Doctorow's "enshittification" framework is about the user-facing result of this dynamic, but the deeper structural story is about how compute abundance degraded the craft of software engineering itself, well before anyone started worrying about ChatGPT replacing programmers.

This is the Dutch disease pattern operating at the level of the entire technology economy: the booming sector (scale-dependent applications) drew capital and talent away from the non-booming sector (careful engineering, deep technical innovation, computationally parsimonious approaches, and the broader hardware-based tech economy: biotech, spacetech, materials, etc.), and the non-booming sector atrophied accordingly.

Because of the advantage of abundant compute, it became more financially attractive to allocate resources toward software than toward physical engineering and deeptech, on top of software being easier to update, replicate, diffuse, and improve incrementally. And so deeptech stagnated.

But of course the AI case is qualitatively different, and the most sorrowful, because it resulted in humanity trying to build superintelligence with giant instructable deep learning models.

Human capital crowding-out

Resource curse economies characteristically underinvest in education and human capital development. The relative returns to education are lower in resource-dependent economies because the booming sector doesn't require a broadly educated population.

The compute version of this story has been playing out for at least a decade, well before the current discourse about AI replacing jobs and destroying university education. The entire trajectory of computer science education shifted from "understand the fundamentals deeply" toward "learn to use frameworks and APIs that abstract over compute." At the same time, natural sciences and engineering education got increasingly less attractive and rewarding as compared to computer science.

There is also a more direct talent-siphoning effect: the IT economy has been pulling the most capable technical minds into a narrow set of activities and away from a much broader set of technical and scientific challenges.

The voracity effect and race dynamics

In the resource curse literature, there is a so-called "voracity effect": when competing interest groups face a resource windfall, they respond by extracting more aggressively, leading to worse outcomes than moderate scarcity would produce. Rather than investing the windfall prudently, competing factions race to capture as much of it as possible before others do.

I leave this without a direct comment and let the reader have their own pleasure of meditating on this.

But compute growth is endogenous!

The resource curse in its classical form operates on an exogenous endowment: countries don't choose to have oil reserves, they discover them, and then the political economy warps around that windfall. Much of the pathology comes from the unearned nature of the wealth: it enables rent-seeking, weakens the link between effort and reward, and corrodes institutions.

Compute, by contrast, is endogenously produced through deliberate R&D and engineering investment. Moore's Law was never a law of nature.

Right?

I mean, to me Moore's Law looks like a strong default of any humanity-like civilization. It is created by humans, yes, but it is created in a manner that seems hardly avoidable.

The counterfactual question

The resource curse literature has natural counterfactuals (resource-poor countries that developed strong institutions and diversified economies: Japan, South Korea, Singapore). What's the compute-curse counterfactual? A world where compute grew more slowly and we consequently invested more in elegant algorithms, interpretable models, and formal methods?

It's plausible, but it's also possible that slower compute growth would have simply meant less progress overall rather than differently-directed progress. I don't know. As I said in the beginning, this is speculation.

However, one can trivially note that in a world with less compute abundance, the relative returns to algorithmic cleverness, interpretability research, and formal verification would have been higher, because you couldn't just solve problems by throwing more FLOPs at them. And that may or may not lead to better outcomes in the long run (I am basically leaving aside the question of ASI development here and just talking about rather "normal" tech and science R&D).

People actually thought about this!

Two existing frameworks are close to what I'm describing, but both point the analogy in different directions.

The Intelligence Curse (Luke Drago and Rudolf Laine, 2025) uses the resource curse analogy to argue that AGI will create rentier-state-like incentives: powerful actors who control AI will lose their incentive to invest in regular people, just as petrostates lose their incentive to invest in citizens. This is a compelling argument about the distributional consequences of AGI, but it's about what happens after AGI arrives. The compute curse is about what's happening now, during the process of building toward AGI, and about how the abundance of compute is distorting that process itself.

The Generalized Dutch Disease (Policy Tensor, Feb 2026) is about the macroeconomic effects of the compute capex boom on US manufacturing competitiveness, showing that it operates through the same channels as the fracking boom and the pre-2008 financial boom. This is the closest existing work to what I'm describing, but it stays within the macroeconomic framing (factor prices, unit labor costs, exchange rate effects) and doesn't address the innovation-direction distortion, human capital crowding-out in the intellectual sense, or the AI safety implications.

But: compute curse may actually be worse than resource curse

Some of the negative downstream effects of compute abundance don't map onto the resource curse framework directly but are worth including for completeness, since they stem from the same underlying cause (cheap, abundant compute enabling activities that wouldn't otherwise be viable):

  • Social media and attention economy pathologies
  • Surveillance infrastructure
  • Targeted public opinion manipulation
  • And of course many AI safety issues

These are not Dutch disease effects, just straightforward negative externalities of cheap compute. But they suggest that the full accounting of compute abundance's costs is substantially larger than what the resource curse analogy alone would suggest.
