AI-Enabled Software Development and Jevons Paradox
Jack Dorsey just laid off 4,000 people at Block. Roughly 40% of the company. His reasoning: AI tools paired with smaller teams enable “a new way of working which fundamentally changes what it means to build and run a company.” He predicted most companies will reach the same conclusion within a year.
This is the popular version of the story. AI makes software development faster, so companies need fewer people and the work shrinks. Software demand is a fixed pie. If slices get cheaper, the pie stays the same size and you just stop baking sooner.
Historically, that is not how efficiency works. Jevons noticed this 160 years ago with coal: efficiency didn’t conserve the resource; it expanded its use.
Why the paradox works
Economists break the rebound into a few flavors: direct rebound (the resource gets cheaper to use, so people use more of it), indirect rebound (the savings get spent on other things that consume the resource), and economy-wide effects as whole markets reorganize around the cheaper input.
The strong form, where efficiency gains get overwhelmed by demand expansion, is the paradox.
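The dividing line between "efficiency reduces use" and the strong form is just price elasticity of demand. A toy constant-elasticity model (illustrative numbers, not from any study cited here) makes it concrete:

```python
def input_use_ratio(gain, elasticity):
    """Ratio of new input use to old under constant-elasticity demand.

    gain: fractional efficiency gain (0.30 = each unit of output
          now needs 30% less input)
    elasticity: price elasticity of demand for the output
    """
    # Effective price per unit of output falls by a factor of (1 - gain).
    # Demand scales as price**(-elasticity), and input use is
    # demand * input-per-unit, which works out to (1 - gain)**(1 - elasticity).
    return (1 - gain) ** (1 - elasticity)

# 30% efficiency gain, inelastic demand: total input use falls.
print(input_use_ratio(0.30, 0.5))   # ~0.84

# Same gain, elastic demand: total input use RISES. That is the paradox.
print(input_use_ratio(0.30, 1.5))   # ~1.20
```

If demand for software is elastic, as the rest of this piece argues, a 30% productivity gain produces more total demand for developer effort, not less.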
AI is making software cheaper to produce
To apply Jevons here, you need a credible efficiency improvement. We have that now.
A Google RCT gave ~100 engineers a realistic coding task: developers with AI tools finished 21% faster. GitHub and Accenture measured 4,800 developers and found 55% faster completion on scoped programming tasks, with PR cycle times dropping 75%. GitHub’s CEO has said that Copilot now writes an average of 46% of code in files where it’s enabled, across over 20 million users.
The picture isn’t uniformly rosy. A METR randomized trial found experienced open-source developers were actually 19% slower with AI on complex, real-world tasks in their own repos. The developers themselves thought they were faster. They weren’t. And at the organizational level, most companies using AI tools report no measurable improvement in delivery throughput, even though individual developers feel more productive.
But the Jevons argument doesn’t require AI to make every engineer faster on every task. It requires one thing: that the overall cost of producing working software drops. More code generated per hour, more people able to build, more experiments feasible. The evidence supports that, even when the gains are uneven.
I’ve written about this before. At Pagecloud, we’ve seen it firsthand. The cost of trying things has dropped. Not incrementally. Dramatically.
When building gets cheaper, you build more
Software demand isn’t fixed. In most organizations, it’s closer to infinite. Software is a lever for automation, customer experience, analytics, operations, compliance, security, product differentiation. There is always more to build than there are people and hours to build it.
So what happens when the cost drops?
More things cross the “worth building” threshold. Every organization has a backlog of ideas that never ship because they’re too expensive, too slow, or too hard to maintain. Cut the cost by 30% and whole categories flip from “never” to “now.” Internal tools that live as spreadsheets forever. One-off workflow automations. Integration glue between systems. Customer-specific customizations that were previously unprofitable. This is direct rebound: cheaper features mean more features get built.
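The threshold flip can be sketched with a toy backlog (the projects and numbers below are made up for illustration):

```python
# Hypothetical backlog: (project, expected value, build cost), in arbitrary units.
backlog = [
    ("billing integration",     120, 100),
    ("ops dashboard",            80, 100),
    ("customer customization",   70,  90),
    ("workflow automation",      50,  80),
]

def worth_building(cost_multiplier):
    """Projects whose expected value exceeds their (scaled) build cost."""
    return [name for name, value, cost in backlog
            if value > cost * cost_multiplier]

print(worth_building(1.0))  # at full cost, only one project clears the bar
print(worth_building(0.7))  # a 30% cost cut flips two more from "never" to "now"
```

Nothing about the backlog changed; only the cost did. That is direct rebound in miniature.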
Faster iteration increases ambition, not rest. If you can ship twice as fast, you don’t ship the same thing and go home early. You run more experiments. You try more product variants. You localize into more markets. You instrument more analytics. The organization discovers that software is a bottleneck it can finally afford to attack, so it attacks harder.
The builder population expands. Efficiency doesn’t only speed up existing developers. It expands who can build. Low-code platforms already showed this: lower the skill and time barriers and more people start building. AI pushes that further. The interface becomes conversation and intent, not just syntax. More builders means more software.
AI itself creates demand for more software. Every AI feature shipped needs integration, governance, testing, observability, security, and change management. Even if “writing code” gets cheaper, “running software in the real world” becomes more central. Gartner’s latest forecast projects worldwide IT spending will hit $6.15 trillion in 2026, up 10.8% year-over-year, with AI as the dominant growth driver. That’s more software work, not less.
We’ve been here before
This isn’t theoretical. Computing has already demonstrated the pattern.
Computing efficiency has improved dramatically over decades. Koomey’s Law observes that the number of computations per kilowatt-hour of energy has doubled roughly every 1.6 years since the 1950s. That’s a trillionfold improvement in efficiency over six decades.
And yet total computing energy consumption hasn’t dropped. Efficiency improvements are necessary but not sufficient for reducing total resource use, because demand grows as compute becomes cheaper and more available. This is Jevons Paradox in action, decades before anyone applied it to software.
Marc Andreessen wrote “Why Software Is Eating the World” in 2011. That was before cloud computing was mainstream, before mobile-first, before AI. Every wave of efficiency has expanded software’s footprint rather than contracting it. Each time, people predicted saturation. Each time, the cheaper it got to build, the more we built.
The pattern is familiar: efficiency enables scale. The same dynamic is playing out with software development itself.
The objections
“Budgets are fixed, so demand can’t expand forever.” Budgets move when ROI improves. When software becomes cheaper and faster, projects that previously had negative ROI become positive. That unlocks new budget, especially when competitors are accelerating too. Spending shifts rather than disappearing: less on routine coding, more on architecture, security, data quality, and evaluation.
“Companies are already cutting. Look at Block.” They are. But look closer. Block grew from 3,800 to over 10,000 employees during the pandemic, hiring aggressively for a year longer than its peers. An Oxford Economics report found that many AI-attributed layoffs are actually corrections for pandemic-era overhiring. HBR research surveying a thousand executives found that 60% made headcount reductions in anticipation of AI, while only 2% cut based on actual implementation results. Companies are laying off based on AI’s potential, not its performance. Klarna cut its workforce citing AI, then admitted the strategy produced lower quality and started reinvesting in human support. IBM is now tripling its entry-level hiring after finding the limits of AI adoption. As Wharton professor Ethan Mollick put it: “it is hard to imagine a firm-wide sudden 50%+ efficiency gain” from tools this new. Meanwhile, the BLS still projects 15% growth for software developers from 2024 to 2034. The reshaping looks Jevons-consistent: some roles compress, total demand for software work expands.
“Efficiency sometimes really does reduce consumption.” True. The magnitude of rebound varies. The claim isn’t that demand expansion is guaranteed in every niche. It’s that the conditions for a Jevons-style expansion in software are strong: demand is elastic, the resource (developer time) is a bottleneck in nearly every organization, and the efficiency gains are significant.
The sharp edge
If output rises faster than our ability to manage what we’ve built, we get a different kind of rebound backfire. Not just more software, but more software than our organizations can safely operate, secure, and maintain.
I’ve written about taste in code and about how junior developers build judgment. Both become more important in this world. When throughput rises, the cost of bad decisions compounds faster. My friend Mahdi Yusuf captures this well: he built more in two months with AI agents than in the previous year, and used almost none of it. As he puts it, “agents amplify clarity. If you know exactly what you want, they’re a force multiplier. If you don’t, they’re an expensive way to generate garbage you’ll never use.”
The winners aren’t the teams that just code faster. They’re the teams that channel increased output into outcomes without drowning in complexity:
- Stronger prioritization. Cheaper shipping creates more options. Saying “no” becomes more valuable, not less. Building the wrong thing faster is not progress.
- Quality systems that scale. Tests, code review, CI, staged rollouts. All of these matter more when throughput rises.
- Security as a first-class constraint. More shipped software is a larger attack surface. AI accelerates both builders and attackers.
- Architecture as leverage. If output rises faster than maintainability, you’re building a maintenance tax that compounds.
AI may reduce the cost of writing software, but it increases the payoff of managing software well. I suspect we will see (and already are seeing in some cases) tailwinds for software companies that help manage and organize this complexity.
The lesson
AI is changing software development the way better steam engines changed coal consumption: by lowering the effective cost of a powerful input. Jevons’ lesson is that efficiency often expands demand.
In software, that likely means more applications, more features, more experiments, and more custom solutions than we would have built otherwise. The risk isn’t that we’ll run out of work. The risk is that we’ll create more software than we can safely operate and maintain.
Jevons Paradox isn’t a reason to fear AI. It’s a reason to aim it carefully.