AI and the Productivity Paradox

Why the robots haven’t saved your Monday (yet)

If you scanned headlines alone, you might think we are living through the dawn of a new productivity renaissance. Every corporate deck now has an “AI strategy” slide, GPU factories are running flat out, and the vocabulary of consultants has been upgraded from “digital transformation” to “agentic workflows”. It is easy to get swept up in the feeling that something monumental is underway. Yet when you turn away from the noise and open the productivity data, the picture is frustratingly flat. Output per worker in most advanced economies is still crawling along. For a technology hyped as revolutionary, AI is remarkably absent where it matters most: the numbers.

This tension is familiar. In 1987, Robert Solow famously remarked that “you can see the computer age everywhere but in the productivity statistics”. Nearly four decades later, we could simply swap “computer” for “AI” and arrive at the same puzzle. But the longer the historical view you take, the less surprising the discrepancy becomes. Transformative technologies often spend years—sometimes decades—creating excitement long before they move the macroeconomic dial.

Electricity is the clearest example. Today it feels like the most obvious turning point in industrial history: the light bulb, the grid, the electrified city. But it did not meaningfully boost productivity for almost forty years. In the late 19th century, when electricity first spread, most factories simply replaced steam engines with electric motors and left everything else unchanged. The workflows were the same, the layouts were the same, the bottlenecks were the same. Only when managers redesigned factories around electricity—introducing smaller motors to power individual machines, building single-storey plants, reorganising tasks, improving lighting and safety—did productivity finally surge. The technology was ready long before society was. The lag was not an accident; it was part of the adoption curve.

Computing followed a similar script. By the 1980s, computers were everywhere. What was not everywhere was productivity growth. Companies invested heavily in hardware and software, but none of it seemed to matter in the aggregate numbers. The payoff only arrived in the mid-1990s, when certain industries restructured themselves around digital tools. Retail revamped inventory management with barcodes and real-time systems. Logistics became more precise with tracking and routing technology. Finance reinvented itself through electronic workflows. Manufacturing used CAD and automation to redesign entire processes. As with electricity, the gains came not from the mere existence of the technology but from the messy, organisational rewiring that allowed it to create value.

Seen this way, AI today looks less like a paradox and more like the opening act of a familiar story. At the micro level, the benefits are already visible. Software engineers using AI assistants complete tasks faster and learn new frameworks more quickly. Customer service teams handle interactions more efficiently with automated summarisation and drafting. Legal and research teams sift through documents in a fraction of the time. Individual productivity is rising in pockets across the economy. But at the macro level—where national statistics condense millions of workers into a single number—very little has budged.

Part of the issue is measurement. Productivity appears, on paper, to be a simple ratio of output to input. In practice, both sides of that ratio are difficult to quantify, especially for knowledge work. Many of AI’s early benefits improve the quality of output rather than the visible quantity. Better code, better decisions, and better reasoning often do not translate neatly into measurable production. Meanwhile, the costs of adoption—training teams, building data infrastructure, integrating systems—appear upfront and immediately, long before the benefits accumulate. Economists sometimes call this the productivity J-curve: the initial costs are visible, while the gains show up only later, and often very unevenly.
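To see why the J-curve flattens the early statistics, consider a stylised sketch. Every number below is an illustrative assumption, not an estimate from this article: integration costs hit measured output immediately, while gains only begin compounding once the complements are in place.

```python
# Stylised productivity J-curve: costs are front-loaded, gains compound later.
# Every number here is an illustrative assumption, not an empirical estimate.

BASE_OUTPUT = 100.0    # output per period before adoption
ADOPTION_COST = 8.0    # upfront drag: training, data work, integration
COST_YEARS = 3         # length of the heavy integration phase
GAIN_RATE = 0.04       # annual output gain once complements are in place
LABOUR_INPUT = 100.0   # held constant so the ratio tracks output alone

for year in range(10):
    cost = ADOPTION_COST if year < COST_YEARS else 0.0
    # Gains start accruing only after the integration phase ends.
    years_of_gains = max(0, year - COST_YEARS + 1)
    gain = BASE_OUTPUT * ((1 + GAIN_RATE) ** years_of_gains - 1)
    productivity = (BASE_OUTPUT - cost + gain) / LABOUR_INPUT
    print(f"year {year}: measured productivity = {productivity:.3f}")
```

In this toy model, measured productivity sits below its pre-adoption baseline for the first three years and only overtakes it afterwards: the dip and delayed climb that give the J-curve its name, and the reason early aggregate numbers can look flat even while real investment is happening.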

Another part of the story is that general-purpose technologies require complementary investments that are slow, expensive, and organisationally painful. Electricity needed redesigned factories. Computers needed reimagined workflows. AI needs good data, new decision structures, and a cultural shift in how organisations work. Buying the model is the easy part. Redesigning jobs, responsibilities, incentives, and governance around the technology is the hard and slow part. Only a minority of firms will make this transition well. Those that do will pull far ahead; those that don’t will wonder why the technology “failed”.

A third reason for the lag lies in the structure of the economy itself. AI currently moves fastest in digital and text-heavy domains—software, marketing, customer support, administrative work. But many of the world’s largest sectors are human-intensive, physical, and regulated: healthcare, construction, education, hospitality, government services. These sectors employ millions of workers and change far more slowly. Even if AI produces dramatic gains in the “fast lane,” the “slow lane” dominates the aggregate, muting the early impact. This pattern is the same one that delayed the visible effects of electricity and computing. The leading adopters sprint ahead, the rest of the economy jogs, and the national statistics barely move until the diffusion is much broader.
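The arithmetic of that dilution is worth spelling out. As a hedged illustration, with sector shares and growth rates invented for the example rather than drawn from data, even a dramatic surge in the fast lane produces an unremarkable aggregate:

```python
# Aggregate productivity growth as an employment-weighted average.
# The shares and growth rates are invented for illustration only.

sectors = {
    # name: (share of employment, annual productivity growth)
    "fast lane (software, marketing, support, admin)": (0.20, 0.10),
    "slow lane (health, construction, education, ...)": (0.80, 0.005),
}

aggregate = sum(share * growth for share, growth in sectors.values())
print(f"aggregate productivity growth: {aggregate:.1%}")  # prints 2.4%
```

A 10 per cent surge in a fifth of the economy, averaged against near-stagnation in the rest, yields roughly 2.4 per cent overall: visible, but nothing like the revolution the headlines promise.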

For investors and policymakers, the temptation is always to draw a straight line from technological excitement to economic transformation. History suggests a different arc: early enthusiasm, mid-cycle disappointment, and eventually a long, sustained period of gains once the organisational and social complements are in place. Electricity had speculative booms and deep troughs. Railways had manias and crashes. The early internet famously produced one of the largest bubbles in modern history. Yet in each case, the long-run reality eventually exceeded the early hype—but only after years of trial, error, retraining, and reorganisation.

There is a philosophical thread embedded here as well. New technologies often provoke a kind of fatalism—either the robots will take all our jobs, or they will rescue us from stagnation. In both versions, human agency fades. But the historical record is clear: technology only becomes productive through human choices. Electricity did not automatically make societies richer; managers and workers who redesigned their environments did. Computing did not automatically increase productivity; companies that embraced new methods of working saw the gains. Marcus Aurelius would have recognised the pattern: the only thing within our control is how we respond. Applied to AI, that means using the tools to deepen our understanding rather than replace it, training ourselves to work alongside the technology rather than leaning on it as a crutch, and building the habits required to turn leverage into compounding rather than noise.

Looking ahead, it is reasonable to expect AI’s economic impact to follow the same slow-burn trajectory as earlier foundational technologies. The real transformation will come not from flashy demos, but from gradual changes in how organisations structure work, how individuals use their time, and how industries adapt their processes. The cost of cognitive labour will fall, new categories of products and services will appear, and the gap between adaptive and non-adaptive firms will widen. But the macro payoff—broad-based, sustained productivity growth—will take time. And it may even take long enough that future commentators will still complain that “AI hasn’t shown up in the numbers yet,” even as certain sectors become almost unrecognisable.

The productivity paradox is less a riddle to be solved and more a recurring stage in the life cycle of every great technology. Electricity had one. Computing had one. Railways had one. AI is simply having its own. The trick is not to expect an overnight revolution, but to recognise that the real work—organisational, behavioural, and cultural—has only just begun.
