
Solow’s paradox has come to haunt AI adoption

AI enthusiasts, beware: predictions that the technology will suddenly boost productivity eerily echo those that followed the introduction of computers to the workplace. Back then, we were told that the miraculous new machines would automate vast swathes of white-collar work, leading to a lean, digital-driven economy.

Fast forward 60 years, and it’s more of the same. Shortly after the debut of ChatGPT in 2022, researchers at the Massachusetts Institute of Technology claimed that employees using it would be 40% more productive than their AI-less counterparts.

These claims may prove no more durable than the pollyannish predictions of the Mad Men era. A rigorous study published by the National Bureau of Economic Research in May found time savings of only about 3%, while other studies have shown that relying on AI for high-level cognitive work leaves employees less motivated and less capable.

We are witnessing the makings of another ‘productivity paradox,’ the term coined to describe how productivity unexpectedly stagnated, and in some cases declined, during the first four decades of the information age. The bright side is that the lessons learned then might help temper our expectations today.

The invention of transistors, integrated circuits, memory chips and microprocessors fuelled exponential improvements in information technology from the 1960s onward, as computers reliably doubled in power roughly every two years with almost no increase in cost.

It quickly became an article of faith that computers would lead to widespread automation (and structural unemployment). A single person armed with one of the new machines could handle work that previously required hundreds of employees. Over the next three decades, the service sector decisively embraced computers.

Yet, the promised gains did not materialize. In fact, studies from the late 1980s revealed that the services sector—what economist Stephen Roach described as “the most heavily endowed with high-tech capital”—registered the worst productivity performance during this same period. 

In response, economist Robert Solow famously quipped that “you can see the computer age everywhere but in the productivity statistics.”

Economists advanced multiple explanations for this puzzle (also known as ‘Solow’s Paradox’). Least satisfying, perhaps, was the claim, still made today, that the whole thing was a mirage of mismeasurement and that the effects of massive automation somehow failed to show up in the economic data.

Others have argued that the failure of infotech investments to live up to the hype can be laid at the feet of managers. There’s some merit to this argument: studies of infotech adoption have shown that bosses spent indiscriminately on new equipment, all while hiring expensive workers charged with maintaining and constantly upgrading these systems. Computers, far from cutting the workforce, bloated it.

More compelling still was the ‘time lag’ hypothesis offered by economist Paul A. David. New technological regimes, he contended, generate intense conflict, regulatory battles and struggles for market share. Along the way, older ways of doing things persist alongside the new, even as much of the world is remade to accommodate the new technology. None of this translates into immediate efficiency—in fact, quite the opposite.

As evidence, he cited the advent of electricity, a more efficient and flexible source of manufacturing power than the steam it would eventually replace. Nonetheless, it took 40 years for the adoption of electricity to translate into increased worker efficiency. Along the way, struggles to establish industry standards, waves of consolidation, regulatory battles and the need to redesign every single factory floor made this a messy, costly and prolonged process.

The computer boom would prove similar. Complaints about the missing productivity payoff did not disappear, but by the late 1990s, the American economy finally showed a belated uptick in productivity. Some economists credited the gains to the widespread adoption of information technology.

Better late than never, as they say. However, productivity growth soon sagged once again, despite (or because of) the advent of the internet and all the other innovations of that era.

AI is no different. The new technology will have unintended consequences, many of which will offset or even entirely undermine its efficiency gains. That doesn’t mean AI is useless or that corporations won’t embrace it with enthusiasm. Anyone expecting an overnight increase in productivity, though, will be disappointed. ©Bloomberg

The author is professor of history at the University of Georgia and co-author of ‘Crisis Economics: A Crash Course in the Future of Finance’.
