Innovations like cloud computing and artificial intelligence are hailed as engines of a coming productivity revival. But a broad payoff across the economy has been elusive, according to Steve Lohr, writing in The New York Times.
For years, it has been an article of faith in corporate America that cloud computing and artificial intelligence will fuel a surge in wealth-generating productivity. That belief has inspired a flood of venture funding and company spending. And the payoff, proponents insist, will not be confined to a small group of tech giants but will spread across the economy.
It hasn’t happened yet.
Productivity, which is defined as the value of goods and services produced per hour of work, fell sharply in the first quarter of this year, the government reported this month. The quarterly numbers are often volatile, but the report seemed to dash earlier hopes that a productivity revival was finally underway, helped by accelerated investment in digital technologies during the pandemic.
The growth in productivity since the pandemic hit now stands at about 1 percent annually, in line with the meager rate since 2010 — and far below the last stretch of robust improvement, from 1996 to 2004, when productivity grew more than 3 percent a year.
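The gap between those two rates compounds dramatically over time. A minimal sketch, using the article's figures of roughly 1 percent and 3 percent annual growth and an eight-year stretch (the approximate length of the 1996–2004 boom) as illustrative assumptions:

```python
# Illustrative compounding of the article's growth figures.
# Assumption: steady annual rates; actual productivity data is far more volatile.

def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total gain in output per hour after compounding a steady annual rate."""
    return (1 + annual_rate) ** years - 1

years = 8  # roughly the length of the 1996-2004 boom
slow = cumulative_growth(0.01, years)  # ~1% a year, the pace since 2010
fast = cumulative_growth(0.03, years)  # >3% a year during 1996-2004

print(f"At 1% a year, output per hour rises {slow:.1%} over {years} years")
print(f"At 3% a year, output per hour rises {fast:.1%} over {years} years")
```

Over eight years, 1 percent annual growth lifts output per hour by about 8 percent, while 3 percent annual growth lifts it by about 27 percent, which is why economists treat the difference between the two eras as so consequential.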
Economies grow not only by adding more capital and labor. Another vital ingredient is a nation’s skill in creating and commercializing innovation, which makes investment and workers more productive.
Productivity is not a cure-all for economic ills.
“Even if the optimism about this wave of digital technology proves justified, that does not mean there will be a real sharing of the benefits,” said Laura Tyson, a professor at the Haas School of Business at the University of California, Berkeley, and a chair of the Council of Economic Advisers in the Clinton administration.
But a less productive economy is a smaller one with fewer resources to deal with social challenges like inequality.
The current productivity puzzle is the subject of spirited debate among economists.
Robert J. Gordon, an economist at Northwestern University, is the leading skeptic. Today’s artificial intelligence, he says, is mainly a technology of pattern recognition, poring through vast troves of words, images and numbers. Its feats, according to Mr. Gordon, are “impressive but not transformational” in the way that electricity and the internal combustion engine were.
Erik Brynjolfsson, director of Stanford University’s Digital Economy Lab, is the leader of the optimists’ camp. He confesses to being somewhat disappointed that the productivity pickup is not yet evident, but is convinced it is only a matter of time. “Real change is happening — a tidal wave of transformation is underway,” Mr. Brynjolfsson said. “We’re seeing more and more facts on the ground.”
It will probably be years before there is a definitive answer to the productivity debate. Mr. Brynjolfsson and Mr. Gordon made a “long bet” last year, with the winner determined at the end of 2029.
But studies at the industry and company levels, tapping data that ranges from Census Bureau business surveys to online job listings, show the pattern of technology diffusion and the obstacles.
The leaders are mainly large companies that have been investing in digital technology for years and high-growth younger companies, which are often backed by venture capital. Cloud computing is fairly widely adopted, but the most advanced technologies, like A.I. applications, are not.
The limited uptake, some experts say, is not so surprising at this stage, given that three-quarters of American businesses are small, with fewer than 10 employees.