U.S. labor productivity nearly doubled between 1995 and 2000, but information technology had less to do with it than most observers think.
That's the conclusion of a just-released study of U.S. productivity from the McKinsey Global Institute. Presented at the Agenda 2002 gathering of high-tech executives in Phoenix, the study found that labor productivity grew 2.5 percent annually during that period, versus 1.4 percent a year between 1972 and 1995.
“Our research indicates that IT [information technology] was only one of several factors at work,” the study says. “Innovation (including, but not limited to, IT and its applications), competition, and to a lesser extent cyclical demand factors, were the most important causes. IT investments had a significant impact on productivity in some industries and virtually none in others.”
And productivity growth was not the only thing that nearly doubled during the last half of the past decade. IT expenditures jumped 17 percent a year during the period, nearly double the 9 percent annual growth of the previous 23 years.
That spending gave rise to what the study characterizes as “paradox sectors.” Those are industries, such as hotels and retail banking, that spent heavily on IT but failed to increase productivity.
So what primarily drove the productivity gains in the industries that enjoyed the biggest increases (semiconductors, computer manufacturing, telecom, securities, and retail)? Sometimes it was structural change, such as a superior business model or managerial innovation, according to New York-based McKinsey. Other drivers were cyclical surges in demand, which may not be sustainable, the study says. IT did play a role, though, especially where technology was applied vertically in industry-specific applications, such as bar-code scanners and warehouse management in the retail sector.
“In most cases, however, IT is just one of many tools that creative managers use to redesign core business processes, products or services,” the study concludes.