The good news: U.S. IT workers are, on average, putting in fewer hours. The bad news: those who don't get laid off are churning out more work than ever, and in the long run that will lead to burnout among many IT workers.
In its 2002 World Wide Benchmark Report, Meta Group found that of 1,100 companies surveyed since the start of this year, 40 percent are cutting IT budgets by more than 20 percent. The budget cuts will lead to a 53 percent reduction in staff among that group. Because of the budget cutting, fewer workers are left to do more work. Indeed, the study found that U.S. development productivity, measured in KLOCs (thousands of lines of code), has increased to 6.36 per professional per year, up from 6.22 last year.
That extra productivity is being squeezed into fewer hours. The study found that the average IT professional in the United States works 2,080 hours per year, down from 2,157 last year. The average worldwide IT worker now logs 1,992 hours per year, compared with 2,151 last year.
Part of what accounts for the productivity spike, according to Meta spokeswoman Samantha Finnegan, is the use of tools such as Visual Basic, HTML and JavaScript. "[JavaScript, for example,] lets IT professionals work faster," said Finnegan, in Stamford, Conn. "It doesn't take as long as regular Java for workers to input more script."
One consolation is that many IT professionals are getting paid more this year than last: Compensation is up 9 percent in the United States, with the greatest increases going to project leaders, business analysts and metric specialists. Worldwide, enterprises are having the toughest time filling the positions of system analysts and designers.
The survey found that 10.5 percent of IT budgets are now spent on outsourcing, down from 12.5 percent last year. Of that outsourcing spending, 14.8 percent goes to data centers, 45.1 percent to applications development and 13.8 percent to help desks.