You’d think, watching companies like Apple break ground on sprawling data centers, that the number of servers powering our untethered lives was on the rise. In a different decade, you might have been right, but not this one. According to a study prepared at the request of The New York Times, the number of servers in use has declined “significantly” since 2005. That’s mostly a consequence of the 2008 financial crisis, says lead researcher Jonathan G. Koomey of Stanford University, though we also can’t discount the effect of more efficient technologies. What’s more, he says, servers worldwide consume less energy than you might have guessed: they accounted for somewhere between 1 and 1.5 percent of global electricity use in 2010. And while Google, the king of cloud computing, has been cagey about revealing just how many servers house its treasure trove of data, the company said it accounted for less than 1 percent of that 1 to 1.5 percent share, or roughly a hundredth of a percent of all the electricity consumed last year.

All told, data centers’ energy consumption has risen 56 percent since 2005, a far cry from the EPA’s 2007 prediction that the figure would double by 2010, with annual costs ballooning to $7.4 billion. Then again, this slower-than-expected growth could well be temporary. Though Koomey can’t say exactly how much to attribute to the financial crisis versus technological advancements, he maintains that, broadly speaking, we’re primarily seeing fallout from the economic slowdown: a stay of execution, of sorts, for those of us rooting for energy conservation.
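
For a rough sense of where that “hundredth of a percent” figure comes from, here’s a quick back-of-the-envelope check. The 1 to 1.5 percent range and Google’s “less than 1 percent of that” claim are the figures quoted above; treating “less than 1 percent” as an upper bound of exactly 1 percent is an assumption made purely for illustration.

```python
# Back-of-the-envelope check of the shares quoted in the report summary above.
# Assumption: "less than 1 percent" is treated as an upper bound of exactly 1%.
data_center_share_low = 0.01    # servers/data centers: ~1% of global electricity (2010)
data_center_share_high = 0.015  # ...up to ~1.5%
google_share_of_that = 0.01     # Google: less than 1% of the data-center slice

google_global_low = google_share_of_that * data_center_share_low
google_global_high = google_share_of_that * data_center_share_high

print(f"Google's slice of global electricity: "
      f"{google_global_low:.4%} to {google_global_high:.4%}")
# Prints roughly 0.0100% to 0.0150% of global electricity use,
# i.e. about a hundredth of a percent, matching the figure quoted above.
```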