Cell Networks Are Energy Hogs

[Chart: annual cloud energy consumption]

For years, people have talked about the electricity consumption of data centers. Some people want to believe, somehow, that Googling is energy intensive. But it's not. Thanks to Koomey's Corollary to Moore's Law, computation has been getting more energy efficient: the number of computations per kilowatt-hour of electricity has been doubling every 1.5 years for decades. Relative to our society's other technological processes -- heating homes or growing corn or ground transportation -- computing's energy usage was and is a drop in the bucket. As of 2011, all of Google -- all its servers and campuses and everything -- required about 260 megawatts of electricity on a continuous basis. The US has about 1,000 gigawatts of generating capacity, or 1,000,000 megawatts, so Google's draw amounts to roughly 0.026 percent of it. To put it mildly, then, I am sanguine about the electrical consumption of our computing infrastructure.
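To put that share in perspective, here is a back-of-envelope sketch. The 260-megawatt and 1,000-gigawatt figures are the ones cited above; the 1.5-year doubling period is Koomey's, and the ten-year horizon is just an illustration:

```python
# Figures cited above.
google_load_mw = 260          # Google's continuous draw, 2011
us_capacity_mw = 1_000_000    # ~1,000 GW of US generating capacity

share = google_load_mw / us_capacity_mw
print(f"Google's share of US capacity: {share:.4%}")  # 0.0260%

# Koomey's law: computations per kWh double every ~1.5 years.
doubling_period_years = 1.5
years = 10
gain = 2 ** (years / doubling_period_years)
print(f"Efficiency gain over {years} years: ~{gain:.0f}x")  # roughly 100x
```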

But, according to a new report from the University of Melbourne's Centre for Energy Efficient Telecommunications, the wireless networks that let our devices tap into those data centers might turn out to be another story.

In a new white paper, the CEET estimates that when we use wireless devices to access cloud services, 90 percent of that system's electrical consumption is eaten up by the network's infrastructure, not the servers or phones. The data centers themselves use about one-tenth as much electricity as the network. Worse, wireless access to cloud services will continue to explode, and the electrical load will balloon with it. By 2015, the CEET estimates, this system could eat up between 32 and 43 million megawatt-hours; in 2012, the figure was only about 9 million megawatt-hours.
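For a sense of how steep that projected climb is, here is a minimal sketch using the report's figures. The 2012 baseline and the 2015 range are the numbers above; the assumption of smooth compound growth is mine:

```python
# CEET figures cited above, in millions of megawatt-hours.
consumption_2012 = 9
low_2015, high_2015 = 32, 43
years = 3

for label, projection in (("low", low_2015), ("high", high_2015)):
    factor = projection / consumption_2012
    annual = factor ** (1 / years) - 1  # implied compound annual growth
    print(f"{label}: {factor:.1f}x overall, ~{annual:.0%} per year")
# low: 3.6x overall, ~53% per year
# high: 4.8x overall, ~68% per year
```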

Worse still, while computing has been getting more efficient at a fast and predictable clip, it's harder to tell whether the wireless access systems that the report fingers as the problem will advance as rapidly. They certainly have not advanced as predictably as computing along other metrics (reliability or bandwidth, say).

What portion of the system is using the bulk of the energy? It turns out to be the 4G LTE links. Looking at 2010 data, "the energy consumption of a 4G LTE wireless access link ranges between 328 micro-Joules per bit and around 615 micro-Joules per bit," according to this paper on the energy efficiency of cellular networks. Those numbers are improving by about 26 percent a year, the CEET writes. Given that range, I don't want to calculate too precisely, but roughly, downloading a megabyte of data over an LTE connection would use a few kilojoules of electricity -- somewhere between 2.6 and 4.9, per the sketch below.
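Here is the rough arithmetic behind that estimate. The per-bit figures and the 26 percent annual improvement come from the sources above; treating a megabyte as a decimal million bytes is my simplification:

```python
# Per-bit energy cost of a 4G LTE access link (2010 figures cited above).
low_uj_per_bit, high_uj_per_bit = 328, 615

bits_per_megabyte = 8 * 10**6  # assuming a decimal megabyte

for uj_per_bit in (low_uj_per_bit, high_uj_per_bit):
    joules = uj_per_bit * 1e-6 * bits_per_megabyte
    print(f"{uj_per_bit} uJ/bit -> {joules / 1000:.1f} kJ per megabyte")
# 328 uJ/bit -> 2.6 kJ per megabyte
# 615 uJ/bit -> 4.9 kJ per megabyte

# If efficiency improves ~26% a year, the per-bit cost falls to
# (1 - 0.26)**n of the original after n years.
print(f"After 5 years: {(1 - 0.26) ** 5:.0%} of the 2010 cost")  # 22%
```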

That's still not a lot, but imagine multiplying by the scale of YouTube or by the number of wireless Internet users in China. The numbers, as the CEET discovered, can get very big very fast.
