Cell Networks Use Much More Energy Than Data Centers
Wireless network infrastructure, not the data centers it connects to, accounts for about 90 percent of the electricity used when devices tap into the cloud.
For years, people have talked about the electricity consumption of data centers. Some people want to believe, somehow, that Googling is energy intensive. But it's not. Thanks to Koomey's Law, a corollary of sorts to Moore's Law, computation has been getting steadily more energy efficient: the number of computations per kilowatt-hour of electricity has been doubling roughly every 1.5 years for decades. Relative to our society's other technological processes, like heating homes, growing corn, or moving cars and trucks around, computing's energy usage was and is a drop in the bucket. All of Google, all its servers and campuses and everything, requires about 260 megawatts of electricity on a continuous basis, as of 2011. The US has about 1,000 gigawatts of generating capacity, or 1,000,000 megawatts. So, to put it mildly, I am sanguine about the electrical consumption of our computing infrastructure.
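For readers who want the back-of-the-envelope arithmetic behind those claims, here is a minimal sketch in Python. The 260-megawatt and 1,000-gigawatt figures come from the paragraph above, and the 1.5-year doubling period is Koomey's observation; the 10-year horizon is just an illustrative choice.

```python
# Back-of-the-envelope arithmetic for the figures cited above.

GOOGLE_LOAD_MW = 260        # Google's continuous draw, circa 2011
US_CAPACITY_MW = 1_000_000  # ~1,000 GW of US generating capacity

# Google's share of total US electrical capacity.
share = GOOGLE_LOAD_MW / US_CAPACITY_MW
print(f"Google's share of US capacity: {share:.4%}")  # ~0.0260%

# Koomey's observation: computations per kWh double roughly every 1.5 years,
# so efficiency improves by a factor of 2**(years / 1.5).
years = 10
efficiency_gain = 2 ** (years / 1.5)
print(f"Efficiency gain over {years} years: ~{efficiency_gain:.0f}x")  # ~100x
```

In other words, Google's entire operation draws well under a tenth of a percent of US capacity, while the efficiency of the underlying computation improves by roughly two orders of magnitude per decade.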
But, according to a new report from the Centre for Energy Efficient Telecommunications (CEET) at the University of Melbourne, the wireless networks that let our devices tap into those data centers might turn out to be another story.
In a new white paper, the CEET estimates that when we use wireless devices to access cloud services, about 90 percent of the electricity consumed by that end-to-end system is eaten up by the network infrastructure, not by the servers or the phones. The data centers themselves use roughly one-tenth of that amount. Worse, wireless access to cloud services is set to keep exploding, and the electrical load will balloon along with it.
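To make that split concrete, here is a small illustrative sketch. The 90-percent and one-tenth shares are from the CEET estimate as reported above; the total load figure is purely hypothetical, chosen only to put numbers on the proportions.

```python
# Illustrative split of cloud-access energy use based on the CEET estimate.
# The total figure below is hypothetical, used only to make the shares concrete.

total_system_mw = 1000  # hypothetical total draw of the end-to-end system

wireless_network_mw = 0.90 * total_system_mw  # ~90% in the access network
data_center_mw = wireless_network_mw / 10     # data centers: one-tenth of that, ~9%
devices_and_other_mw = total_system_mw - wireless_network_mw - data_center_mw

print(f"Wireless network: {wireless_network_mw:.0f} MW")  # 900 MW
print(f"Data centers:     {data_center_mw:.0f} MW")       # 90 MW
print(f"Devices & other:  {devices_and_other_mw:.0f} MW") # 10 MW
```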