Recent research has found that data centers are not burning energy as rapidly as people feared. Growth is not out of control right now, but this may be only a temporary stabilisation before a new period of growth. This time around, before any surge hits, we need to be prepared: we need to understand whether this energy usage is a problem and, if so, how best to deal with it.
First, the good news. Ten years ago, a report from Lawrence Berkeley National Laboratory (LBNL) warned that data center power usage was out of control. Social media and other data were growing very rapidly at the time, and it was feared that the facilities serving these demands would have to keep expanding and consuming more energy. That report is still widely quoted and still causing concern: we know that the amount of data stored and pumped around data centers continues to grow, so we all assumed that energy use must surely be expanding without limit. It turns out that isn't true.
Last year, an updated report from LBNL found that energy use by US data centers (and, by implication, others around the world) has actually levelled off. The computing delivered is still growing rapidly. If anything, it is accelerating: as well as social media, there are massive expansions in the data consumed by Internet of Things (IoT) devices and in the data sets used for big data. The emerging fields of AI and Deep Learning are also expected to start demanding massive amounts of data in the next few years.
Amazingly, while the data delivered keeps growing, the data centers are operating much more efficiently. They are delivering thousands of times more computing power for every watt of electrical power they consume. That’s due to several combined effects.
Firstly, data centers are wasting less energy on cooling. The industry came up with a metric, PUE (power usage effectiveness), and efforts to drive it closer to 1.0 have slashed the energy used in most new data centers. Secondly, data centers are now more virtualised: processors and storage can be pooled, there is less stranded capacity, and facilities which used to run at about 10 percent utilisation can now get much closer to 100 percent. Thirdly and finally, Moore's Law, the improvement in processor power as electronic circuits shrink, means that the servers themselves deliver more computation per watt.
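For concreteness, PUE is simply the total power drawn by a facility divided by the power that actually reaches the IT equipment, so a perfectly efficient facility scores 1.0. A minimal sketch (the kilowatt figures here are hypothetical, purely for illustration):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by
    the power delivered to IT equipment. 1.0 is the ideal; everything
    above it is overhead such as cooling and power distribution."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,500 kW total draw, 1,000 kW reaching servers.
print(pue(1500.0, 1000.0))  # 1.5: a third of the power is overhead

# A newer facility with better cooling might draw only 1,100 kW total.
print(pue(1100.0, 1000.0))  # 1.1
```

Driving PUE down attacks only the overhead term; once it approaches 1.0, the remaining gains have to come from the IT load itself.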
This is all good, but experts are starting to point out that all three effects have a limit. You can't have a PUE better than 1.0, you can't get more than 100 percent utilisation from your IT resources, and Moore's Law hits a wall when the size of atoms starts to impinge on miniaturising components.
This point was made eloquently by Dale Sartor of LBNL earlier this year at Energy Smart, a summit convened by Data Center Dynamics in San Francisco, which brought together academics and executives from the power and data center industries.
Don Paul of the Energy Institute at the University of Southern California explained that demand is insatiable. Because of these vast improvements in efficiency, the increase in computing power has been delivered "for free". There is no perceived cost to using a web service on a smartphone, so there is no limit to how much people will use it. (This is Jevons' Paradox: making a resource more efficient can actually boost total consumption, because demand grows.)
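The paradox is easy to see with a toy model: treat demand as growing with some power of efficiency, then energy used is demand divided by efficiency. All the numbers and the elasticity parameter below are invented for illustration, not drawn from the summit:

```python
def energy_used(efficiency: float, elasticity: float,
                base_demand: float = 100.0) -> float:
    """Toy Jevons model: demand grows as efficiency ** elasticity,
    and energy consumed is demand divided by efficiency."""
    demand = base_demand * efficiency ** elasticity
    return demand / efficiency

# If demand responds weakly (elasticity < 1), doubling efficiency
# cuts energy use from 100 units to about 71.
print(energy_used(efficiency=2.0, elasticity=0.5))

# If demand responds strongly (elasticity > 1), doubling efficiency
# pushes energy use UP, from 100 units to about 141.
print(energy_used(efficiency=2.0, elasticity=1.5))
```

Whether data center demand sits above or below that threshold is exactly the open question the summit was wrestling with.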
Data center operators have done a fantastic job delivering the services people are using. Such a good job, in fact, that the Internet industries are still less power-hungry than many other sectors. But if demand growth doesn't change, this will not last. As Paul told the summit, "we always consume our efficiency gains", so eventually power usage will rise again.
The only way to break this cycle would be to modify demand. This will be different for social media applications than for the automated systems in the IoT and AI spaces. Cutting social media demand would have to be a co-operative effort, because there is no global authority, no power on earth, which can force users to post fewer updates to Facebook or fewer cat videos to YouTube.
The summit considered whether it might help to make energy usage visible on smartphones, and to "gamify" the system so that users compete to reduce their energy consumption. Cooling down the other sources of demand growth would be a different story: as IoT and AI are not consumer applications, other approaches can work there.
I have a suspicion that a lot of automated data-based services actually gather far more data than is useful, just in case it might prove significant, and because gathering more is easier than making a decision. I'd like to see a code of practice that suggests ways to limit the data gathered to a useful level, saving time and energy.
Note: You can read more of Peter's blogs at Datacenter Dynamics here.