DeepMind AI Program Increases Google Data Center Cooling Power Usage Efficiency by 40%

Alphabet's DeepMind division reports that it improved the overall power usage efficiency (PUE) of Google's data centers by 15 percent after putting an AI program, similar to the one DeepMind taught to play Atari games, in charge of a data center's control systems. DeepMind and Google's data center engineers report that the program consistently cut the energy used by the cooling systems by up to 40 percent and achieved the lowest PUE the site had ever seen.
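To put the two figures in context: PUE is the ratio of total facility energy to IT equipment energy, so a cut in cooling energy only shrinks the non-IT overhead part of that ratio. The sketch below works through the arithmetic with purely hypothetical numbers (Google's actual energy breakdown is not public), chosen so that a 40 percent cooling reduction translates into roughly a 15 percent smaller PUE overhead.

```python
# Hypothetical illustration of how a cooling-energy cut moves PUE.
# These numbers are invented for the example; Google's real breakdown is not public.

it_energy = 100.0        # energy used by IT equipment (arbitrary units)
cooling_energy = 15.0    # cooling portion of the facility overhead
other_overhead = 25.0    # lighting, power distribution losses, etc.

def pue(it, cooling, other):
    """PUE = total facility energy / IT equipment energy."""
    return (it + cooling + other) / it

before = pue(it_energy, cooling_energy, other_overhead)
after = pue(it_energy, cooling_energy * 0.60, other_overhead)  # 40% less cooling energy

overhead_reduction = ((before - 1) - (after - 1)) / (before - 1)

print(f"PUE before: {before:.2f}")                          # 1.40
print(f"PUE after:  {after:.2f}")                           # 1.34
print(f"PUE overhead reduced by {overhead_reduction:.0%}")  # 15%
```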

DeepMind co-founder Demis Hassabis noted that the change not only saves money but also reduces the environmental impact of Google's data centers. Google reportedly used 4,402,836 MWh of electricity in 2014, the equivalent of 366,903 U.S. family homes, according to Google Green. Google has also provided a carbon footprint estimate for serving an active user, which it defines as:

someone who does 25 searches and watches 60 minutes of YouTube a day, has a Gmail account and uses our other services, for whom Google emits about 8 grams of CO2 per day to serve. In other words, serving a Google user for a month is like driving a car one mile.

According to an initial report, the cost savings could amount to hundreds of millions of dollars over several years and could partly, if not completely, pay for the £400 million (more than $600 million) DeepMind acquisition. The gains should also reduce Google's data center carbon footprint per user. On how the program achieved the efficiency gains, DeepMind research engineer Rich Evans and Google data center engineer Jim Gao stated that they:

accomplished this by taking the historical data that had already been collected by thousands of sensors within the data centre, data such as temperatures, power, pump speeds, setpoints, etc. and using it to train an ensemble of deep neural networks... then trained the neural networks on the average future PUE, which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data centre over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints.
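The passage describes three model ensembles: one predicting average future PUE and two predicting the data center's future temperature and pressure. A rough sketch of that idea is shown below, using scikit-learn's MLPRegressor as a stand-in for DeepMind's unpublished network architecture and randomly generated data in place of the real sensor logs; everything here is an illustrative assumption, not Google's actual pipeline.

```python
# Illustrative sketch only: small neural-network ensembles trained on fake "sensor
# history" to predict future PUE, temperature, and pressure. DeepMind's actual
# models, features, and training setup are not public.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Fake historical sensor data: temperatures, power draw, pump speeds, setpoints, ...
n_samples, n_sensors = 5000, 20   # arbitrary sizes for the example
X = rng.normal(size=(n_samples, n_sensors))

# Fake targets standing in for the logged outcomes.
y_pue = 1.1 + 0.3 * rng.random(n_samples)        # average future PUE
y_temp = 18 + 5 * rng.random(n_samples)          # future temperature (deg C)
y_pressure = 1.0 + 0.1 * rng.random(n_samples)   # future pressure (bar)

def train_ensemble(X, y, n_models=5):
    """Train several small neural nets on the same data; predictions get averaged."""
    models = []
    for seed in range(n_models):
        net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, random_state=seed)
        net.fit(X, y)
        models.append(net)
    return models

def ensemble_predict(models, X):
    return np.mean([m.predict(X) for m in models], axis=0)

pue_ensemble = train_ensemble(X, y_pue)
temp_ensemble = train_ensemble(X, y_temp)
pressure_ensemble = train_ensemble(X, y_pressure)

# Predict outcomes for the current sensor readings (one new row of features).
current = rng.normal(size=(1, n_sensors))
print("predicted PUE:", ensemble_predict(pue_ensemble, current))
print("predicted temperature:", ensemble_predict(temp_ensemble, current))
print("predicted pressure:", ensemble_predict(pressure_ensemble, current))
```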

Individual data center characteristics such as local climate and weather, each center's unique site-specific architecture, and the interplay between different systems throughout the day had previously made it impossible to create a universal equation for optimizing PUE. With the deep learning, convolutional neural network approach, no single equation is needed: the program learns in a game-like manner from the sensor inputs it is fed and a reference to ideal outcomes. The engineers demonstrated how the site's PUE changed when the program was turned on and off. Hassabis said the effort had revealed gaps in their data center data capture, and that additional sensors would be deployed to increase efficiency further.
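In control terms, the game-like loop described above amounts to scoring candidate control settings with the PUE model and rejecting any candidate whose predicted temperature or pressure would breach an operating constraint. A hedged sketch of such a loop, reusing the hypothetical ensembles and helper functions from the previous example (the limits and candidates are likewise invented), could look like this:

```python
# Illustrative control loop: choose the candidate action with the lowest predicted
# PUE that keeps predicted temperature and pressure inside safe operating limits.
# Assumes ensemble_predict, the three *_ensemble objects, rng, and n_sensors from
# the previous sketch are in scope. Constraint values here are made up.
TEMP_LIMIT_C = 27.0
PRESSURE_LIMIT_BAR = 1.15

def recommend_action(candidate_actions):
    """Pick the most efficient candidate that stays within operating constraints.

    Each candidate is a feature row encoding the current sensor state plus one
    possible setting of the controllable knobs (setpoints, pump speeds, ...).
    """
    best_action, best_pue = None, float("inf")
    for action in candidate_actions:
        row = action.reshape(1, -1)
        predicted_temp = ensemble_predict(temp_ensemble, row)[0]
        predicted_pressure = ensemble_predict(pressure_ensemble, row)[0]

        # Simulate the consequences of the recommendation before accepting it.
        if predicted_temp > TEMP_LIMIT_C or predicted_pressure > PRESSURE_LIMIT_BAR:
            continue  # would breach an operating constraint

        predicted_pue = ensemble_predict(pue_ensemble, row)[0]
        if predicted_pue < best_pue:
            best_action, best_pue = action, predicted_pue
    return best_action, best_pue

# Example usage with ten random candidate settings.
candidates = [rng.normal(size=n_sensors) for _ in range(10)]
action, pue_estimate = recommend_action(candidates)
print("best predicted PUE:", pue_estimate)
```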

According to DeepMind, the same technology could potentially be used to improve power plant conversion efficiency, reduce semiconductor manufacturing energy and water usage, or help manufacturing facilities increase throughput.
