Artificial intelligence (AI) is running one of Google’s data centers — or at least the cooling system in said data center. Today in a blog post, the Mountain View, California company said that it has turned over management of cooling controls to an AI-powered recommender system it jointly developed with DeepMind, its U.K.-based AI research subsidiary. Google claims it’s the first fully autonomous model of its kind.
“We wanted to achieve energy savings with less operator overhead,” Dan Fuenffinger, a data center operator at Google, said. “Automating the system enabled us to implement more granular actions at greater frequency, while making fewer mistakes.”
So how does it work? Every five minutes, Google’s cloud-hosted AI grabs data from the thousands of sensors — including temperature sensors, power meters, and more — in the data center and feeds it into a deep neural network, a type of AI modeled after neurons in the brain. The model takes into account energy consumption and safety constraints before deciding on a course of action, which it delegates to local control systems.
The AI considers billions of potential actions every five minutes, according to Google, and predicts which are most likely to lead to desirable outcomes. (Actions with low confidence aren’t considered.) To prevent the occasional wrong decision from slipping through, it vets instructions against a list of constraints specified by human operators. And as an added precaution, it’s been trained to prioritize “safety and reliability” over performance and cost savings.
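The selection process described above — score candidate actions, discard low-confidence predictions, vet the survivors against operator-specified constraints, and pick the best remaining option — can be sketched in a few lines. This is a minimal, hypothetical illustration: the action fields, confidence threshold, and safety limit are all assumptions, since Google has not published the actual model or its constraint list.

```python
from dataclasses import dataclass
from typing import List, Optional

CONFIDENCE_THRESHOLD = 0.8   # assumed cutoff for "low confidence"; real value not public
MAX_WATER_TEMP_C = 25.0      # assumed operator-specified safety limit

@dataclass
class Action:
    water_temp_c: float        # hypothetical chilled-water setpoint this action proposes
    predicted_savings: float   # model's predicted energy savings (0..1)
    confidence: float          # model's confidence in that prediction (0..1)

def choose_action(candidates: List[Action]) -> Optional[Action]:
    """Drop low-confidence predictions, vet the rest against safety
    constraints, then pick the action with the best predicted outcome."""
    viable = [
        a for a in candidates
        if a.confidence >= CONFIDENCE_THRESHOLD   # low-confidence actions aren't considered
        and a.water_temp_c <= MAX_WATER_TEMP_C    # instructions vetted against operator limits
    ]
    if not viable:
        return None  # nothing passes: defer to local control systems
    return max(viable, key=lambda a: a.predicted_savings)

# The safe, confident 22 °C action wins: the 27 °C option predicts bigger
# savings but violates the safety limit, and the 21 °C option is filtered
# out for low confidence.
candidates = [
    Action(water_temp_c=27.0, predicted_savings=0.35, confidence=0.9),
    Action(water_temp_c=22.0, predicted_savings=0.20, confidence=0.9),
    Action(water_temp_c=21.0, predicted_savings=0.30, confidence=0.5),
]
best = choose_action(candidates)
print(best.water_temp_c)  # 22.0
```

The ordering matters: the system prioritizes “safety and reliability” over cost savings, so the constraint check filters candidates before the savings-maximizing step ever sees them.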
Already, the system has delivered impressive gains. Over a nine-month period, it boosted energy savings from 12 percent to 30 percent, in part by “learning” tricks to manage cooling more efficiently. In the winter, for example, it took advantage of the cold weather to produce “colder than normal” water, which reduced the energy required for cooling within the data center.
Joe Kava, vice president of data centers at Google, told MIT Technology Review that the project could generate “millions of dollars” in energy savings and could help lower carbon emissions.
“We’re excited that our direct AI control system is operating safely and dependably, while consistently delivering energy savings,” Google wrote. “However, data centers are just the beginning. In the long term, we think there’s potential to apply this technology in other industrial settings, and help tackle climate change on an even grander scale.”
It’s not the first time Google’s handed a data center’s reins over to AI. In 2016, it implemented a system developed by DeepMind that provided recommendations to human overseers. In the Mountain View company’s tests, it achieved a 40 percent reduction in the amount of energy used for cooling and a 15 percent reduction in overall power usage effectiveness — the ratio of the total building’s energy usage to its IT energy usage.
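Power usage effectiveness (PUE) follows directly from the definition given above: total facility energy divided by IT energy, where 1.0 is the ideal and everything above it is overhead such as cooling. A quick worked example with hypothetical numbers (not Google's figures):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total building energy over IT energy.
    A value of 1.0 means every watt goes to computing; the excess is
    overhead from cooling, lighting, and power conversion."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,500 kWh total draw for 1,200 kWh of IT load.
print(pue(1500, 1200))  # 1.25
```

On these made-up numbers, a 15 percent reduction in PUE would mean most of that 300 kWh of overhead disappearing — which is why cooling improvements translate so directly into the metric.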
Given that the amount of energy consumed by data centers — which already accounts for 3 percent of global electricity usage and 2 percent of total greenhouse gas emissions — is expected to triple in the next decade, the improvements can’t come fast enough.