Google’s uber-geeky research into neural networks — to make computers work more like human brains — looks a lot smarter.
While most of us expected that research to yield consumer products such as better speech recognition, it turns out Google has applied its neural networks to the challenge of making its vast data centers run as efficiently as possible, cutting unnecessary energy consumption and thus saving money.
The initiative began as a 20 percent project at Google, in which data center engineer Jim Gao started building machine learning models to predict and improve key data center metrics like power usage effectiveness (PUE), according to Joe Kava, vice president of data centers at Google, who wrote about the technology in a blog post today.
Rather than throw artificial neural networks at speech and image recognition, as Google has done before, or at recommendation systems, as web companies such as Netflix have, Google is aiming this experimental style of computing at a genuine IT challenge, one that affects its bottom line. That matters because techniques proven at Google's scale could eventually trickle down to more businesses in the years to come.
Of course, more efficient energy usage also benefits the environment.
The work is already paying off at Google.
“[A] couple months ago we had to take some servers offline for a few days — which would normally make that data center less energy efficient,” Kava wrote. “But we were able to use Jim’s models to change our cooling setup temporarily — reducing the impact of the change on our PUE for that time period. Small tweaks like this, on an ongoing basis, add up to significant savings in both energy and money.”
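In spirit, the kind of "what if" evaluation Kava describes starts with a model that predicts PUE from operating conditions. The sketch below is purely illustrative: Google's actual models are neural networks trained on real facility telemetry, while this stand-in fits a simple linear model by gradient descent on synthetic data, and the features (IT load fraction, outside air temperature) and coefficients are assumptions invented for the example.

```python
import random

random.seed(0)

def synth_sample():
    """Generate one made-up observation of (load, temperature, PUE)."""
    load = random.uniform(0.3, 1.0)     # IT load as a fraction of capacity
    temp_c = random.uniform(5.0, 35.0)  # outside air temperature in Celsius
    # Invented ground truth: PUE worsens at low load and high outside temp.
    pue = 1.05 + 0.15 * (1.0 - load) + 0.004 * temp_c + random.gauss(0, 0.005)
    return load, temp_c, pue

data = [synth_sample() for _ in range(300)]

# Fit pue ~ w0 + w1*load + w2*(temp/40) with plain batch gradient descent.
# Temperature is scaled so all features share a similar range.
w = [0.0, 0.0, 0.0]
lr = 1.0
for _ in range(2000):
    g = [0.0, 0.0, 0.0]
    for load, temp_c, pue in data:
        x = (1.0, load, temp_c / 40.0)
        err = sum(wi * xi for wi, xi in zip(w, x)) - pue
        for i in range(3):
            g[i] += err * x[i]
    w = [wi - lr * gi / len(data) for wi, gi in zip(w, g)]

def predict_pue(load, temp_c):
    """Predicted PUE for a given IT load fraction and outside temp (C)."""
    return w[0] + w[1] * load + w[2] * (temp_c / 40.0)
```

With a model like this in hand, an operator could score a proposed change, say, running at lower IT load while servers are offline, by comparing `predict_pue` before and after, and then adjust the cooling setup to offset the predicted hit.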
Fans of data center energy efficiency and of neural networks alike should enjoy a new white paper from Gao that goes into detail about the work.