In essence, the system has been created to respond to the demand placed on it and to reduce the amount of electricity used whenever it is possible to do so.
Suleyman's Dougal team – a division of DeepMind that builds projects for direct use within Google – created the algorithms using deep neural networks. Networks of this type aim to mimic the functioning of the brain and have been used in everything from creating an artificial Donald Trump to tackling serious diseases such as Alzheimer's.
The DeepMind team took five years' worth of data gathered by the data centres and built a model predicting how much energy a centre would need based on its likely server usage. Each neural network was fed data on temperatures, power usage, pump speeds and more.
Using these large data sets, the machine-learning system could be "trained", retaining more examples of how the centres work than a human ever could.
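The approach described above – feeding sensor readings into a neural network that predicts energy demand – can be sketched in a few lines. This is a hypothetical illustration, not DeepMind's actual model: the features (temperature, power usage, pump speed), the synthetic data and the tiny network architecture are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (an assumption for illustration): columns are
# temperature, power usage and pump speed, normalised to [0, 1]; the target
# is a made-up cooling-energy figure derived from those readings.
X = rng.random((500, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]).reshape(-1, 1)

# A tiny feed-forward network: one hidden layer with a ReLU non-linearity.
W1 = rng.normal(0, 0.1, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(2000):
    h = np.maximum(0, X @ W1 + b1)   # hidden-layer activations
    pred = h @ W2 + b2               # predicted cooling energy
    err = pred - y
    # Backpropagate the mean-squared-error gradient through both layers.
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((pred - y) ** 2))
print(f"final training mean-squared error: {mse:.5f}")
```

In practice a system like the one Google describes would train far larger networks on real telemetry, but the principle is the same: the more historical examples the model sees, the better its predictions of demand become.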
"Conventionally, a human manually tweaks a lot of the knobs that control the operation of the data centre," Suleyman explained. "There's obviously a lot of variation in performance across all the data centres because each human performs quite differently."
Google said that in recent months, when controlling a live data centre, the AI was able to "consistently achieve a 40 per cent reduction in the amount of energy used for cooling".
The algorithms were created as a general learning artificial intelligence, meaning it may be possible to apply them to other scenarios. "There's lots of other applications outside of Google," Suleyman said.
"We think there's lots of potential to apply this to large scale energy distribution, so we're giving it some thought and are in early discussions with a number of people on that."