Sustainable Energy Sources to Power Cloud Computing
Frankly, the title of this article is somewhat optimistic, considering that what I am about to write about is merely a study on the subject. However, given the big names behind this research project, it is not too much of a leap of faith to expect some concrete directions toward the use of sustainable energy sources to power cloud computing.
Exploration of the relationship between energy consumption and cloud computing is not new on this website. Earlier, I had written on opposing studies that have feted and vilified cloud computing with regard to its environmental friendliness (See: How Green Is Cloud Computing? and Environmental Challenges to Cloud Computing). I had also discussed not only saving money on energy through cloud computing (See: Saving Money on Energy by Going on the Cloud) but actually earning money through the process (See: Harnessing Data Center Heat to Warm Houses). Today’s article explores the possibility of marrying renewable energy sources with cloud computing.
Semiconductor company Advanced Micro Devices (AMD), IT giant Hewlett Packard (HP), the New York State Energy Research and Development Authority (NYSERDA) and Clarkson University have come together to develop mechanisms for more efficient use of renewable energy to power cloud computing. Other corporate partners include GE Global Research Center, Ioxus, AWS TruePower, Vento Tek, Timbre, Intertek, WindE Systems, and Ballard Power Systems. While wind power is the initial focus of the research, solar power may also be explored at a later date.
The logic behind this initiative is simple: since data on the cloud can be easily moved around, it makes sense to shift it to wherever electricity supply is ideal, or in this case, wherever maximum wind power is being produced but local consumption is low. Ultimately, the researchers hope to design a network of renewable-energy-driven Performance Optimized Data Center (POD) systems located at sites with optimum exposure to high winds.
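As a toy illustration of that load-shifting logic, a scheduler could simply route workloads to whichever POD site currently has the largest wind-power surplus. This is only a sketch of the idea, not the researchers' actual design; all site names and megawatt figures below are invented for illustration.

```python
def pick_site(sites):
    """Return the name of the site with the largest wind surplus
    (MW generated minus MW already consumed locally), or None if
    no site has any surplus to offer."""
    best_name, best_surplus = None, 0.0
    for name, (generated_mw, consumed_mw) in sites.items():
        surplus = generated_mw - consumed_mw
        if surplus > best_surplus:
            best_name, best_surplus = name, surplus
    return best_name

# Three hypothetical wind-powered POD sites: (generation, local load) in MW.
sites = {
    "plains_pod":  (3.0, 2.4),  # 0.6 MW surplus
    "coastal_pod": (2.5, 1.2),  # 1.3 MW surplus -> chosen
    "upstate_pod": (1.0, 1.1),  # running at a deficit
}
print(pick_site(sites))  # -> coastal_pod
```

In practice the decision would also have to weigh network latency and the cost of moving the data, which is exactly the kind of trade-off the study sets out to examine.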
“The distributed computing model of the cloud parallels the distributed power-generation model of solar and wind energy. Directing power to data centers from these emerging renewable energy resources without relying on a large-scale, traditional electrical grid is a key challenge,” said Alan Lee, corporate vice president of Research and Advanced Development, AMD. “One ultimate goal is the co-location of dynamic energy sources with dynamic computing resources to improve the economics, performance, and environmental benefits of both infrastructures.”
“If successful and deployed on a larger scale, this project could bring significant energy savings to an industry that can consume 1 MW of electricity at times of peak operation,” NYSERDA said in a statement. One megawatt is actually the power consumed by a single average-sized data center; one can therefore imagine the total power consumed by the entire industry, which has been estimated to be as much as 2.2% of total electricity consumption throughout the country.
This idea actually throws up several interesting possibilities. In recent times, companies have based their data center location decisions mainly on the cost of infrastructure, specifically electricity and bandwidth. That is why Seattle has been a hotbed of recent development in this industry. However, if this idea works, subject to the availability of cheap wind-generated electricity and acceptable latency in data transfers, the windiest locations in the US may well steal a march on Seattle and its ilk.
By Sourya Biswas