Researchers at Penn State Develop AI System to Optimize Data Center Cooling
Researchers from Penn State have unveiled an artificial intelligence system that optimizes data center cooling by leveraging real-time data on weather conditions and electricity prices. The technology is expected to cut energy consumption by around 25% without any equipment upgrades, a significant saving at a time when surging demand for cloud computing and AI is increasing the load on data centers and driving up electricity costs.
According to a report by Interesting Engineering, the development matters because cooling is one of the largest expenses for data centers. The researchers propose replacing traditional fixed cooling settings with adaptive ones that use energy more efficiently.
The system employs software built on a physics-informed artificial intelligence model. It analyzes weather conditions and electricity-price data in real time and recommends cooling adjustments: for instance, it can ramp up cooling when electricity is cheap and reduce the load when prices rise, all while staying within safe operational limits.
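The price-following behavior described above can be illustrated with a minimal sketch. Note that this is not the researchers' model: the price thresholds, setpoint bounds, and function names below are hypothetical, chosen only to show the idea of pre-cooling on cheap power and coasting within a safe band when prices spike.

```python
# Illustrative sketch only: a simplified price-aware cooling setpoint rule.
# All thresholds and bounds are hypothetical, not from the Penn State system.

def choose_setpoint(price_per_kwh: float,
                    low_price: float = 0.08,
                    high_price: float = 0.15,
                    min_setpoint_c: float = 18.0,
                    max_setpoint_c: float = 27.0) -> float:
    """Lower the setpoint (pre-cool) when electricity is cheap and relax it
    when electricity is expensive, never leaving the safe temperature band."""
    if price_per_kwh <= low_price:
        return min_setpoint_c   # cheap power: cool harder now
    if price_per_kwh >= high_price:
        return max_setpoint_c   # expensive power: coast on stored cooling
    # For mid-range prices, interpolate linearly between the two extremes.
    frac = (price_per_kwh - low_price) / (high_price - low_price)
    return min_setpoint_c + frac * (max_setpoint_c - min_setpoint_c)
```

A real controller would of course optimize over forecasts rather than react to the spot price alone, but the clamping to a fixed safe band mirrors the "safe operational parameters" constraint the article describes.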
Wangda Zuo, a professor of architectural engineering, noted that cooling alone accounts for about 40% of a data center's total electricity consumption. Traditional cooling systems typically use fixed temperature setpoints, which can lead to financial losses during spikes in electricity prices.
To train the artificial intelligence, the researchers utilized a digital twin—a virtual replica of a data center that simulates temperature, humidity, and equipment constraints. The model combines engineering rules with machine learning methods, enabling practical and safe decision-making. The system was tested on a simulation of a data center in Houston, where high temperatures and humidity create challenging operating conditions.
Graduate student Viswanathan Ganesh explained that each component of the cooling system has its operational limits, which were taken into account in the model. This consideration enhances cooling efficiency without risking equipment damage and reduces the need for large datasets for training.
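The per-component limits Ganesh describes can be sketched as a simple validity check that a digital twin might run before accepting a recommended action. The component, its limits, and the numbers here are hypothetical examples, not details of the researchers' model.

```python
# Illustrative sketch only: rejecting actions that violate equipment limits.
# The component type, field names, and values are hypothetical.
from dataclasses import dataclass

@dataclass
class ChillerLimits:
    min_flow_lps: float        # minimum safe chilled-water flow (litres/sec)
    max_flow_lps: float        # maximum rated flow
    max_return_temp_c: float   # highest allowable return-water temperature

def is_action_safe(flow_lps: float,
                   predicted_return_temp_c: float,
                   limits: ChillerLimits) -> bool:
    """Accept a proposed operating point only if every quantity stays
    inside the component's rated envelope."""
    return (limits.min_flow_lps <= flow_lps <= limits.max_flow_lps
            and predicted_return_temp_c <= limits.max_return_temp_c)

limits = ChillerLimits(min_flow_lps=5.0, max_flow_lps=40.0,
                       max_return_temp_c=16.0)
```

Encoding such hard limits directly, rather than hoping a model learns them from data, is one way a physics-informed approach can both protect equipment and reduce the amount of training data needed, as the article notes.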
This development can also be applied in the cryptocurrency mining sector, which requires significant computational resources and operates continuously. Optimizing cooling according to weather conditions and electricity prices can substantially enhance the profitability of such operations.
The researchers emphasize that their software solution could serve as a cost-effective alternative to infrastructure upgrades, such as transitioning to liquid cooling systems. The results of this research will be presented at the IEEE ITherm conference scheduled for May this year.