Chandana Patnaik, Stratview Research, March 22, 2024
Utilizing waste heat is the latest frontier in data center efficiency
The rise of power-hungry AI hardware underscores the need for waste heat reuse solutions in data centers.
For two decades, data centers have been improving their energy efficiency, continually searching for ways to use less energy, cool more effectively, and lower operating costs. In that ongoing quest, the latest frontier is the capture and reuse of waste heat.
With the growth of artificial intelligence applications and the adoption of liquid cooling, the amount of waste heat in data centers will only increase. That trend underscores the need for waste heat reuse, not only to improve energy efficiency but also to reduce environmental impact and operating costs.
"The heat removal from chips to facilities still poses challenges," said Peter de Bock, Project Director of the ARPA-E COOLERCHIPS project.
Although removing waste heat from facilities is a method, utilizing waste heat for power generation, steam, heating, and even cooling is much more efficient and environmentally friendly. This is a dynamic research field. “
"Facilities can reduce cost burden and carbon footprint by finding ways to utilize waste heat," said Kyle Mannini, who is responsible for all laboratories and mechanical systems at the Amherst College Science Center.
Mixed-use campuses
Brian Rener, mission critical leader at SmithGroup, sees significant sustainability and efficiency opportunities when data centers are located in mixed-use buildings or campus environments. Because data centers require continuous cooling, they discharge large amounts of waste heat from servers and other equipment, heat that can be put to use in various ways to cut overall energy consumption and lower power usage effectiveness (PUE).
"With data center electricity consumption expected to double in the coming years, most data centers don't have the option of simply moving farther north to use cool outside air as their path to sustainability," Rener said. "Instead, they can derive value from waste heat by integrating it into mixed-use environments, campuses, or the broader local community."
Build it smart
In the movie "Field of Dreams," Ray Kinsella (played by Kevin Costner) is told, "If you build it, he will come." He did, in the middle of an Iowa cornfield. That may have worked in the movie, but Rener doesn't think it is the best location for a data center, since such a site makes it hard to extract value from waste heat. His suggestion instead is to site data centers near larger communities so they can become part of a broader energy solution.
Rener cited data from the US Energy Information Administration (EIA) to support his point that space heating is the largest single energy end use in commercial buildings, at 32%. Cooling accounts for only 9%, though that share is much higher in southern states. A large building consumes roughly 1 MW for heating, 25 city blocks require about 10 MW, and 8,500,000 square feet of commercial property needs 100 MW for heating alone.
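To put those figures alongside data center heat output, the rough sketch below estimates how many large buildings a facility's recovered waste heat could in principle keep warm. The IT load and recovery fraction are illustrative assumptions, not figures from the article; only the 1 MW-per-building heating load echoes the EIA-derived numbers above.

```python
# Rough, illustrative estimate of how much commercial heating load a data
# center's waste heat could offset. Inputs are assumptions for this sketch.

def heat_reuse_coverage(it_load_mw: float,
                        recovery_fraction: float,
                        heating_demand_per_building_mw: float = 1.0) -> float:
    """Return the number of large buildings whose heating demand could be met.

    it_load_mw: IT power draw of the data center (nearly all of it ends up as heat).
    recovery_fraction: assumed share of that heat actually captured and delivered,
                       which depends on cooling technology and distance.
    heating_demand_per_building_mw: heating load of one large building
                                    (~1 MW per the figure cited above).
    """
    recoverable_heat_mw = it_load_mw * recovery_fraction
    return recoverable_heat_mw / heating_demand_per_building_mw


# Example: a 10 MW facility recovering 60% of its heat (assumed figure)
# could in principle cover heating for about six large buildings.
print(heat_reuse_coverage(it_load_mw=10.0, recovery_fraction=0.6))
```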
Now consider data center density. High-performance computing (HPC) and AI are pushing rack densities to 100 kW or even higher, and aisles are packed with more powerful processors than ever before.
"All processors are getting bigger and hotter," said Shen Wang, Chief Analyst at Omdia. "Since 2000, the power consumption of processors has increased by 4.6 times."
This undoubtedly creates serious power and cooling problems. But it also opens up an opportunity, because chips and servers are now giving off larger volumes of hotter, higher-grade waste heat.
Cooling down university campuses
The Milwaukee School of Engineering provides a useful case study in waste heat reuse. The school added a computer science building housing a supercomputer named Rosie, an Nvidia GPU-accelerated machine that helps students study AI, drones, robotics, and autonomous vehicles. Although the supercomputer room occupies only 1,500 square feet of the 65,000-square-foot building, it consumes over 60% of its energy, so the system had to be integrated seamlessly into the building's mechanical, electrical, and plumbing infrastructure. The data center is designed for N+1 redundancy, with multiple backup generators and cooling units that keep it running even if a single component fails.
The engineering team established a symbiotic relationship between the academic building's energy systems and the supercomputer's. In summer, for example, the computer room and the teaching spaces share the same cooling system, with the facility's chilled water return line serving as the data center's supply. Because this raises the temperature of the chilled water return, the cooling efficiency of the entire building improves. In winter, when the academic spaces no longer need mechanical cooling, the computing facility uses cold outside air via dedicated air-cooled rooftop condensers and an integrated free-cooling loop.
"The use of waste heat is key to improving data center and building efficiency," said Jamison Caldwell, Chief Mechanical Engineer at Smith Group.
He pointed out that in addition to capturing waste heat from hot aisles, liquid-cooled equipment also radiates heat that can be recovered. He looks at the energy reuse metric, which is calculated from the amount of energy reused and the total energy consumed.
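The metric Caldwell refers to is commonly formalized as energy reuse effectiveness (ERE) and its companion energy reuse factor (ERF). A minimal sketch of these standard definitions, which are assumed here since the article names the concept but not the formulas:

```python
# Minimal sketch of the standard energy-reuse metrics; the definitions below
# follow common industry usage and are assumed, not quoted from the article.

def pue(total_facility_energy: float, it_energy: float) -> float:
    # Power usage effectiveness: total facility energy per unit of IT energy.
    return total_facility_energy / it_energy

def erf(reused_energy: float, total_facility_energy: float) -> float:
    # Energy reuse factor: share of facility energy exported for reuse.
    return reused_energy / total_facility_energy

def ere(total_facility_energy: float, reused_energy: float, it_energy: float) -> float:
    # Energy reuse effectiveness: like PUE, but credits energy reused elsewhere.
    return (total_facility_energy - reused_energy) / it_energy

# Example with illustrative numbers: 12 GWh total, 10 GWh IT, 3 GWh of heat reused.
print(pue(12, 10))     # 1.2
print(erf(3, 12))      # 0.25
print(ere(12, 3, 10))  # 0.9 -- unlike PUE, ERE can drop below 1.0
```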
"There are many opportunities for waste heat at the campus level," said Caldwell. “
NREL's energy recovery loop
The US National Renewable Energy Laboratory (NREL) built its Energy Systems Integration Facility (ESIF) so that the supercomputer-based data center inside it helps meet the heating needs of its laboratories and offices, making the entire building more energy efficient. The data center has achieved a PUE of 1.04, making it one of the most efficient in the world.
"100% of the office heating comes from reused waste heat, and water usage has been cut in half," said Caldwell. "If more heat is suddenly needed, campus steam can supplement it."
This is accomplished with an energy recovery water loop that ties together the campus heating and cooling systems, the supercomputing systems, and traditional IT systems, collecting waste heat from both liquid and air cooling. Caldwell described it as building-level energy exchange.
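For a liquid loop like the one described, the recoverable heat can be estimated from the flow rate and the temperature rise across the cooled equipment. A back-of-the-envelope sketch, with all numbers assumed for illustration rather than taken from NREL:

```python
# Back-of-the-envelope heat recovery for a water loop: Q = m_dot * c_p * dT.
# All numbers are illustrative assumptions, not NREL figures.

WATER_CP = 4186.0      # specific heat of water, J/(kg*K)
WATER_DENSITY = 997.0  # kg/m^3

def recovered_heat_kw(flow_lps: float, delta_t_k: float) -> float:
    """Heat carried by the loop, in kW, for a given flow (liters/second)
    and temperature rise (kelvin) across the cooled equipment."""
    mass_flow_kg_s = flow_lps / 1000.0 * WATER_DENSITY
    return mass_flow_kg_s * WATER_CP * delta_t_k / 1000.0

# Example: 20 L/s of water warmed by 12 K carries roughly 1 MW of reusable heat.
print(recovered_heat_kw(flow_lps=20.0, delta_t_k=12.0))
```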
"Liquid cooling allows for more advanced cycle heat recovery," he said. “
Incremental benefits
Liquid cooling could mark a major shift in data center cooling efficiency, significantly reducing PUE, or at least keeping PUE from climbing as rack densities surge. However, Jason Matteson, Vice President of Customer Solutions Architecture at Iceotope, noted that the potential benefits of liquid cooling can be squandered by various sources of waste, including poor hot-air containment, missing blanking panels, and a failure to replace power-hungry fans.
"Most data centers have a lot of waste that we can eliminate," Matteson said. “