More than a third of enterprise datacenters expect to deploy liquid cooling by 2026

(2024/04/22)


Survey As CPUs and GPUs grow ever denser and more power-hungry, many, including Register readers, expect liquid cooling to play a larger role in enterprise datacenters over the next few years.

More than a third of enterprises (38.3 percent) expect to employ some form of liquid cooling infrastructure in their datacenters by 2026, up from just 20.1 percent as of early 2024, according to a survey of 812 IT professionals conducted by The Register this spring.

Liquid cooling isn't just for HPC and AI

Today, liquid cooling remains a niche technology, with the majority of respondents seeing it as most beneficial for high-performance computing (64.4 percent), followed closely by dense server configurations (60.6 percent), and, to a lesser extent, artificial intelligence workloads (46.2 percent).

This makes sense as liquid cooling has traditionally been employed in densely packed supercomputing cabinets from the likes of Eviden, HPE Cray, and Lenovo. These systems are complex and rely on large coolant distribution units, chillers, and facility water systems. By comparison, most AI systems up until recently have been air-cooled.

As we saw at GTC, this trend could soon change. While Nvidia's HGX B100 and HGX B200 systems will still be available in air-cooled form factors, its most powerful accelerators, like the 2,700-watt Grace-Blackwell Superchip, will [1]require liquid cooling.

Despite the hype, adding AI capabilities was far from enterprises' highest priority. Roughly 58 percent of respondents said improving facility security was their biggest pain point, while reducing energy consumption, increasing utilization of existing hardware, and acquiring higher-performance systems all ranked above AI capabilities, which came in at 27 percent.

Liquid cooling isn't limited to AI and HPC systems. It also happens to be much more efficient at removing waste heat than air. As we've previously [5]discussed, 15-20 percent of system power consumption can be directly attributed to the fans used to move air through these systems. Transitioning to liquid cooling, depending on the technology involved, largely eliminates the need for high-RPM fans, reducing power consumption considerably.
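
For a rough sense of what that could mean per rack, here is a minimal back-of-the-envelope sketch in Python. The rack power, fan share, and residual fan load figures are assumptions chosen purely for illustration, not numbers from the survey.

    # Back-of-the-envelope estimate of fan-related power savings when moving
    # from air to liquid cooling. All figures are illustrative assumptions.
    rack_power_kw = 40.0        # assumed draw of a fully loaded air-cooled rack
    fan_share = 0.175           # midpoint of the 15-20 percent fan overhead cited above
    residual_fan_share = 0.03   # assumed small remaining fan load after a DTC retrofit

    fan_power_kw = rack_power_kw * fan_share
    saved_kw = rack_power_kw * (fan_share - residual_fan_share)

    print(f"Fan power today:  {fan_power_kw:.1f} kW per rack")
    print(f"Estimated saving: {saved_kw:.1f} kW per rack, "
          f"about {saved_kw * 24 * 365:,.0f} kWh per year")

On those assumptions, fans account for roughly 7 kW of a 40 kW rack, and most of that load disappears once the airflow requirement does.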

Combined with the opportunistic boost algorithms found on most modern processors, liquid-cooled systems should in theory be capable of achieving higher clock speeds than their air-cooled siblings.

Enterprises still undecided on direct to chip vs immersion cooling

Despite the advantages liquid cooling offers, readers remain split as to which version of the technology they will ultimately deploy.

[6] By 2026, more than a third of respondents expect to employ some form of liquid cooling in their datacenters

By 2026, 16.3 percent said they were going all in on direct-to-chip (DTC) liquid cooling, which replaces heat sinks with cold plates through which warm or chilled water or coolants are pumped. By comparison, 6.5 percent said they planned to go 100 percent immersion cooling. This technology involves submerging the entire system in either single-phase fluids, like synthetic oils, or two-phase fluids engineered to boil at or around the chips' operating temperature.

About a sixth of respondents said they planned to use a mix of DTC and immersion cooling in their datacenters, while 61.7 percent said they had no plan to utilize either technology in the next two years.

Unsurprisingly, the largest enterprises expect to adopt liquid cooling the fastest. This could be down to a couple of factors, ranging from larger budgets for AI deployments to limited datacenter space necessitating denser rack configurations.

Most enterprises probably don't need liquid cooling just yet

[8] The majority of those surveyed have rack power densities below what is generally considered necessary for liquid cooling

Speaking of current rack power trends, it's not hard to see why so many enterprises are sticking with air cooling in the near term. About 87 percent of respondents reported rack densities of 50 kW or lower. That's the upper end of what Digital Realty CTO Chris Sharp [9]told our sibling site The Next Platform its facilities could support without resorting to rear-door heat exchangers (RDHx) or DTC cooling.

Practically speaking, we [10]tend to see RDHx – essentially rack-sized radiators used to chill hot air exiting servers down to acceptable levels – used in racks exceeding around 40 kW. For reference, that's roughly the load expected for a stack of four DGX H100 systems.
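
For context, the arithmetic behind that ballpark is simple. The sketch below assumes Nvidia's quoted maximum of roughly 10.2 kW per DGX H100 and ignores top-of-rack switches, power-supply losses, and other overheads.

    # Rough rack-density arithmetic for a stack of DGX H100 systems.
    # The ~10.2 kW per-system figure is Nvidia's quoted maximum draw.
    dgx_h100_kw = 10.2
    systems_per_rack = 4

    rack_density_kw = dgx_h100_kw * systems_per_rack
    print(f"{systems_per_rack} x DGX H100 ~= {rack_density_kw:.1f} kW per rack")
    # ~= 40.8 kW, right around the point where RDHx units start appearing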

Just 6.7 percent of respondents said their average rack power was between 51 kW and 100 kW, while 6.3 percent said it exceeded 100 kW. We've seen some larger RDHx systems that can handle air-cooled racks up to around 90 kW of thermal dissipation.

[11] The survey found that as enterprises grow larger they trend toward denser rack configurations

Here, again, there aren't many surprises with enterprises trending toward higher rack densities the larger they get.

[12]Supermicro CEO predicts 20 percent of datacenters will adopt liquid cooling

[13]Liquid cooling specialist snags Microsoft datacenter wizard as advisor

[14]What Nvidia's Blackwell efficiency gains mean for DC operators

[15]How thermal management is changing in the age of the kilowatt chip

Cost and reliability remain key concerns

But while readers expect to deploy liquid cooling more broadly across their infrastructure over the next few years, there remain challenges and concerns regarding adoption.

[16] Maintenance and complexity, followed by the cost of implementation, were two of the biggest barriers to liquid cooling adoption cited by survey respondents

Among their top concerns were maintenance, complexity, and the initial cost of implementation. Liquid-cooled systems require additional resources: facility water, CDUs, and, in the case of DTC, rack manifolds to distribute the coolant to the individual systems.

Existing datacenters can be retrofitted to support liquid cooling using in-aisle coolant reservoirs and liquid-to-air CDUs. However, the cooling capacity is generally lower with these kinds of approaches.

Following cost and complexity, 48.6 percent of respondents cited a lack of experience with the technology, and 41 percent expressed fears over leaks and spills. Finally, 21.4 percent cited the cost of buying and replacing coolant as a potential challenge.

While liquid cooling does introduce additional complexity and points of failure, the technology is by no means new, having been used for decades in supercomputers, HPC-centric clusters and render farms, and more recently large-scale GPU and accelerator farms for training generative AI.

Increased interest in liquid cooling has given rise to preventative measures like negative pressure coolant loops designed to [18]minimize spills in the case of a leak, and in-rack CDUs with redundant or modular pumps. ®

[1] https://www.theregister.com/2024/03/18/nvidia_turns_up_the_ai/?td=rt-9c

[5] https://www.theregister.com/2023/12/26/thermal_management_is_changing/

[6] https://regmedia.co.uk/2024/04/19/lc_survey_cooling_mix.png

[8] https://regmedia.co.uk/2024/04/19/lc_survey_rack_power.jpg

[9] https://www.nextplatform.com/2023/07/31/where-to-park-your-ai-cluster-is-as-important-as-procuring-it/

[10] https://www.theregister.com/2024/04/16/amd_tensorwave_mi300x/

[11] https://regmedia.co.uk/2024/04/19/lc_survey_rack_power_2.png

[12] https://www.theregister.com/2023/09/29/supermicro_30th_birthday_liquid_cooling_prediction/

[13] https://www.theregister.com/2024/04/05/iceotope_microsoft_exec_advisor/

[14] https://www.theregister.com/2024/03/27/nvidia_blackwell_efficiency/

[15] https://www.theregister.com/2023/12/26/thermal_management_is_changing/

[16] https://regmedia.co.uk/2024/04/19/liquid_cooling_challenges_survey.png

[18] https://chilldyne.com/negative-pressure-liquid-cooling/
