Liquid Cooling in Computer Systems
Liquid cooling uses fluid to absorb and transfer heat away from electronic devices. Computer processors, or chips, produce significant heat during operation, and removing this heat effectively is critical to their reliable operation. While most processors today are air-cooled, rising heat density is driving mainstream adoption of liquid cooling. The current generation of AI processors, including GPUs, CPUs and ASICs, all require liquid cooling to operate today.
Entering the Mainstream
Since the invention of the integrated circuit in the late 1950s, each new generation of processors has delivered its gains largely through increased transistor density. That greater circuit density translates into greater power density. Although liquid cooling was first introduced in the 1960s to manage heat in mainframe computers, advances in microprocessor, packaging and systems design largely circumvented the need for it until now.
Liquid cooling is increasingly critical to the reliable operation of today’s ultra-dense data center compute systems.
AI chips contain hundreds of billions of densely packed transistors, creating thermal power densities above 100 W/cm² that exceed the limits of air-cooled solutions. These chips are also packed closely together, compounding heat density at both the server and rack levels. Not long ago, 10 kW racks were considered dense; now, AI and HPC racks exceeding 100 kW are the norm, and within a few years rack densities are expected to exceed 1,000 kW.
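The arithmetic behind these density figures can be sketched in a few lines. The die size and chips-per-rack below are assumed round numbers for illustration, not specifications of any particular product:

```python
# Illustrative arithmetic behind the chip- and rack-level density figures.
# Die area and chips-per-rack are assumed round numbers, not product specs.

die_area_cm2 = 8.0           # ~800 mm^2, typical of a large AI accelerator die
power_density_w_cm2 = 100.0  # the threshold cited for air-cooling limits

chip_power_w = die_area_cm2 * power_density_w_cm2     # 800 W per chip
chips_per_rack = 128                                  # assumed for illustration
rack_power_kw = chip_power_w * chips_per_rack / 1000  # 102.4 kW

print(chip_power_w, rack_power_kw)  # 800.0 102.4
```

Even with these conservative round numbers, a single rack lands above the 100 kW mark the text describes.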
Many Benefits
The main benefits of liquid cooling are:
Higher heat removal capacity. Liquid cooling can manage heat densities above 100 W/cm², making it possible to cool today’s high-power AI chips.
Targeted cooling. Coolant can be directed at hot spots on a chip and the hottest components inside a server to ensure uniform cooling.
Support for high-density racks. Liquid cooling allows for >100 kW individual rack densities – essential in operating dense AI and HPC clusters now and into the future.
Improved reliability and performance. Liquid keeps temperatures stable, preventing performance loss from thermal throttling and reducing degradation caused by thermal cycling.
Improved energy efficiency. Liquid cooling uses less energy by reducing or replacing fans and air conditioning systems.
Warm-water operation. Liquid cooling makes it possible to remove heat with 20°C to 45°C water, known as warm-water cooling, reducing or eliminating the need for energy-intensive and noisy compressors or chillers.
Greater sustainability and heat reuse. Liquid cooling reduces energy consumption and enhances energy efficiency, thereby lowering data centers’ carbon footprint. Captured warm water can also be reused rather than rejected.
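The heat-removal and rack-density benefits above come down to the sensible-heat equation Q = ṁ·cp·ΔT. A minimal sizing sketch, assuming plain-water properties and an illustrative 10 K loop temperature rise (a glycol mixture would have a somewhat lower specific heat and need slightly more flow):

```python
# Back-of-the-envelope coolant flow sizing using Q = m_dot * cp * dT.
# Rack power and loop temperature rise are illustrative assumptions.

CP_WATER = 4186.0      # specific heat of water, J/(kg*K)
DENSITY_WATER = 997.0  # kg/m^3 near room temperature

def coolant_flow_lpm(rack_power_w: float, delta_t_k: float,
                     cp: float = CP_WATER,
                     density: float = DENSITY_WATER) -> float:
    """Volumetric coolant flow (L/min) needed to absorb rack_power_w watts
    with a coolant temperature rise of delta_t_k kelvin."""
    mass_flow = rack_power_w / (cp * delta_t_k)  # kg/s
    volume_flow_m3s = mass_flow / density        # m^3/s
    return volume_flow_m3s * 1000.0 * 60.0       # L/min

# A 100 kW rack with a 10 K coolant temperature rise:
print(f"{coolant_flow_lpm(100_000, 10):.0f} L/min")  # 144 L/min
```

Doubling the allowed temperature rise halves the required flow, which is one reason warm-water designs with wide supply/return deltas are attractive.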

Cooling Fluids
Understanding liquid cooling begins with why liquids outperform air. Liquids have much higher thermal conductivity and heat capacity than air, so they move heat away from components more rapidly and hold more of it. Liquid can also be directed precisely at the hottest components.
Water is the primary ingredient in most computer-cooling fluids due to its availability and exceptional thermal properties. It conducts heat roughly 25× better than air and can absorb about 4,000× more heat per unit volume. However, pure water is rarely used on its own. Most systems rely on water-glycol mixtures because glycol adds important protections: it prevents freezing, inhibits corrosion and suppresses biological growth.
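These ratios follow directly from textbook property values at room temperature; the exact figures vary with temperature and pressure, so the comparison below is an order-of-magnitude sketch rather than a precise result:

```python
# Why water beats air as a coolant: thermal conductivity and volumetric
# heat capacity, from textbook room-temperature property values.

water = {"k": 0.60,  "cp": 4186.0, "rho": 997.0}   # W/(m*K), J/(kg*K), kg/m^3
air   = {"k": 0.026, "cp": 1005.0, "rho": 1.204}

conductivity_ratio = water["k"] / air["k"]

# Heat absorbed per unit volume per kelvin of temperature rise:
vol_heat_ratio = (water["cp"] * water["rho"]) / (air["cp"] * air["rho"])

print(f"conductivity: ~{conductivity_ratio:.0f}x")      # ~23x
print(f"heat per unit volume: ~{vol_heat_ratio:.0f}x")  # ~3449x
```

The computed ratios (~23× and ~3,400×) line up with the rounded 25× and 4,000× figures commonly quoted.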
Other fluids can also be used. Some systems immerse electronics in non-conductive dielectric fluids to capture heat directly. Refrigerants are typically used in two-phase cooling, where heat is removed by boiling the liquid into vapor rather than simply warming it, as in single-phase systems.
Systems Design
Single-phase direct-liquid cooling (DLC) is the most widely adopted form of liquid cooling, and it has become the industry’s preferred method due to its simplicity, modularity, cost-effectiveness and technical maturity. The industry is actively investigating two-phase DLC and immersion cooling, but in data centers these remain niche applications still in the R&D phase.
Single-phase DLC involves attaching a coldplate directly to a processor or other components on a server. Coldplates are blocks made from copper or another thermally conductive material with internal fluid channels. A coolant distribution unit (CDU) circulates coolant to multiple connected coldplates. CDUs also act as heat exchangers, rejecting absorbed heat into a data center’s facility heat management system.
The closed cooling loop that circulates coolant between the CDU and the coldplates is called the Technology Cooling System (TCS), while the building’s cooling loop that receives heat from the CDU is called the Facility Water System (FWS). Simply put, the TCS directly cools chips, servers and racks, while the FWS transports heat from CDUs to a data center’s external heat rejection systems. Instead of discarding heat to the air, some data centers reuse it for nearby commercial or residential buildings or for industrial processes.
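At steady state, the heat a CDU picks up from the TCS loop must equal the heat it rejects into the FWS loop. A minimal sketch of that energy balance, using illustrative warm-water temperatures and neglecting pump heat and losses:

```python
# Steady-state energy balance across a CDU: heat absorbed from the TCS
# loop equals heat rejected to the FWS loop (pump heat and losses ignored).
# Temperatures and IT load below are illustrative assumptions.

CP_WATER = 4186.0  # J/(kg*K)

def fws_mass_flow(it_load_w: float, fws_supply_c: float, fws_return_c: float,
                  cp: float = CP_WATER) -> float:
    """FWS mass flow (kg/s) needed to carry away it_load_w watts,
    given facility supply and return water temperatures."""
    delta_t = fws_return_c - fws_supply_c
    if delta_t <= 0:
        raise ValueError("FWS return must be warmer than supply")
    return it_load_w / (cp * delta_t)

# 1 MW of IT load on a warm-water FWS loop, 32 degC supply / 45 degC return:
print(f"{fws_mass_flow(1_000_000, 32, 45):.1f} kg/s")  # 18.4 kg/s
```

The warm 45°C return stream is what makes the heat-reuse schemes mentioned above practical.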

Single-phase DLC has become the de facto choice because it uses familiar water-glycol fluids, works with existing server designs and is proven at scale. The technology offers excellent heat removal and efficiency without the design complexity, high cost, service challenges or regulatory hurdles of immersion and two-phase systems.
Interoperability is another single-phase DLC benefit. Compute cluster operators can choose from multiple vendors and scale cooling designs from a single server rack to a multi-row compute pod.
