Cascade Technologies approached CoolIT Systems with a data center density problem. The team required a significant increase in onsite computing power to handle the setup and testing of its complex simulations. However, this computing upgrade had to be accomplished within the existing IT floor space and with the current air conditioning system, which had already reached maximum capacity.
As a global leader in efficient data center management, PSNC had explored many energy-saving technologies over the years to further reduce its carbon footprint. Using liquid as a cooling medium is a key efficiency strategy, and PSNC has striven to cool as much IT equipment as possible using a warm water circuit of approximately 40°C.
Liquid Cooling for Heat Reuse: Center for Biological Sequence Analysis at Technical University of Denmark
The Center for Biological Sequence Analysis (CBS) at the Technical University of Denmark is striving towards CO2 neutrality in its High Performance Computing (HPC) facility, known as "Computerome." An equally important goal is provisioning these computational resources as efficiently as possible within a modular data center environment. CoolIT Systems accepted the challenge and delivered a liquid-cooled system that dramatically lowered the energy consumed for cooling and provided a workable system for recycling the waste heat from the servers.
CoolIT Systems Deploys Rack DCLC CHx40 System to Poznan Supercomputing & Networking Center in Partnership with Huawei and Itprojekt
Poznan Supercomputing & Networking Center (PSNC) was looking for a safe and reliable way to increase HPC compute density while lowering overall operating costs. In addition, PSNC wanted to leverage the lower operating temperatures from liquid cooling to increase processor performance.
Intel Selects CoolIT Systems to Liquid Cool Top500 "Cherry Creek" Cluster - Live on the Show Floor at SC13
This HPC cluster was to fit in two standard racks using a Supermicro form factor (FatTwin chassis system) and needed to operate fully loaded on the exhibition floor at SC13. As a conference showcase system, there was no access to any facility liquid to help remove the heat load. Intel's goal was to use off-the-shelf IT equipment, software, and cooling to deploy an HPC cluster, in less than 90 days, that would rank on the Top500 and Green500.
Intel's vision was simple: cool the full system with liquid, on an exhibition floor where the temperature may vary, and make it quiet enough not to be a distraction on the floor. From a hardware standpoint, the formula for performance is straightforward: put as many Xeon Phi coprocessors and Xeon CPUs into a rack as possible. CoolIT is equally familiar with the corollary: the more power going in, the more heat that needs to come out.
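That power-in, heat-out relationship is just a steady-state energy balance: the heat a liquid loop must carry away equals the electrical power drawn, and the required coolant flow follows from Q = ṁ·c_p·ΔT. The sketch below is purely illustrative and not from the case study; the rack power and temperature rise are assumed numbers.

```python
# Illustrative steady-state energy balance for direct liquid cooling.
# Heat removed (Q, watts) = m_dot * c_p * delta_T, so the required mass
# flow of coolant is m_dot = Q / (c_p * delta_T).
# All figures below are assumptions for illustration only.

C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def coolant_flow_kg_per_s(rack_power_w: float, delta_t_k: float) -> float:
    """Water mass flow needed to absorb rack_power_w with a
    delta_t_k temperature rise between supply and return."""
    return rack_power_w / (C_P_WATER * delta_t_k)

# Hypothetical 30 kW rack with a 10 K coolant temperature rise:
flow = coolant_flow_kg_per_s(30_000, 10.0)
print(f"{flow:.2f} kg/s")  # about 0.72 kg/s, roughly 43 L/min of water
```

The same relation explains why warm-water circuits still work: the balance depends only on the temperature *rise* across the servers, not on the absolute supply temperature, provided the supply stays below the component limits.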
CoolIT’s Rack DCLC™ has been installed at the University to enhance research into energy-efficient aspects of data centers, combining the research expertise of two disciplines to understand the interaction of cloud scheduling algorithms (extensively tested on Google datasets) with a finely controlled direct contact liquid cooling methodology.