Annual revenue from data center liquid cooling will exceed US$900 million by 2033.

Thermal Management for Data Centers 2023-2033

Air cooling, liquid cooling, direct-to-chip/cold plate cooling, immersion cooling, single-phase and two-phase cooling, coolant distribution units (CDUs), pumps, and hyperscale data centers.


With the increasing demand for high-performance computing in sectors such as AI, cloud computing, and crypto mining, the thermal design power (TDP) of chips has risen significantly, increasing four-fold between 2006 and 2022. In 2023, IDTechEx observed servers with IT loads exceeding 750W, and this upward trend in TDP has driven the need for more efficient thermal management at both the micro (server board and chip) and macro (data center) levels. In recent years, leading data center users have collaborated with cooling solution vendors to launch pilot projects and commercialize ready-to-use cooling solutions, aiming to improve cooling performance and meet sustainability targets.
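As a back-of-the-envelope check on that trend, a four-fold TDP increase over 16 years implies roughly 9% compound annual growth; a minimal Python sketch using the report's headline figures:

```python
# Implied annual growth rate of chip TDP, from the report's figures:
# a four-fold increase between 2006 and 2022 (16 years).
years = 2022 - 2006          # 16
growth_factor = 4.0          # four-fold increase

cagr = growth_factor ** (1 / years) - 1
print(f"Implied TDP CAGR 2006-2022: {cagr:.1%}")   # ~9.1% per year

# At that rate, TDP doubles roughly every 8 years (4x over 16 years),
# so a 750 W IT load observed in 2023 would hit ~1.5 kW around 2031.
```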
 
IDTechEx's report on Thermal Management for Data Centers covers a granular market forecast of data center cooling technologies segmented by data center power capacity. The report also provides a granular forecast of thermal interface materials (TIMs) usage by data center component.
 
Cooling Overview
Data center cooling methods can be broadly categorized into air cooling and liquid cooling, depending on the cooling medium. Air cooling relies on air conditioning and/or fans, using convection to dissipate heat from the servers, and has been widely adopted thanks to its long and successful track record. However, the low specific heat of air makes it difficult to meet rising cooling capacity requirements. Additionally, as data center users maximize rack space utilization by densely packing servers (typically 1U servers), the air gaps between servers narrow, further reducing the efficiency of air cooling.
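The limitation comes down to the heat a coolant stream can carry, Q = ρ·V̇·cp·ΔT. A minimal sketch, using illustrative room-temperature property values (not figures from the report), shows how stark the air-versus-liquid gap is:

```python
# Heat removed by a coolant stream: Q = rho * V_dot * c_p * dT.
# Property values below are approximate room-temperature figures,
# assumed here purely for illustration.
def heat_removed_kw(rho_kg_m3, cp_j_kgk, flow_m3_s, delta_t_k):
    """Heat carried away (kW) by a coolant stream for a given temperature rise."""
    return rho_kg_m3 * flow_m3_s * cp_j_kgk * delta_t_k / 1e3

flow, dT = 0.05, 10.0                            # 50 L/s flow, 10 K rise
air = heat_removed_kw(1.2, 1005, flow, dT)       # ~0.6 kW
water = heat_removed_kw(997, 4180, flow, dT)     # ~2084 kW
print(f"air: {air:.1f} kW, water: {water:.0f} kW, ratio ~{water/air:.0f}x")
```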
 
Liquid cooling, on the other hand, exploits the higher specific heat of liquids to achieve superior cooling performance. Depending on which components the fluid contacts, liquid cooling can be classified into direct-to-chip/cold plate cooling, spray cooling, and immersion cooling. Direct-to-chip cooling mounts a cold plate carrying coolant directly on top of heat sources such as GPUs and chipsets, with a thermal interface material (TIM) applied in between. Cold plate cooling can achieve a partial power use effectiveness (pPUE) of 1.02 to 1.20, depending on the specific configuration.
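For reference, pPUE relates the power drawn by the cooling subsystem to the IT power it serves: pPUE = (IT power + cooling power) / IT power. A minimal sketch, with hypothetical power figures chosen to land in the quoted range:

```python
# Partial power use effectiveness (pPUE) of a cooling subsystem:
#   pPUE = (IT power + cooling power) / IT power
# The power figures below are hypothetical examples, not report data.
def ppue(it_power_kw, cooling_power_kw):
    return (it_power_kw + cooling_power_kw) / it_power_kw

print(ppue(1000, 50))   # 1.05 -> a mid-range cold plate deployment
print(ppue(1000, 10))   # 1.01 -> the best-case immersion figure cited
```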
 
An emerging alternative is immersion cooling, in which servers are fully submerged in coolant, enabling direct contact between the heat sources and the fluid and thereby achieving the best cooling performance, with a pPUE as low as 1.01. However, widespread adoption is still limited by complexity, limited industry expertise, and high upfront costs (in US$/Watt). Nevertheless, immersion cooling offers long-term energy savings, which is not only economically attractive amid the current energy crisis but also helps large companies meet their energy-saving and sustainability goals. IDTechEx's comparative analysis of the total cost of ownership (TCO) of traditional air cooling and immersion cooling finds that the payback time for immersion cooling is approximately 2.2 years, subject to the assumptions listed in the report. The report also highlights strategic collaborations and pilot projects between data center end-users and immersion cooling vendors, as well as other barriers to widespread adoption. Market adoption and revenue forecasts are provided for air, cold plate, and immersion cooling through to 2033.
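A simple payback calculation illustrates the structure of such a TCO comparison; the capex premium and savings below are hypothetical placeholders, not the report's assumptions:

```python
# Simple payback model for immersion vs air cooling. All figures are
# hypothetical placeholders; the report's ~2.2-year result rests on its
# own stated assumptions (capex in US$/W, energy prices, pPUE, etc.).
def payback_years(capex_premium_usd, annual_energy_savings_usd):
    """Years until the extra upfront cost is recovered from energy savings."""
    return capex_premium_usd / annual_energy_savings_usd

extra_capex = 1.1e6      # assumed immersion capex premium for a facility
yearly_savings = 0.5e6   # assumed cooling-energy savings vs air cooling
print(f"payback: {payback_years(extra_capex, yearly_savings):.1f} years")  # 2.2
```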
 
Liquid cooling can also be classified into single-phase and two-phase cooling. Two-phase cooling is generally more effective, but it faces challenges: regulations on two-phase immersion fluids based on perfluoroalkyl and polyfluoroalkyl substances (PFAS), mechanical strength requirements for fluid containers to withstand the increased pressure during phase changes, fluid loss through vaporization, and high maintenance costs. This report analyzes liquid cooling vendors, coolant fluid suppliers, and data center end-users, offering insights into the opportunities and threats associated with single-phase and two-phase direct-to-chip/cold plate and immersion cooling.
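The effectiveness advantage of two-phase cooling comes from latent heat: vaporizing the fluid absorbs Q = ṁ·h_fg on top of the sensible heat ṁ·cp·ΔT that a single-phase coolant carries. A minimal sketch with illustrative fluid properties (the latent heat value is assumed, not taken from the report):

```python
# Sensible vs latent heat absorbed by a coolant stream. Property values
# are illustrative, loosely in the range of engineered dielectric fluids.
m_dot = 0.1              # coolant mass flow, kg/s
cp, dT = 1100.0, 10.0    # specific heat J/(kg*K), 10 K temperature rise
h_fg = 88_000.0          # latent heat of vaporization, J/kg (assumed)

sensible_kw = m_dot * cp * dT / 1e3   # ~1.1 kW carried single-phase
latent_kw = m_dot * h_fg / 1e3        # ~8.8 kW absorbed by vaporization
print(f"single-phase: {sensible_kw:.1f} kW, two-phase adds {latent_kw:.1f} kW")
```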
 
 
Benchmarking analysis of cooling solutions for data centers. Source: IDTechEx
 
IDTechEx anticipates rapid growth in the adoption of liquid cooling, driven by the increasing power capacity of data centers, the rise of hyperscale data centers, and the availability of ready-to-use liquid cooling solutions. Cold plate cooling in particular is expected to see the largest growth thanks to its cost effectiveness and compatibility with existing air-cooled data centers, avoiding the extensive retrofitting that immersion cooling requires.
In line with these projections, this report offers a detailed 10-year revenue forecast for liquid cooling hardware in data centers, segmented by tier of data center IT power capacity.
 
Market Opportunities
With greater adoption of liquid cooling, new opportunities are emerging, strengthening collaborations among companies in the data center cooling supply chain. Component suppliers such as coolant distribution unit (CDU) vendors, pump vendors, and coolant fluid suppliers are expected to benefit from the increased adoption of liquid cooling. CDUs and pumps are critical components for controlling the flow rate in liquid cooling systems, and factors such as material compatibility and pressure drop need to be carefully considered. This report introduces various commercial in-rack and in-row CDUs, accompanied by a comprehensive comparison of coolant fluids based on their dynamic viscosity, density, specific heat, and thermal conductivity, as well as required pipe length and diameter. The report also provides a 10-year forecast of CDUs and pumps in the data center industry.
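For example, the flow rate a CDU and pump must deliver follows directly from the heat load and the allowed coolant temperature rise, V̇ = Q / (ρ·cp·ΔT). A minimal sizing sketch with illustrative water-glycol-like properties (not data for any specific commercial coolant):

```python
# Volumetric flow a CDU/pump must deliver to remove a given heat load
# at a chosen coolant temperature rise: V_dot = Q / (rho * c_p * dT).
# Fluid properties are illustrative water/glycol-like values.
def required_flow_lpm(heat_load_kw, rho=1030.0, cp=3700.0, delta_t=10.0):
    """Litres per minute needed to absorb heat_load_kw at a delta_t (K) rise."""
    v_dot_m3_s = heat_load_kw * 1e3 / (rho * cp * delta_t)
    return v_dot_m3_s * 1000 * 60

print(f"{required_flow_lpm(80):.0f} L/min for an 80 kW rack")   # ~126 L/min
```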
Key aspects of this report
 
Air Cooling
Direct-to-Chip/Cold Plate Cooling
Loop Heat Pipe
Single-Phase Immersion
Two-Phase Immersion
Coolant Comparison and Regulations
Coolant Distribution Units (CDUs) and Pumps
Thermal Interface Materials
10-year market size forecast of data center cooling from 2023 to 2033
10-year unit sales forecast of CDUs used in data centers
Report Metrics
Historic Data: 2016-2022
CAGR: The data center cooling market exhibits a 13.4% CAGR over the next 10 years.
Forecast Period: 2023-2033
Forecast Units: US$, m², units
Regions Covered: Worldwide
Segments Covered: Hyperscale Data Centers, Air Cooling (Fan Walls, Rear Door Heat Exchanger, Cooling Tower and Chiller), Single-Phase Direct-to-Chip/Cold Plate Cooling, Two-Phase Loop Heat Pipe, Single-Phase Immersion Cooling, Two-Phase Immersion Cooling, Coolant, Heat Reuse, Coolant Distribution Units (CDUs), Thermal Interface Materials (TIMs).
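For context, a 13.4% CAGR compounds to roughly a 3.5x market over the ten-year forecast window; a minimal sketch (the base-year figure is a normalized placeholder, not a figure from the report):

```python
# What a 13.4% CAGR over 2023-2033 implies for overall market size.
base_revenue = 1.0                    # normalized 2023 market size
cagr = 0.134
multiple = base_revenue * (1 + cagr) ** 10
print(f"2033 market = {multiple:.2f}x the 2023 market")   # ~3.52x
```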
Analyst access from IDTechEx
All report purchases include up to 30 minutes of telephone time with an expert analyst, who will help you link the report's key findings to the business issues you are addressing. This time must be used within three months of purchasing the report.
Further information
If you have any questions about this report, please do not hesitate to contact our report team at research@IDTechEx.com or call one of our sales managers:

AMERICAS (USA): +1 617 577 7890
ASIA (Japan): +81 3 3216 7209
EUROPE (UK): +44 1223 812300
Table of Contents
1.EXECUTIVE SUMMARY
1.1.Trend of Thermal Design Power (TDP) of GPUs
1.2.Cooling Methods Overview
1.3.Yearly Revenue Forecast by Cooling Approach: 2016-2033
1.4.Air Cooling
1.5.Yearly Revenue of Air Cooling Forecast by Data Center Capacity: 2016-2033
1.6.Liquid Cooling - Direct-to-Chip/Cold Plate and Immersion
1.7.Liquid Cooling - Single-Phase and Two-Phase
1.8.Cold Plate/Direct-to-Chip Cooling Revenue Forecast: 2016-2033
1.9.Immersion Cooling Revenue Forecast 2016-2033
1.10.Component Selection For Liquid Cooling
1.11.Coolant Comparison
1.12.Coolant Comparison - PFAS Regulations
1.13.Coolant Distribution Units (CDU)
1.14.Yearly CDU Sales Number Forecast: 2017-2033
1.15.Heat Transfer - Thermal Interface Materials (TIMs) (1)
1.16.Heat Transfer - Thermal Interface Materials (TIMs) (2)
1.17.Yearly TIM Area Forecast by Data Center Component: 2021-2034
2.INTRODUCTION
2.1.Data Center Overview
2.1.1.Introduction to Data Centers
2.1.2.Data Center Demographics
2.1.3.Data Center Equipment - Top Level Overview
2.1.4.Data Center Server Rack and Server Structure
2.1.5.Power Use Effectiveness
2.1.6.Data Center Switch Topology - Three Layer and Spine-Leaf Architecture
2.1.7.K-ary Fat Tree Topology
2.2.Thermal Management for Data Centers Overview
2.2.1.Thermal Management Needs for Data Centers
2.2.2.Significant Consequences for Data Center Downtime
2.2.3.Data Center Location Choice
2.2.4.Increasing Thermal Design Power (TDP) Drives More Efficient Thermal Management
2.2.5.Overview of Thermal Management Methods for Data Centers
2.2.6.Thermal Management Categorization
3.THERMAL DESIGN POWER (TDP) EVOLUTION
3.1.TDP Increases Over Time - GPU
3.2.Server Boards TDP Increases - Moore's Law
4.THERMAL MANAGEMENT METHODS
4.1.Overview
4.1.1.Introduction to Data Center Cooling Classification
4.1.2.Cooling Technology Comparison (1)
4.1.3.Cooling Technology Comparison (2)
4.1.4.Air Cooling
4.1.5.Hybrid Liquid-to-Air Cooling
4.1.6.Hybrid Liquid-to-Liquid Cooling
4.1.7.Hybrid Liquid-to-Refrigerant Cooling
4.1.8.Hybrid Refrigerant-to-Refrigerant Cooling
4.1.9.Server Board Number Forecast - Methodology (1)
4.1.10.Server Board Number - Methodology
4.1.11.Data Center Power Forecast
4.1.12.Data Center Number
4.1.13.Cooling Categorization By Data Center Capacity (1)
4.1.14.Cooling Categorization By Data Center Capacity (2)
4.1.15.Data Center Number Forecast by Capacity: 2016-2033
4.2.Air Cooling
4.2.1.Introduction to Air Cooling (1)
4.2.2.Introduction to Air Cooling (2)
4.2.3.Benefits and Drawbacks of Air-Cooling Methods
4.2.4.Use Case: Row-Level Cooling Liebert® CRV CRD25
4.2.5.Overview: RDHx
4.2.6.Hybrid Air-to-Liquid Cooling - nVent
4.2.7.Cooling Tower - Adiabatic Cooling
4.2.8.Balance Between Water Use and Power Use - Case by Case in Practice
4.2.9.Use Case: Jaeggi - Adiabatic and Hybrid Dry Coolers
4.2.10.Trend for Air Cooling in Data Centers
4.2.11.SWOT of Air Cooling
4.3.Forecasts
4.3.1.Forecast - Air-Cooled Data Center - Hyperscale
4.3.2.Air-Cooled Data Centers Number Forecast: 2016-2033
4.3.3.TCO Comparison
4.3.4.Power Distribution for Data Center with Capacity <20MW
4.3.5.Air-Cooled Data Center Power Forecast 2016-2033
4.3.6.Data Centers Using Air Cooling Forecast 2016-2033
4.3.7.Data Centers Using Air Cooling Forecast by Data Center Power 2016-2033
4.3.8.Air Cooling Yearly Revenue Forecast 2016-2033
4.3.9.Data Centers with Air Cooling Only Revenue Forecast 2016-2033
4.4.Liquid Cooling
4.4.1.Liquid Cooling and Immersion Cooling
4.4.2.Comparison of Liquid Cooling Technologies (1)
4.4.3.Comparison of Liquid Cooling Technologies (2)
4.5.Cold Plates
4.5.1.Overview
4.5.2.Single-Phase Cold Plate
4.5.3.Two-Phase Cold Plate
4.5.4.Cold Plate Forecast
4.5.5.Summary of Cold Plate Cooling
4.6.Spray Cooling
4.6.1.Introduction to Spray Cooling
4.6.2.Advanced Liquid Cooling Technologies (ALCT) - Spray Cooling
4.7.Immersion Cooling
4.7.1.Single-Phase and Two-Phase Immersion - Overview (1)
4.7.2.Single-Phase Immersion Cooling (2)
4.7.3.SWOT: Single-Phase Immersion Cooling
4.7.4.Use Case: Iceotope
4.7.5.Use Case: Green Revolution Cooling (GRC)
4.7.6.Overview: Two-Phase Immersion Cooling
4.7.7.SWOT: Two-Phase Immersion Cooling
4.7.8.Wieland - Two-Phase Immersion Cooling
4.7.9.Two-Phase Cooling - Phase Out Before Starting to Take Off?
4.7.10.Roadmap of Two-Phase Immersion Cooling
4.7.11.Roadmap of Single-Phase Immersion Cooling
4.7.12.Examples: Immersion
4.7.13.Use-Case: Iceotope and Meta
4.7.14.Use-Case: Microsoft
4.7.15.Asperitas
4.7.16.Gigabyte
4.7.17.Summary (1) - Benefits of Immersion Cooling
4.7.18.Summary (2) - Challenges of Immersion Cooling
4.7.19.Cost Saving Comparison - Immersion and Air Cooling
4.7.20.Comparison of Liquid Cooling Methods
4.7.21.Pricing of Direct-to-Chip, Immersion and Air Cooling - US$/Watt
4.7.22.Total Cost of Ownership (TCO) Comparison: Air Cooling and Immersion (1)
4.7.23.Total Cost of Ownership (TCO) Comparison: Air Cooling and Immersion (2)
4.7.24.Forecast of Server Boards - Capacity>143MW: 2016-2033
4.7.25.Immersion Yearly Revenue Forecast - Capacity>143MW: 2016-2033
4.7.26.Immersion Cumulative Revenue Forecast - Capacity>143MW: 2016-2033
4.8.Coolant
4.8.1.Introduction to Cooling Fluid
4.8.2.Coolant Fluid Comparison
4.8.3.Trend - Decline in Fluorinated Chemicals?
4.8.4.Immersion Coolant Liquid Suppliers
4.8.5.Engineered Fluids - Why Better Than Oils
4.8.6.What is the Roadmap for Coolant in Two-Phase Immersion?
4.8.7.Demand for Immersion Coolant Standardization - FOMs
4.8.8.Figures of Merit (FOM)
4.8.9.Force Convection FOM for Single-Phase Immersion
4.8.10.FOM3 - Viscosity for Pressure Drop
4.8.11.Density
4.9.Partnerships
4.9.1.Intel and Submer - Heat Reuse and Immersion Cooling
4.9.2.Iceotope, Intel and HPE
4.9.3.Iceotope, Schneider Electric, and Avnet - Liquid Cooled Data Center
4.9.4.GRC and Intel
4.9.5.GRC and Dell - Edge Deployment
4.9.6.Iceotope and Meta
4.9.7.Development of New Immersion Coolant - ElectroSafe
5.COOLANT DISTRIBUTION UNITS (CDUS)
5.1.Introduction
5.1.1.Overview - (1)
5.1.2.Overview - (2)
5.1.3.Redundancy - (1)
5.1.4.Redundancy - (2)
5.1.5.Overview of CDU - Teardown
5.1.6.Liquid-to-Liquid (also known as L2L) CDUs
5.1.7.Liquid-to-Air CDUs
5.1.8.Summary of Liquid-to-Liquid and Liquid-to-Air Cooling
5.1.9.Vertiv - Liebert® XDU 60 Heat Exchanger and CDU - (1)
5.1.10.Vertiv - Liebert® XDU Heat Exchanger and CDU - (2)
5.1.11.CDU - nVent
5.1.12.CDU - CoolIT - Teardown (1)
5.1.13.CDU - CoolIT - Teardown (2)
5.1.14.CDU - CoolIT - Teardown (3)
5.1.15.CDU Teardown - Motivair
5.1.16.Yearly CDUs Sales Number Forecast: 2017-2033
5.2.Main Pump
5.2.1.Overview
5.2.2.Yearly Pumps Sales Number Forecast: 2017-2033
5.3.Filtering
5.3.1.Overview
5.3.2.Filters - Schematic Drawing
5.3.3.Filters
5.4.Sensors
5.4.1.Overview of Sensors
5.4.2.Leakage Detection Sensors - Overview
5.4.3.Leakage Detection Sensors on Server Nodes (1)
5.4.4.Leakage Detection Sensors on Server Nodes (2)
5.5.Heat Reuse
5.5.1.Overview of the Heat Reuse in Data Center Cooling
5.5.2.Use Case: Amazon Data Center Heat Reuse
5.5.3.Facebook (Now Meta) Data Center Heat Reuse
5.5.4.Tencent - Tianjin Data Center Heat For Municipal Heating
5.5.5.Return on Investment of Heat Reuse
5.5.6.More Examples of Heat Reuse
6.HEAT TRANSFER - THERMAL INTERFACE MATERIALS (TIMS)
6.1.Overview
6.1.1.Thermal Interface Materials in Data Centers
6.1.2.Common Types of TIMs in Data Centers - Line Card Level
6.1.3.TIMs in Data Centers - Line Card Level - Transceivers
6.1.4.TIMs in Server Boards
6.1.5.Server Board Layout
6.1.6.TIMs for Data Center - Server Boards, Switches and Routers
6.1.7.Data Center Switch Players
6.1.8.How TIMs are Used in Data Center Switches - FS N8560-32C 32x 100GbE Switch
6.1.9.WS-SUP720 Supervisor 720 Module
6.1.10.Ubiquiti UniFi USW-Leaf Switch
6.1.11.FS S5850-48S6Q 48x 10GbE and 6x 40GbE Switch
6.1.12.Cisco Nexus 7700 Supervisor 2E module
6.1.13.TIMs for Power Supply Converters (1): AC-DC and DC-DC
6.1.14.Data Center Power Supply System
6.1.15.TIMs for Data Center Power Supplies (2)
6.1.16.TIMs for Data Center Power Supplies (3)
6.1.17.TIMs in Data Center Power Supplies (4)
6.1.18.How TIMs are Used in Data Center Power Supplies (5)
6.1.19.How TIMs are Used in Data Center Power Supplies (6)
6.1.20.TIMs for Data Centers - Power Supply Converters
6.1.21.Differences Between TIM Forms - (1)
6.1.22.Differences Between TIM Forms - (2)
6.1.23.Novel Material - Laminar Metal Form with High Softness (1)
6.1.24.Novel Material - Laminar Metal Form with High Softness (2)
6.1.25.TIM Trends in Data Centers
6.1.26.Estimating the TIM Areas in Server Boards
6.1.27.Servers Number Forecast: 2021-2034
6.1.28.Total TIM Area in Server Boards Forecast: 2021-2034
6.1.29.Data Center Switches Number Forecast: 2021-2034
6.1.30.Area of TIM per Switch
6.1.31.TIM Area for Leaf and Spine Switch
6.1.32.TIM Area for Leaf and Spine Switch Forecast: 2021-2034
6.1.33.TIM Consumption in Data Center Power Supplies
6.1.34.Number of Power Supplies Forecast and TIM Area Forecast: 2021-2034
6.1.35.Forecast summary - TIM Area for Different Data Center Components: 2021-2034
6.2.Summary
6.2.1.Cooling Methods Cumulative Revenue Forecast 2016-2033
6.2.2.Cooling Methods Yearly Revenue Forecast 2017-2033
6.2.3.Number of Air-Cooled Data Centers Forecast: 2016-2033
6.2.4.Hyperscale - Air + Liquid Cooling Forecast 2016-2033
6.2.5.Power Forecast for Data Centers with Power < 20MW
6.2.6.Air Cooling Revenue Forecast: 2016-2033
 

Ordering Information

Thermal Management for Data Centers 2023-2033

Electronic (1-5 users): £5,650.00
Electronic (6-10 users): £8,050.00
Electronic and 1 Hardcopy (1-5 users): £6,450.00
Electronic and 1 Hardcopy (6-10 users): £8,850.00
Electronic (1-5 users): €6,400.00
Electronic (6-10 users): €9,100.00
Electronic and 1 Hardcopy (1-5 users): €7,310.00
Electronic and 1 Hardcopy (6-10 users): €10,010.00
Electronic (1-5 users): $7,000.00
Electronic (6-10 users): $10,000.00
Electronic and 1 Hardcopy (1-5 users): $7,975.00
Electronic and 1 Hardcopy (6-10 users): $10,975.00
Electronic (1-5 users): ¥990,000
Electronic (6-10 users): ¥1,406,000
Electronic and 1 Hardcopy (1-5 users): ¥1,140,000
Electronic and 1 Hardcopy (6-10 users): ¥1,556,000
Electronic (1-5 users): 元50,000.00
Electronic (6-10 users): 元72,000.00
Electronic and 1 Hardcopy (1-5 users): 元58,000.00
Electronic and 1 Hardcopy (6-10 users): 元80,000.00
Contact us to enquire about additional licenses.
If you are a reseller/distributor please contact us before ordering.
For enquiries, quotations, or invoices, please contact m.murakoshi@idtechex.com.

Report Statistics

Slides: 256
Forecasts to: 2033
ISBN: 9781915514721
 

Preview Content

Webinar Slides - EOY 2023 (PDF)
Webinar Slides (PDF)
Sample pages (PDF)
 
 
 
 
