Optimizing Rack Densities for the AI-Driven Data Center

September 17, 2024

With experts saying it could add up to $25tn to the world’s economy, AI has taken centre stage and is reshaping how we live, work and interact. From machine learning to new capabilities such as generative AI (GenAI) – most notably ChatGPT, launched in 2022 and powered by deep learning neural networks – this highly disruptive technology is poised to tackle complex challenges, streamline business, enhance customer experiences, and drive innovation and creativity.

USE IT OR LOSE IT
Tech companies and enterprise businesses are jumping on board, with Gartner stating that 30 per cent of organisations that fail to use AI will soon lose their economic vitality. AI adoption among organisations has already reached 72 per cent in 2024, and the market is expected to grow at a staggering 36.6 per cent a year from 2024 to 2030.

By utilising AI, businesses, particularly those in the IT industry, can streamline even the most complex operations, manage workloads more efficiently and enhance cybersecurity with cutting-edge anomaly detection. In data centres, AI can provide real-time predictive analysis that improves overall operational efficiency by up to 45 per cent through optimised use of power and space, leading to a 40 per cent reduction in cooling costs. The surge in AI adoption also creates demand for more powerful, affordable computer chips, fuelling innovation and new business models in the semiconductor industry.
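
As a simple illustration of the kind of anomaly detection mentioned above, the Python sketch below flags unusual rack power readings using a rolling z-score. The window, threshold and sample data are illustrative assumptions for this article, not CPI tooling or any specific vendor's algorithm.

import numpy as np

def flag_power_anomalies(readings_kw, window=24, threshold=3.0):
    # Flag readings that deviate sharply from the trailing baseline.
    # readings_kw: sequence of rack power samples in kW (e.g. hourly)
    # window: number of trailing samples used as the baseline
    # threshold: z-score above which a sample is treated as anomalous
    readings = np.asarray(readings_kw, dtype=float)
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean, std = baseline.mean(), baseline.std()
        if std > 0 and abs(readings[i] - mean) / std > threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Illustrative data: a steady ~18 kW rack with one injected spike
rng = np.random.default_rng(0)
samples = list(18.0 + rng.normal(0, 0.2, size=48))
samples[40] = 26.5  # simulated fault
print(flag_power_anomalies(samples))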

SPACE RACE
It has been suggested that the total capacity of hyperscale data centres will nearly triple by 2030. During this time, enterprise businesses will be investing more in purpose-built AI models for specific use cases. As this happens, multi-tenant colocation and on-premise enterprise data centres will also require more capacity.

This capacity surge will undoubtedly place unprecedented demands on data centre infrastructure through significantly higher rack densities and higher-density fibre optic connections. At the same time, rising energy prices, pressure from regulatory bodies and corporate sustainability goals require data centres around the world to cut both their energy consumption and their carbon footprints.

MEETING THE NEED
The great news is that data centres can cost-effectively meet the needs of high-density AI environments when the right strategies and solutions are in place, keeping things cool for both the housed equipment and the planet, in part by putting AI to work themselves.

It is no secret that powerful AI processing increases rack power density and heat. Traditional rule-based AI models, such as voice assistants like Siri or Alexa, recommendation engines on platforms such as Netflix or Amazon Prime, and Google’s search algorithms, identify patterns in historical data to make future predictions. These models run smoothly on general-purpose, high-performance servers built around central processing units (CPUs).

SKILLS DEVELOPMENT
GenAI, by contrast, leverages deep learning and neural networks that mimic the human brain to learn, solve problems, and generate new ideas and content. For the most part, GenAI operates in two phases: training and inference. An AI training model pulls massive amounts of existing data from various sources in parallel to learn a new skill, the most common result being the large language models (LLMs) created for text generation applications. The same approach also produces the foundation models behind image, video and audio generation.

An AI inference model puts those learned skills to use, all while running on high-performance CPU servers that perform operations in sequence. These CPU servers, however, simply aren’t powerful enough to handle the heavy workload of the training phase. This is where graphics processing units (GPUs) come in. A single GPU server can match the processing capabilities of a dozen CPU servers while consuming up to 10 times more power. Now consider an AI training cluster with hundreds or thousands of interconnected GPUs and high-speed switches, and the challenge is clear: exponentially higher rack power densities.
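
To put rough numbers on that, here is a back-of-the-envelope Python sketch estimating per-rack power for a GPU training rack versus a conventional CPU compute rack. The wattages and counts are illustrative assumptions, not figures from the article or from any specific vendor.

# All figures below are illustrative assumptions, not vendor specifications.
GPU_SERVER_KW = 10.0     # assumed draw of one fully loaded 8-GPU training server
CPU_SERVER_KW = 1.0      # assumed draw of a general-purpose 1U CPU server
SWITCH_KW = 1.5          # assumed draw of a high-speed leaf switch
GPU_SERVERS_PER_RACK = 4
CPU_SERVERS_PER_RACK = 20
SWITCHES_PER_RACK = 2

gpu_rack_kw = GPU_SERVERS_PER_RACK * GPU_SERVER_KW + SWITCHES_PER_RACK * SWITCH_KW
cpu_rack_kw = CPU_SERVERS_PER_RACK * CPU_SERVER_KW + SWITCHES_PER_RACK * SWITCH_KW

print(f"GPU training rack: ~{gpu_rack_kw:.0f} kW")  # ~43 kW under these assumptions
print(f"CPU compute rack:  ~{cpu_rack_kw:.0f} kW")  # ~23 kW under these assumptions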

OPTIMISING DATA CENTRES
AI is pushing average rack power densities significantly higher. According to Uptime Institute’s 2022 Global Data Centre Survey, 25 per cent of enterprise data centres reported rack densities of 20kW or more, and some hyperscale data centres are reported to have reached densities of 80kW or more. The trend continues to rise, with forecasts predicting an average rack density of 50kW by 2027.

The higher the rack density, the greater the heat generation, which intensifies the already delicate challenge data centre operators face in keeping equipment within the recommended operating temperature range. Exacerbating that challenge, ASHRAE’s latest Thermal Guidelines for Data Processing Environments include a new Class H1 for high-density systems, which narrows the recommended temperature range from 18-27°C (64-81°F) to 18-22°C (64-72°F).

KNOWLEDGE IS POWER
With rack power densities accelerating, data centre managers who can apply AI algorithms to operational analytics have the advantage of making more informed decisions, backed by a deeper understanding of operations, actionable insights and situational awareness. AI significantly enhances data centre infrastructure planning, using its advanced analytical capabilities to anticipate future trends and requirements. By forecasting capacity needs and identifying patterns in data, AI allows data centre managers to future-proof their infrastructure against upcoming technology advancements.
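
As a minimal sketch of the capacity forecasting described above, the Python example below fits a simple linear trend to a year of average rack power readings and projects it twelve months ahead. Production operational-analytics platforms use far more sophisticated models; the data here is purely illustrative.

import numpy as np

# Illustrative monthly average rack power draw (kW) for one high-density row
months = np.arange(12)
avg_rack_kw = np.array([14, 15, 15, 16, 17, 18, 18, 19, 21, 22, 23, 25], dtype=float)

# Fit a simple linear trend and project a further 12 months
slope, intercept = np.polyfit(months, avg_rack_kw, 1)
future_months = np.arange(12, 24)
forecast = slope * future_months + intercept

print(f"Observed trend: +{slope:.1f} kW per month")
print(f"Projected average rack density in 12 months: ~{forecast[-1]:.0f} kW")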

With AI handling routine, automated tasks, data centre managers can focus on the more strategic aspects of their roles, improving workload efficiency. The shift from relying on human effort to relying on AI’s proficiency in managing routine tasks boosts operational efficiency and safeguards the seamless continuity of data centre operations. The collaboration between AI and data centre operators is set to redefine industry standards, ensuring data centres not only survive but thrive in the face of evolving demands.

INTELLIGENT DESIGN
At a time when data centre operators are under pressure to become more sustainable, AI applications are accelerating power consumption in data centres. However, AI is also being embraced within data centres themselves, because it can provide the intelligence needed to design and operate facilities for efficiency, reliability and sustainability, and to contribute strategically to the journey towards net zero. By combining the key attributes of data centre physical infrastructure with the efficiency gains of AI, owners, operators and end users can more effectively manage the power demands of high-density AI clusters while operating in a smarter, more energy-efficient way.

JON BARKER
Jon Barker is CPI’s technical manager for Europe. He has over 25 years’ experience in the engineering industry, including 17 years specialising in data centre infrastructure. As technical manager, Barker serves as a technical contact, accountable for resolving pre-sales and post-sales technical support questions and issues. He also supports CPI’s sales team by delivering product and technology-based presentations to customers, channel partners and industry event audiences.

This article was originally published in the October 2024 issue of Inside Networks.