The potential of artificial intelligence is immense, but it could be derailed by a lack of power. Data centers running AI workloads now need so much energy that the local grid often struggles to supply it.
Drew Robb, writing for TechRepublic Premium, looks at the extent of the problem and how it can be overcome.
Featured text from the download:
RACKS ARE GETTING DENSER
The racks that line the aisles of data centers have been getting denser, and the trend is accelerating. According to the Uptime Institute, power density per rack was in the 4–5 kW range in 2010 and had reached 8–10 kW by 2020. By 2022, 25% of data centers reported deploying some racks with densities in the 20–29 kW range, and some had racks of 50 kW or greater. Two years later, densities have surged again: racks filled with NVIDIA NVL72 units can exceed 120 kW.
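To put those figures in perspective, here is a minimal back-of-the-envelope sketch of row-level power draw across those eras. The density values come from the figures quoted above; the 20-rack row size is a hypothetical assumption chosen purely for illustration.

```python
# Back-of-the-envelope comparison of power draw for one row of racks
# across the density eras quoted above. The 20-rack row is an assumed
# figure for illustration, not a standard.

RACKS_PER_ROW = 20  # hypothetical row size

densities_kw = {
    "2010 (4-5 kW/rack)": 5,
    "2020 (8-10 kW/rack)": 10,
    "2022 high-density (20-29 kW/rack)": 29,
    "2024 NVIDIA NVL72 (120+ kW/rack)": 120,
}

for era, kw_per_rack in densities_kw.items():
    row_kw = kw_per_rack * RACKS_PER_ROW
    print(f"{era}: {row_kw:,} kW per row ({row_kw / 1000:.1f} MW)")
```

Under these assumptions, a single row that drew roughly 100 kW in 2010 would draw about 2.4 MW when filled with NVL72-class racks, which is why existing electrical infrastructure quickly falls short.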
This is a serious problem for many data centers. They can fill racks with the latest GPUs, but their electrical infrastructure was never designed to deliver that much power. Supporting AI use cases could therefore require rearchitecting the entire power infrastructure.
But internal power capacity is only half the story. The local grid may be woefully inadequate to meet the insatiable demands of the AI data center.
Enhance your AI knowledge with our in-depth 10-page PDF. This is available for download at just $9. Alternatively, enjoy complimentary access with a Premium annual subscription. Click here to find out more.
TIME SAVED: Crafting this content required 20 hours of dedicated writing, editing, research, and design.