Stopping the Cardinal Sin of Colocation Rack Power Sizing
In 2024, many organizations still overprovision power for their colocation racks using outdated sizing methods. Assessing actual power usage instead is recommended to reduce costs and improve power management.
In 2024, many organizations continue to size power requirements for colocation racks improperly, relying on outdated methods that use power supply wattage instead of actual power consumption. This practice leads to significant overprovisioning, where companies pay for more power than they ever use. For instance, a typical 208V/30A rack circuit provides roughly 5kW of usable power (after the standard 80% continuous-load derating), yet modern servers often draw only a fraction of their power supplies' rated capacity.
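As a rough illustration of the circuit math behind that ~5kW figure, here is a minimal sketch, assuming single-phase power and the common NEC 80% continuous-load derating (the article itself only cites the resulting number):

```python
# Sketch: usable continuous power for a colocation circuit, assuming
# single-phase power and the NEC 80% continuous-load derating.
def usable_circuit_kw(volts: float, amps: float, derating: float = 0.8) -> float:
    """Return usable continuous power in kW for a single-phase circuit."""
    return volts * amps * derating / 1000.0

if __name__ == "__main__":
    # A 208V/30A circuit: 6.24 kVA nameplate, roughly 5 kW usable continuous.
    print(f"{usable_circuit_kw(208, 30):.2f} kW")  # -> 4.99 kW
```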
The article cites examples such as a Supermicro server whose power supplies are rated at 1.2kW but which draws only around 750W even under stress testing. Sizing to nameplate ratings therefore produces unnecessary costs, with organizations potentially paying for 4kW to 16kW of unused power. The gap is particularly pronounced with AI servers, which can carry very large power supply capacities but often do not operate at full load.
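To make the cost of that gap concrete, here is a minimal sketch using the article's 1.2kW-nameplate / ~750W-measured example; the per-kW monthly rate is a hypothetical placeholder, not a figure from the article:

```python
# Sketch: gap between provisioned (nameplate) power and measured draw,
# priced at a hypothetical per-kW colocation rate. Figures are illustrative.
def overprovisioned_kw(psu_nameplate_kw: float, measured_kw: float) -> float:
    """Power paid for but never drawn if sizing is done by PSU nameplate."""
    return max(psu_nameplate_kw - measured_kw, 0.0)

def monthly_waste(psu_nameplate_kw: float, measured_kw: float,
                  usd_per_kw_month: float = 150.0) -> float:
    # usd_per_kw_month is a hypothetical rate for illustration only.
    return overprovisioned_kw(psu_nameplate_kw, measured_kw) * usd_per_kw_month

if __name__ == "__main__":
    # Article example: 1.2 kW power supplies, ~0.75 kW measured under stress.
    print(f"{overprovisioned_kw(1.2, 0.75):.2f} kW unused per server")
    print(f"${monthly_waste(1.2, 0.75):,.2f}/month at a hypothetical $150/kW")
```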
To optimize costs, organizations should assess their actual power usage rather than relying on maximum power supply ratings. This approach can yield significant savings and better power management, especially as high-power servers become more common. The article calls for a shift in mindset to avoid the "cardinal sin" of colocation rack power sizing, advocating a more accurate assessment of power needs to reduce unnecessary expenses.
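One practical way to gather actual-usage data is to sample power draw from each server's BMC. The sketch below is one such approach, not the article's method; it assumes ipmitool is installed, the BMC supports the DCMI "power reading" command, and the host and credentials shown are placeholders:

```python
# Sketch: sampling actual server power draw via the BMC's DCMI interface.
# Assumes ipmitool is installed and the BMC supports "dcmi power reading";
# host and credentials below are placeholders.
import re
import subprocess

def dcmi_power_watts(host: str, user: str, password: str) -> int:
    """Query the instantaneous power draw (watts) from a remote BMC."""
    out = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
         "dcmi", "power", "reading"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"Instantaneous power reading:\s+(\d+)\s+Watts", out)
    if not match:
        raise RuntimeError("Could not parse power reading from ipmitool output")
    return int(match.group(1))

if __name__ == "__main__":
    # Placeholder host and credentials; sample over time and under peak load
    # before sizing a rack, rather than trusting PSU nameplate ratings.
    print(dcmi_power_watts("10.0.0.42", "admin", "password"))
```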
Related
AI Is Already Wreaking Havoc on Global Power Systems
AI's rapid growth strains global power grids as data centers expand to meet energy demands. Major tech firms aim for green energy, but challenges persist in balancing AI's energy hunger with sustainability goals.
Taking a closer look at AI's supposed energy apocalypse
Artificial intelligence's impact on energy consumption in data centers is debated. Current data shows AI's energy use is a fraction of overall consumption, with potential growth by 2027. Efforts to enhance efficiency are crucial.
Taking a closer look at AI's supposed energy apocalypse
Artificial intelligence's energy impact, particularly in data centers, is debated. AI's energy demands are significant but only a fraction of overall data center consumption. Efforts to enhance AI cost efficiency could mitigate energy use concerns.
From Cloud Chaos to FreeBSD Efficiency
A client shifted from expensive Kubernetes setups on AWS and GCP to cost-effective FreeBSD jails and VMs, improving control, cost savings, and performance. Real-world tests favored FreeBSD over cloud solutions, emphasizing efficient resource management.
Power-hungry data centres scrambling to find enough electricity to meet demand
Australia's data centres may consume up to 15% of the national grid by 2030, driven by cloud computing and AI, prompting a need for renewable energy and immediate action to ensure grid stability.