U.S. facility may have best data center PUE

After years of big gains, the energy efficiency of data centers has stalled. The key data center efficiency measurement, power usage effectiveness, is not improving. It even declined a little from last year.

The reason may have to do with the limits of the technology in use by the majority of data centers.

Improving data center PUE "will take a shift in technology," said Chris Brown, chief technical officer of Uptime Institute LLC, a data center advisory group in Seattle. Most data centers — as many as 90% — continue to use air cooling, which isn't as effective as water at removing heat, he said.

But one data center has made the shift in technology: the National Renewable Energy Laboratory (NREL) in Golden, Colo. The NREL's mission is to advance energy-related technologies, such as renewable power, sustainable transportation, building efficiency and grid modernization. Its supercomputer data center deploys technologies that help it achieve a very low data center PUE.

The technologies include cold plates, which use liquid to draw waste heat away from the CPU and memory. The data center also has a few rear-door heat exchangers, which are fitted to the back of server racks; hot exhaust air passes through water-carrying coils that remove the heat before the air reenters the data center room.

“The closer you get to rack in terms of picking up that waste heat and removing it, the more energy efficient you are going to be,” said David Sickinger, a researcher at NREL’s Computational Science Center.

Data center efficiency gains have stalled

NREL uses cooling towers to chill the water, which can be as warm as 75 degrees Fahrenheit and still cool the systems. The cooler and drier climate conditions of Colorado help. NREL doesn't use mechanical cooling, such as chillers.

Because of the increasing power of high-performance computing (HPC) systems, “that has sort of forced the industry to be early adopters of warm water liquid cooling,” Sickinger said. 

The lowest possible data center PUE is 1, which means that all the power drawn goes to the IT equipment. NREL is reporting that its supercomputing data center PUE is 1.04 on an annualized basis. The NREL HPC data center has two supercomputers in a data center of approximately 10,000 square feet.
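The PUE metric described above is a simple ratio, which can be sketched as a short function. The specific kilowatt figures below are hypothetical illustrations, not numbers from NREL; only the 1.04 ratio comes from the article.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power.

    A PUE of 1.0 would mean every watt drawn goes to the IT equipment.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Hypothetical illustration: at NREL's reported annualized PUE of 1.04,
# a 1,000 kW IT load would imply roughly 1,040 kW of total facility draw.
print(pue(1040, 1000))  # → 1.04
```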

“We feel this is sort of world leading in terms of our PUE,” Sickinger said.


Something else that NREL believes sets it apart is its reuse of the waste heat energy. The lab uses it to heat offices and for heating loops under outdoor patio areas to melt snow.

More than 10 years ago, the average PUE as reported by Uptime was 2.5. That has since improved: by 2012, the average data center PUE was 1.65. It continued to improve slightly but has since leveled off. In 2019, the average data center PUE ticked up to nearly 1.7.

“I think as an industry we started to get to about the end of what we can do with the way we’re designing today,” Brown said. He believes in time data centers will look at different technologies, such as immersion cooling, which involves immersing IT equipment in a nonconductive liquid.


Improvements in data center PUE add up. If a data center has a PUE of 2, it is using 2 megawatts of power to support 1 megawatt of IT load. But if that data center can lower its PUE to 1.6, the same 1-megawatt IT load requires only 1.6 megawatts in total, a savings of about 400 kilowatts of electrical power, Brown said.
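Brown's arithmetic above can be expressed as a one-line calculation; this sketch simply generalizes his example for arbitrary PUE values and IT loads.

```python
def facility_savings_kw(pue_before: float, pue_after: float,
                        it_load_kw: float) -> float:
    """Reduction in total facility power, in kW, for a fixed IT load
    when PUE improves from pue_before to pue_after."""
    return (pue_before - pue_after) * it_load_kw

# Brown's example: dropping from a PUE of 2.0 to 1.6 at 1 MW (1,000 kW)
# of IT load saves about 400 kW of facility power.
print(facility_savings_kw(2.0, 1.6, 1000))
```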

Data centers are becoming major users of electricity in the United States. They account for nearly 2% of all U.S. electrical use.

In a 2016 U.S. government-sponsored study, researchers reported that data centers consumed about 70 billion kWh in 2014 and forecast consumption to reach 73 billion kWh in 2020. This estimate has not been updated, according to energy research scientist Jonathan Koomey, who was one of the authors of the study.

Koomey, who works as an independent researcher, said it is unlikely the estimates in the 2016 report have been exceeded much, if at all. He’s involved in a new independent research effort to update those estimates.

NREL is working with Hewlett Packard Enterprise to develop AI algorithms specific to IT operations, also known as AIOps. The goal is to develop machine learning models and predictive capabilities to optimize the use of data centers and possibly inform development of data centers to serve exascale computing, said Kristin Munch, NREL manager of the data, analysis and visualization group.

National labs generally collect data on their computer and data center operations, but they may not keep this data for a long period of time. NREL has collected five years' worth of data from its supercomputer and facilities, Munch said.
