August 12, 2022

Data Centers Are Facing a Climate Crisis


The problem? That data is historical and represents a time when temperatures in the UK didn’t hit 40 degrees Celsius. “We’re on the fringes of a changing climate,” says Harris.

“It wasn’t that long ago that we were designing cooling systems for a peak outdoor temperature of 32 degrees,” says Jon Healy, of the UK data center consultancy Keysource. “They’re over 8 degrees higher than they were ever designed for.” Design conditions are being raised, but data center companies, and the clients they work for, are profit-driven enterprises. Data from consultancy Turner & Townsend suggests that the cost of building data centers has risen in almost every market in recent years, and construction companies are keen to keep costs down.

“If we went from 32 degrees to 42 degrees, blimey,” says Healy. “You’re having to make everything significantly larger to support that very small percentage of the year” when temperatures rise. “It’s got to be done with caution.”

Data center design companies are starting to consider the historical weather information outmoded and beginning to use projected future temperatures, says Flucker. “Rather than thinking my extreme is 35 degrees, they’re doing projections saying maybe it’s more like 37 or 38 degrees,” she says. “But of course, that’s only as good as how well we can predict the future.”

Flucker points out that data centers rarely operate at full capacity (although Cushman & Wakefield research shows that eight of the 55 data center markets it investigated worldwide operate at 95 percent capacity or higher), and at present they’re only strained at the highest temperatures for a small number of days a year. Data centers that don’t run at 100 percent capacity can cope better with high external temperatures because an equipment failure is less likely to have an all-or-nothing impact on performance. But that will almost certainly change as the climate emergency permanently raises environmental temperatures and the margin for error narrows.

The American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) has developed operating temperature guidelines for data processing equipment, such as the servers integral to data centers. The limits suggest air pumped through data centers be supplied at no more than 27 degrees Celsius—though there are allowable ranges beyond that. “The world doesn’t end,” says Flucker. “All this equipment is warrantied up to 32 degrees Celsius.” But with temperatures continuing to rise, data centers need to make changes.

“There are a deceptively large number of legacy data center sites built by banks and financial services companies needing to be refreshed and refitted,” says Harris. As part of that rethink, Harris advises companies to look at design criteria that can cope with climate change, rather than solely minimizing its effects. “It’ll be bigger chiller machines, machines with bigger condensers, and looking more at machines that use evaporative cooling to achieve the performance criteria needed to ensure that for those days things are still in a good place,” he says.

Companies are testing some unusual ways to tackle these challenges: Between 2018 and 2020, Microsoft ran Project Natick, which sank a data center 117 feet below the sea off the coast of Scotland to insulate it from temperature fluctuations, among other things. Harris says that building data centers in ever more northerly climates could be one way to outrun the heat, but this comes with its own problems. “Developers will be fighting over an ever-dwindling pool of potential sites,” he says, a challenge when edge computing puts data centers ever closer to the point where data is consumed, often in hotter, urban areas.

Liquid cooling technology offers a more practical solution. Data centers are currently in an era of air-based cooling, but liquid cooling, in which liquid passes close to the equipment, absorbing its heat and carrying it away, could be a better way to keep temperatures down. However, it isn’t widely used because it requires a combined knowledge of cooling and IT equipment. “At the moment, these are two very separate worlds,” says Flucker. “There’s definitely some apprehension about making such a big change in how we do things.”

But it may well be necessary—not least because it sets up a virtuous circle. Outside of the IT equipment itself, the next-biggest consumer of energy in data centers is the equipment used to keep it cool. “If we can move away from the traditional way of doing things,” says Flucker, “it’s preventing climate change in the first place.”
