Your Netflix binging isn’t killing the planet—yet

Inside a data center

The internet lives in data centers: warehouses of computers, some of them vast, that consume large amounts of energy to run their equipment and to keep it from overheating. So what’s the carbon footprint of a Google search or a Netflix binge session?

Individually, probably not too much: In 2018, Google estimated that one month of a typical individual’s emailing and searching adds up to about the same greenhouse gas emissions as driving a car one mile. But with global demand for data skyrocketing, that could really add up: One high-profile 2017 estimate projected that by 2020, data centers could be responsible for more emissions than the aviation or shipping industries, and that their footprint could grow to more than one-fifth of total global electricity use by 2030.
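
The arithmetic behind comparisons like Google’s is straightforward, even if the company’s inputs are proprietary. Here is a minimal sketch using Google’s widely cited figure of 0.2 grams of CO2 per search; the usage pattern and emailing footprint are illustrative assumptions, not Google’s published methodology:

```python
# Back-of-envelope version of the Google comparison. The per-search figure
# is Google's oft-cited 2009 estimate; the usage pattern and emailing
# footprint are illustrative assumptions, not Google's methodology.

G_CO2_PER_SEARCH = 0.2       # grams of CO2 per search (Google, 2009)
SEARCHES_PER_DAY = 50        # assumed typical user
EMAIL_G_CO2_PER_DAY = 3.5    # assumed daily emailing footprint, grams
G_CO2_PER_CAR_MILE = 404     # average US passenger car (EPA figure)

monthly = 30 * (SEARCHES_PER_DAY * G_CO2_PER_SEARCH + EMAIL_G_CO2_PER_DAY)
print(f"A month of searching and emailing: ~{monthly:.0f} g CO2")
print(f"One mile in an average US car:     ~{G_CO2_PER_CAR_MILE} g CO2")
```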

Those dire estimates are likely too high, according to a new paper in Science that claims to present the most detailed analysis to date. What it finds is a mixed bag. On one hand, the paper reports, strides in data center efficiency have mostly kept pace with growing demand for data, meaning that in the last decade the total amount of energy consumed by the centers has not changed much, holding at around 1% of global electricity use. That’s about the same as 18 million US homes.
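
That household figure is easy to sanity-check. The paper puts global data center consumption at roughly 205 terawatt-hours in 2018; dividing by a ballpark figure for average US household electricity use (an assumption here, in line with EIA averages) lands in the same range:

```python
# Sanity check on the "18 million US homes" comparison. The 205 TWh figure
# is from Masanet et al.; household consumption is an assumed EIA-style
# ballpark, so the result is approximate.

DATA_CENTER_TWH = 205       # global data center electricity use, 2018
KWH_PER_US_HOME = 11_000    # assumed average US household, per year

homes = DATA_CENTER_TWH * 1e9 / KWH_PER_US_HOME
print(f"Equivalent US homes: ~{homes / 1e6:.0f} million")  # ~19 million
```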

On the other hand, it’s clear that we’re approaching a limit to squeezing out more efficiency—especially given the rise of data-ravenous artificial intelligence.

The energy demand of data centers is notoriously hard to pin down, since information on how many data centers there are and how hard they’re working is often closely held. And it’s a moving target, since the efficiency of data center computers—how many calculations they can perform per unit of energy—is always improving.

Eric Masanet, a mechanical engineer at Northwestern University and the paper’s lead author, said some of the more extreme estimates of data center energy demand to date—the 2017 study above, and a study last fall that equated 30 minutes of Netflix streaming to driving four miles—have relied on outdated assumptions about server efficiency.
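
It’s easy to see why those assumptions matter: a streaming estimate is, at bottom, data volume times the energy needed to move and serve each gigabyte, times the carbon intensity of the grid. Swap in an outdated energy-intensity figure and the result balloons. A sketch with illustrative high and low values (none of them taken from the studies in question):

```python
# How the "30 minutes of Netflix" answer swings with the assumed energy
# intensity of streaming. Both intensity values are illustrative: the high
# one echoes older assumptions, the low one more recent efficiency data.

GB_PER_HOUR = 3.0            # assumed HD streaming data rate
GRID_G_CO2_PER_KWH = 475     # assumed world-average grid intensity
G_CO2_PER_CAR_MILE = 404     # average US passenger car (EPA figure)

for label, kwh_per_gb in [("older assumption  ", 3.0),
                          ("updated assumption", 0.05)]:
    kwh = 0.5 * GB_PER_HOUR * kwh_per_gb          # 30 minutes of streaming
    grams = kwh * GRID_G_CO2_PER_KWH
    miles = grams / G_CO2_PER_CAR_MILE
    print(f"{label}: ~{grams:,.0f} g CO2, about {miles:.2f} car miles")
```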

To pin down better numbers, Masanet and his colleagues used a patchwork of market data, including server sales, to deduce the current makeup of data center computing capacity, with an updated measure of each server’s energy needs. Between 2010 and 2018, they found, the amount of computing performed in data centers rose about 6.5-fold, while the storage capacity of those centers grew roughly 26-fold.

But in that same timeframe, total energy use rose only about 6%, they found. That translates to efficiency gains of about 20% per year, many times greater than the gains in aviation or other industrial sectors during that time, Masanet said.
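
That annual figure falls out of simple compounding: if computing output grew about 6.5-fold while energy use grew just 6%, the energy needed per unit of computing dropped by a factor of roughly six over eight years, which works out to gains in the 20-to-25-percent-a-year range depending on the exact inputs:

```python
# Annualized efficiency gain implied by the 2010-2018 figures: computing
# output up ~6.5x, energy use up only ~6%, over eight years.

COMPUTE_GROWTH = 6.5     # growth in data center computing output, 2010-2018
ENERGY_GROWTH = 1.06     # growth in data center energy use, 2010-2018
YEARS = 8

gain = COMPUTE_GROWTH / ENERGY_GROWTH        # work done per unit of energy
annual = gain ** (1 / YEARS) - 1
print(f"Work per kWh rose ~{gain:.1f}x overall, ~{annual:.0%} per year")
```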

In addition to garden-variety, Moore’s Law-style gains in computing efficiency, the other big driver of these improvements is an industry-wide shift from smaller, widely distributed data centers to giant “hyperscale” centers run by Google, Amazon, and other big cloud computing providers. These centers tend to operate much more efficiently than smaller ones, Masanet said, and sometimes use innovative cooling methods, like a Google center in Finland that pipes in seawater.

So far, so good. But those efficiency gains have limits, while demand grows faster all the time. According to Dale Sartor, a data center engineer at Lawrence Berkeley National Laboratory, many of the hyperscale data centers are already about as efficient as they can be, meaning that almost all the energy they consume goes directly into performing calculations.
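
The industry’s yardstick for this is power usage effectiveness, or PUE: total facility energy divided by the energy delivered to the computing equipment itself. A PUE of 1.0 means zero overhead. The sample values below are typical published figures (hyperscale operators report fleet averages near 1.1), not numbers from the study:

```python
# Power usage effectiveness: total facility energy / IT equipment energy.
# The closer to 1.0, the less energy goes to cooling, power conversion,
# and other overhead. Sample values are typical published figures, not
# numbers from the Science paper.

def overhead_share(pue: float) -> float:
    """Fraction of total facility energy that is overhead, not computing."""
    return (pue - 1) / pue

for label, pue in [("older enterprise center", 1.8),
                   ("hyperscale center      ", 1.1)]:
    print(f"{label}: PUE {pue:.1f}, {overhead_share(pue):.0%} overhead")
```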

“The next doubling in demand for data service will occur in three to five years,” Masanet said. “Can we expect to enjoy a continuation of this efficiency? The answer is yes. So that can buy us half a decade. But beyond that, later this decade, it’s likely the growing demand for data will outpace the ability of current technologies to offset that growth.”
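
In other words, it’s a race between two exponentials: demand compounding fast enough to double every three to five years, and efficiency gains that have lately run at 20-25% a year but can’t continue indefinitely. A toy projection of how that plays out (the rates are illustrative assumptions, not the paper’s forecast):

```python
# Toy projection: demand doubling roughly every four years (~19% annual
# growth) against efficiency gains that run 25%/yr for five years, then
# slow to 5%/yr. All rates are illustrative, not the paper's forecast.

demand = 1.0           # relative demand for computing
energy_per_unit = 1.0  # relative energy needed per unit of computing

for year in range(1, 11):
    demand *= 1.19                       # doubles in about four years
    gain = 0.25 if year <= 5 else 0.05   # easy gains dry up mid-decade
    energy_per_unit /= 1 + gain
    print(f"year {year:2d}: relative energy use {demand * energy_per_unit:.2f}")
```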

There’s still debate about how quickly energy demand will increase after that point. Anders Andrae, an information systems sustainability analyst with the Chinese tech giant Huawei, who authored the 2017 study and reviewed Masanet’s, said Masanet likely underestimates the impact of AI on the trajectory of data demand and energy use. Compared to streaming a video or running a web search, operating an autonomous vehicle or processing a video feed for facial recognition takes orders of magnitude more computation, forcing a data center server to dedicate more time and effort (and thus more energy) to a single task.

Howie Huang, an expert in data-intensive computing at George Washington University who was not involved with the study, agreed: “I cannot help but notice that AI is the one tricky trend they didn’t take into consideration,” he said.

Masanet’s study also excludes the specialized data centers that mine cryptocurrency. According to a Cambridge University analysis, global energy demand for crypto mining is a little less than half of what Masanet calculates for all other data centers. Those outfits are more worrisome from a climate perspective, he said, because their efficiency gains tend to be offset by the increasing computational difficulty of solving the math problems that unlock coins.
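
The offsetting dynamic Masanet describes is baked into proof-of-work mining: as hardware gets more efficient and more of it comes online, the network automatically raises the difficulty of the puzzle, so the energy needed per hash falls while the hashes needed per coin rise. A stylized sketch (all numbers invented for illustration):

```python
# Why mining efficiency gains get canceled out: proof-of-work difficulty
# rises with total network computing power, so less energy per hash does
# not mean less energy per coin. All numbers here are invented.

energy_per_hash = 1.0    # relative energy cost of one hash
hashes_per_coin = 1.0    # relative difficulty: hashes needed per coin

for gen in range(1, 6):
    energy_per_hash *= 0.5   # hardware gets twice as efficient...
    hashes_per_coin *= 2.0   # ...and difficulty adjusts upward in step
    print(f"gen {gen}: energy/hash {energy_per_hash:.4f}, "
          f"energy/coin {energy_per_hash * hashes_per_coin:.1f}")
```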

Once data centers run out of room for efficiency improvements, the next step will be to boost their reliance on renewable energy. Last month, Google announced plans to build a $1 billion solar farm to feed a new data center in Nevada, and Microsoft claims to be on track to get 60% of its data center power from renewables this year.

Still, Masanet said, “the fact that we stream videos instead of driving to the video store is a step in the right direction. Even if the data center carbon footprint starts going up, digital services are nearly always more efficient than the physical services they replace.”

