
Here’s One Way to Make Server Farms Suck Less Power – Mother Jones


Getty Images.

This story was first published by Wired and is reproduced here as part of the Climate Desk collaboration.

For all of its faults—and there are many—the electrical grid in the United States is a miracle worker: Flip a switch and the lights come on, almost without fail. As renewables like wind and solar replace fossil fuels, this miracle work gets a little more difficult, because sunlight and wind aren’t always available. Navigating this intermittency, as it’s known among energy geeks, demands a fundamental rethink of how consumers use and even help store energy. Electric vehicle drivers, for example, may one day use their cars as a vast network of batteries that grid operators can tap into when renewables wane.

Another option might be to use information as a battery—of a sort. Researchers have suggested that companies precompute certain data when the grid is powered by solar or wind and store the results for later use. Although the team dubbed the concept “information batteries,” don’t take “battery” to mean a physical device. The idea is digital, more of a timing strategy than a real battery, aimed at getting data-hungry companies like Google, Meta, Amazon, Apple, and Netflix to use clean power when it’s plentiful so utilities can avoid burning fossil fuels when it’s not.

Jennifer Switzer, a computer science researcher at the University of California San Diego, points out that this kind of power consumption is unusually flexible. “You can’t charge your car unless the battery has discharged at least a little bit, and you can’t wash your clothes until your clothes are dirty,” says Switzer, one of the researchers who proposed the idea in a paper published earlier this month. “But with computing, if you have some way of predicting, with even a small amount of accuracy, what you’re going to need in the future, then you can compute results before you actually need them and store those results. Instead of storing energy to use later, you’re storing data.”
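The mechanics Switzer describes—predict the work, do it early, cache the result—can be sketched in a few lines. This is a hypothetical toy, not an API from the paper: the class name, methods, and the squaring workload are all invented for illustration.

```python
# Toy sketch of an "information battery": during a clean-energy surplus,
# precompute results for inputs we predict will be requested, then serve
# them from a cache later. All names here are hypothetical.

class InformationBattery:
    def __init__(self, compute_fn):
        self.compute_fn = compute_fn  # the expensive computation
        self.cache = {}               # stored results = the "charged" battery

    def charge(self, predicted_inputs):
        """Run while surplus renewable power is on the grid."""
        for x in predicted_inputs:
            if x not in self.cache:
                self.cache[x] = self.compute_fn(x)

    def serve(self, x):
        """At request time: a cache hit avoids burning energy on demand."""
        if x in self.cache:
            return self.cache[x], True     # (result, was precomputed)
        return self.compute_fn(x), False   # prediction missed; compute now

battery = InformationBattery(compute_fn=lambda n: n * n)
battery.charge([1, 2, 3])   # midday solar surplus: precompute likely requests
print(battery.serve(2))     # served from the battery: (4, True)
print(battery.serve(7))     # unpredicted, computed on demand: (49, False)
```

The point of the `(result, was_precomputed)` flag is that only the cache hits represent energy shifted onto the clean-power window; misses still cost on-demand energy, which is why prediction accuracy matters.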

The idea is new, so it hasn’t been deployed in the real world, but it has plenty of potential use cases. Tech companies need to process all kinds of data: Google builds its search results, YouTube transcodes videos into different resolutions, Facebook recommends friends, and Amazon recommends products. A lot of this processing is done on demand. The researchers believe some of it could instead be done asynchronously, whenever green energy is flowing into the grid.

The information battery concept works a bit like a post office. The post office knows roughly how many letters it can expect to deliver on a given day, but not the specific letter a carrier will need to bring to your home. It must spend energy on predictable maintenance tasks (like powering up sorting centers) and on more unpredictable ones (like delivering a letter to a specific address). Similarly, if tech companies can crunch through routine data tasks when renewables are available, the intermittency of those energy sources won’t be as much of an issue when it comes to on-demand calculations later. “The core concept here is that information has an embodied energy to it,” says University of Southern California computer scientist Barath Raghavan, who coauthored the paper with Switzer. “Information batteries are going to work well where things are highly predictable. You get that in the case of video encoding, movie rendering, graphics work.”

When you type in a search term, Google processes your request right away. Some of that work can’t be done in advance, because your exact request is unpredictable. (Google can’t read your mind—at least, not yet.) But much of the search tool’s foundation relies on rote computation, which consumes a lot of energy in huge data centers, and chunks of that computation are done well before you hit “I’m Feeling Lucky.” Or consider the computational power needed to supply streaming video. When processing video files, says Switzer, “if you know that there’s going to be a lot of Netflix traffic at a certain time of day, you could do that ahead of time and have that ready for some popular shows and movies, even if not all those are actually requested.”

The most likely candidates for testing out this concept are companies that operate enormous data centers, because the demand for energy-intensive computing power—known as compute—is soaring. “The planetary scale for compute is going up dramatically, and I think that you’re going to see major providers like Amazon, Microsoft, Google, Facebook sourcing most or even all of their energy from renewables,” says George Porter, who is codirector of the Center for Networked Systems at the University of California San Diego and collaborates with Raghavan and Switzer, but wasn’t involved in the new paper. “And so in that particular case, I think managing this intermittency issue is going to be kind of a challenge.” 

Raghavan believes this technique may also work for less-intensive energy users, such as climate modeling researchers. “You can predict some fraction of the subtasks within that macro computational task,” he says. “And if you can get decent accuracy on those subtasks, you can precompute them and then later you can go and retrieve the results.” (It wouldn’t work as well for weather models, since that work is more immediate and less predictable.)

This type of batch processing has long been used in other contexts. At night, for instance, there’s less competition for computing power in data centers, because most people aren’t working. So someone who has to run a complex program might schedule it overnight, says Kurtis Heimerl, a computer scientist at the University of Washington who wasn’t involved in the proposal. “The really interesting thing in this paper is really moving that into an energy-conservation space, rather than usually just conserving computing resources in general,” he says.
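Heimerl’s “run it overnight” trick translates directly into the energy domain: instead of picking the quietest hour, pick the cleanest one. The hourly carbon-intensity forecast below is invented for illustration; real schedulers would use live grid data.

```python
# Sketch: defer a batch job to the hour with the greenest forecast grid.
# Forecast values are grams of CO2 per kWh; the numbers are made up.

def greenest_hour(forecast_g_per_kwh):
    """Return the index (hour 0-23) with the lowest forecast carbon intensity."""
    return min(range(len(forecast_g_per_kwh)),
               key=forecast_g_per_kwh.__getitem__)

# In this toy forecast, midday solar pushes intensity down around noon.
forecast = [400] * 10 + [180, 150, 140, 160, 200] + [420] * 9
print(greenest_hour(forecast))  # 12 -- schedule the deferrable job for noon
```

The same one-liner generalizes from hours to any set of candidate time slots; the hard part, as the researchers note, is knowing which jobs can safely wait.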

“I think it’s really innovative,” says Porter. “It’s actually shifting work from the future to now. It’s just like at a restaurant, where they know how many pies they sell a day. So they’ll make a whole bunch in the morning, rather than on demand, and it’s more efficient and convenient that way.”

But like a restaurant staff, information battery users can’t predict the future—just parts of it. A baker might know roughly how many pies they’ll sell, but not the age, height, weight, and socioeconomic status of the customers who will buy them. Likewise, it’s asking too much of a tech company to pre-crunch all the data it’ll need tomorrow. The trick—and this is where Switzer and Raghavan are taking their research—is developing tools that isolate the patches of data that are a good fit for precomputation. That would require understanding the needs of each company or scientific modeling team.

Still, information batteries won’t be for everyone. “Where the information battery starts to not work as well is going to be in cases where the tasks are too small, where it’s not worth it from an energy perspective to precompute and store it,” says Raghavan. “It’s not going to always work—it’s not going to always give you high efficiency. But sometimes it will give you good efficiency.”

One advantage of starting with global tech companies, which have the most influence on grid load, is that they can also shift computational jobs around the world. When it’s daylight in Europe, data centers there could take on more work; when the sun goes down, they can hand that work to a data center in the western US, where the day is just starting. Google is already doing this, shifting tasks between data centers depending on where carbon-free energy is available locally.
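That follow-the-sun routing boils down to a simple choice: send deferrable work to whichever region currently has the cleanest grid. The region names and intensity numbers below are made up for illustration; this is a sketch of the idea, not Google’s scheduler.

```python
# Toy geographic load shifting: route deferrable work to the region whose
# grid currently has the lowest carbon intensity (gCO2/kWh). All values
# are invented; a real system would use live carbon-intensity feeds.

def pick_region(carbon_intensity_g_per_kwh):
    """Return the region with the cleanest grid right now."""
    return min(carbon_intensity_g_per_kwh,
               key=carbon_intensity_g_per_kwh.get)

# Midday in Europe: solar is abundant there, so work flows to eu-west.
print(pick_region({"eu-west": 120, "us-west": 400, "us-east": 450}))

# Hours later the sun has moved on, and the same queue drains in us-west.
print(pick_region({"eu-west": 380, "us-west": 150, "us-east": 420}))
```

Combined with the precompute-and-cache idea, this gives two independent knobs—*when* to run a job and *where* to run it—both aimed at lining compute up with clean power.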

Getting tech companies to match their demand to renewable supply is likely to be more cost-effective than spending huge sums on massive battery arrays to store solar and wind power. It’s no doubt simpler than convincing millions of individuals to time their home energy usage, and it could happen long before there’s a fleet of electric vehicles big enough to serve as a distributed backup power supply. For an already-ailing grid, it could be a sweet relief.

