Published on February 5th, 2016 | by Joshua S Hill
Data Centers Using 30% More Coal As Microsoft Takes Things Underwater
Despite best intentions to go green, data centers have underestimated their coal usage by 30% or more, according to a new report.
Data-services companies like Apple, Google, Facebook, and Amazon are well known for their recent efforts to transition their entire operations to 100% renewable energy. Google in particular has been making a strong push toward this ambitious goal, just this past December purchasing 842 MW of renewable energy through Power Purchase Agreements to power its data centers.
Intelligence company Lux Research suggests that data centers around the world are using more than 90 billion kWh of electricity annually — enough to power New York City twice over.
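A quick back-of-the-envelope check puts that comparison in perspective. The NYC figure below is an assumption (roughly 50 billion kWh/year is a commonly cited estimate), not a number from the report:

```python
# Rough sanity check of the "power New York City twice over" comparison.
# The NYC consumption figure is an assumed estimate, not from Lux Research.

data_center_use_kwh = 90e9   # 90 billion kWh/year, per Lux Research
nyc_use_kwh = 50e9           # assumed ~50 billion kWh/year for NYC

ratio = data_center_use_kwh / nyc_use_kwh
print(f"Global data centers use roughly {ratio:.1f}x NYC's annual electricity")
```

On that assumption the ratio comes out closer to 1.8x than a full 2x, which is in the spirit of the "twice over" framing.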
The problem, Lux Research finds, is that data center companies are using outdated and obsolete tools to calculate the emissions associated with the electricity they purchase from the grid, and are therefore underestimating how much of that electricity comes from coal.
Lux Research has created a new analytical tool that is more current and precise than the method operators have relied on until now: the US Environmental Protection Agency's (EPA) Emissions & Generation Resource Integrated Database (eGRID).
“Our team of data scientists analysed the North American electric grid, improving the accuracy of carbon reporting by a factor of 80,” said Ory Zik, Lux Research Vice President of Analytics and the team leader of Lux’s energy benchmarking. “The results show that many sites are far more reliant on coal than reported – notably, they include many large data centers.”
“For example, we found that Google underestimates its dependence on coal in four out of seven data centers, in particular at its Berkeley County, S.C. location.”
[Chart: Google Data Centers]
Comparing eGRID with the new Lux Grid Network Analysis (GNA) tool shows that both Google and Amazon are underestimating their coal use. GNA divides the US grid into 134 regions instead of eGRID's 24, and draws on US Energy Information Administration data that is updated monthly rather than every three years. Specifically, four of Google's seven US data centers use more coal than eGRID reports, which raises the company's emissions levels. Amazon, though less transparent about how it calculates its emissions, sources approximately 43% of its electricity from coal, not the 35% inferred from eGRID.
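The mechanics of why finer regional resolution moves the numbers can be sketched with a toy calculation. All figures below are illustrative assumptions, not Lux or eGRID data: a coarse region reports one average coal share, while splitting it into subregions and weighting by where the load actually sits can yield a noticeably different estimate.

```python
# Illustrative sketch: why splitting a coarse grid region into finer
# subregions changes a coal-share estimate. All numbers are made up.

# Coarse view: one region, 35% coal on average
coarse_coal_share = 0.35

# Finer view: the same region split into subregions of (load_MWh, coal_share),
# with most of the load sitting in a coal-heavy subregion
subregions = [
    (600_000, 0.55),
    (250_000, 0.25),
    (150_000, 0.25),
]

total_load = sum(load for load, _ in subregions)
fine_coal_share = sum(load * share for load, share in subregions) / total_load
print(f"coarse estimate: {coarse_coal_share:.0%}, finer estimate: {fine_coal_share:.0%}")
```

The load-weighted average over the subregions comes out well above the coarse regional average, the same direction of error Lux describes for data centers sited in coal-heavy parts of a large eGRID region.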
[Chart: Amazon Data Centers]
As Lux Research concludes, "The changing grid drives the need for better tools," which means that investments in new electricity generation will need to be matched by new advances in data collection and analysis.
Microsoft Takes the Cloud Underwater
While data center operators will have to come to terms with the possibility that their operations aren't as clean as they envisioned, Microsoft researchers have further complicated the imagery of "the cloud" by dropping a data center underwater.
Project Natick was a research project undertaken by a group within Microsoft Research that focuses on special projects. “We take a big whack at big problems, on a short-term basis,” explained Ben Cutler, Project Natick’s project manager. “We’re a small group, and we look at moonshot projects.”
Project Natick's goal was to put a data center underwater that wouldn't require maintenance by people, something the everyday data center is reliant upon. In the end, the researchers learned a lot about how to make data centers more sustainable, while also discovering ways that moving data centers underwater could speed data transmission and cloud deployment. Microsoft even believes that "someday, datacenters could become commonplace in seas around the world."
The idea spun out of a paper that was presented at ThinkWeek. Sean James, one of the paper’s authors, had served in the Navy for three years on submarines. “I had no idea how receptive people would be to the idea,” he said. “It’s blown me away.”
“What helped me bridge the gap between datacenters and underwater is that I’d seen how you can put sophisticated electronics under water, and keep it shielded from salt water,” James continued. “It goes through a very rigorous testing and design process. So I knew there was a way to do that.”
“This is speculative technology, in the sense that if it turns out to be a good idea, it will instantly change the economics of this business,” said Norm Whitaker, who heads special projects for Microsoft Research NExT. “There are lots of moving parts, lots of planning that goes into this. This is more a tool that we can make available to datacenter partners. In a difficult situation, they could turn to this and use it.”