In a Deep, Icy Norwegian Fjord, An Abandoned Mine May Help Solve The Energy Problems Of The Internet

by David Biello, via OnEarth

Deep within a frozen mountainside, Norwegian engineers are hoping to create a fortress for data. Chilled by seawater drawn from the Nordfjord, about 230 miles northwest of Oslo, and bathed in ambient temperatures that remain at a constant 46 degrees Fahrenheit year-round, thousands of the giant computers that keep the Internet humming, each throwing off large amounts of heat, could remain permanently cool in the disused Lefdal mine, near the town of Måløy.

Norway has a reputation as a world leader in clean energy, and tiny Måløy, with a population of only 3,000, takes pride in its own recent emergence as a hub of green-tech development. The town is a pioneer in both onshore and offshore wind energy. One local company is building a power plant that will run on domestic and industrial waste; another will run on forest biomass. Other companies specialize in sustainable fisheries, wave-power technology, and energy-efficient windows and building materials. But the most ambitious of all these projects is the Lefdal mine, which its designers, LocalHost, promise will be the largest and greenest server farm in the world.

The Lefdal mine once produced olivine — an olive-green mineral that is used in the aluminum and steel industries and also supplies ballast for the foundations of offshore wind farms. (Perhaps the mineral’s most intriguing trick is its ability to soak up carbon dioxide from the air and bind it into rock.) The facility is vast. Lying next to the long, deep Nordfjord, it consists of five levels (with the potential to expand to 14), sprawling over some 1.3 million square feet of “white space” that can be used for storage, connected by a paved road that descends in a spiral through tunnels 45 feet wide and almost 30 feet high. Just one of those five levels, says Mats Andersson, the chief marketing officer for the data center project, “could host all the servers in Norway.”

The ethereal world of the Web has a very real physical presence. Behind every Google search, Facebook update, or tweet lies a gigantic computing infrastructure, at the heart of which sit massive server farms that collectively account for some 230 million tons of carbon dioxide emissions annually — more than is emitted by the entire country of Argentina. Air-conditioning can consume as much as half the total power that digital giants like Google, Facebook, Twitter, Amazon, Apple, Microsoft, and IBM need to run their huge server facilities, and those facilities are growing rapidly.

One solution is to move to a place that’s already cold. Naturally cold air and, better yet, cold water can result in significant energy savings. Locations for server farms are being explored across the far north, from Alaska to Iceland. Google is operating a site in the Finnish town of Hamina. Facebook is building a server farm in Luleå, Sweden, just south of the Arctic Circle. Lefdal, which offers an abundance of clean, renewable energy from nearby hydroelectric dams and wind farms, as well as a unique cooling system that will pump icy cold water from about 650 feet below sea level, is expecting its first tenant to be IBM Norway. Andersson says that construction of the Lefdal data center will begin this fall and that “we will be in operation before summer 2013.”

Still, not all the world’s computing needs can find a home in the Arctic (or Antarctic), and that means other solutions will also be needed. In fact, companies that make the equipment for these server farms, such as Intel, have been focusing on a shift to operating at higher temperatures, so their data centers won’t have to migrate to frigid realms. “It’s not all about cooling,” notes Jonathan Koomey of Stanford University, who analyzes the industry. “You can also redesign servers to take hotter temperatures or find different ways to deliver the same computing service.”

Already, Intel’s most modern server equipment can operate at temperatures above 77 degrees Fahrenheit. “There is no performance degradation,” says Intel’s chief architect for data centers and cloud infrastructure, Charles Rego, noting that his company’s components have been designed to withstand temperatures as high as 95 degrees. “For every degree Celsius [1.8 degrees Fahrenheit] you move up, it’s a 4 to 5 percent energy savings on cooling.” That translates into millions of dollars in lowered costs for a large server farm.
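Rego’s rule of thumb can be turned into a rough back-of-the-envelope calculation. The sketch below is illustrative only: the 20 GWh cooling load, the 5-degree setpoint increase, and the $0.07-per-kWh electricity price are assumed figures, not numbers from the article.

```python
# Sketch of Rego's rule of thumb: each 1 degree Celsius increase in
# operating temperature saves roughly 4-5 percent of cooling energy.
# All inputs are illustrative assumptions for a hypothetical server farm.

def cooling_savings(baseline_kwh_per_year, delta_c, savings_per_degree=0.045):
    """Estimate annual cooling energy saved by raising the setpoint delta_c degrees."""
    # Apply the per-degree percentage savings compoundingly.
    remaining = baseline_kwh_per_year * (1 - savings_per_degree) ** delta_c
    return baseline_kwh_per_year - remaining

# Hypothetical facility: 20 GWh/year spent on cooling, setpoint raised 5 C.
saved_kwh = cooling_savings(20_000_000, 5)
dollars = saved_kwh * 0.07  # assumed $0.07 per kWh
print(f"~{saved_kwh:,.0f} kWh saved, ~${dollars:,.0f} per year")
```

Even with the midpoint 4.5 percent figure, a few degrees of headroom compounds into millions of kilowatt-hours a year at the scale of a large facility, which is the “millions of dollars” Rego is pointing to.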

One of Facebook’s latest server farms, in Prineville, Oregon, cools its data center entirely with the surrounding air. An Intel experiment in New Mexico showed that air from outside could be used to keep a 900-server facility operating even on a 91-degree day. And Intel has designed new layouts for its motherboards — the etched wafers that house the elements of the computing system — so that one processor does not heat up another, allowing more efficient cooling. As a result, the servers of today are typically at least five times more energy efficient than those of just five years ago. Intel holds out hopes for even higher temperature operation, above 104 degrees Fahrenheit.

“It gets rid of water use,” Rego explains. “Water consumption at these server farms is the hidden dragon,” particularly, he adds, as parts of the globe face water shortages. Right now, data centers go through roughly 80 billion gallons of water annually for cooling, according to Intel — much of which is not recycled. That isn’t a problem for a facility like Lefdal, which returns the warmed water straight to Nordfjord.

There seems to be no end to the demand for additional computing resources. Keeping this escalating demand from sucking up ever more energy will be vital, and the present trend toward greater efficiency needs to continue. Koomey’s research is encouraging, suggesting that the power needed to perform a given task will decrease a hundredfold every decade. In addition, among other common-sense solutions, some companies now throttle back the number of servers in operation when there is little demand, rather than running them all the time. Others are redesigning their servers so they use less energy when they are not actively processing data.
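Koomey’s hundredfold-per-decade figure compounds quickly. A toy projection (the function name and framing here are mine, not Koomey’s, and the trend is assumed to continue unchanged) shows what it implies:

```python
# Koomey's observation: the electricity needed for a fixed computing task
# falls roughly 100x per decade. This projection is illustrative only.

def energy_per_task(initial_energy, years, factor_per_decade=100):
    """Energy for the same task after `years`, assuming the trend holds."""
    return initial_energy / factor_per_decade ** (years / 10)

# A task needing 1,000 joules today would, on this trend, need roughly:
for y in (10, 20):
    print(f"after {y} years: {energy_per_task(1000, y):.2f} J")
```

On that trajectory, the same work done today for 1,000 joules would take 10 joules in a decade and a tenth of a joule in two, which is why efficiency gains can plausibly keep pace with escalating demand.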

Of course, even the most energy-efficient computer draws more power the more processing it has to do. That makes software a big — though hidden — part of the problem. Some unwieldy computer programs still contain instructions written in the 1950s, so updating software for energy efficiency represents another opportunity.

In the meantime, however, a defunct mine in Norway can keep a data center nice and cool. And there’s something very fitting in the fact that a mine containing a mineral capable of soaking up CO2 should now be used to house a server farm that will emit less CO2 in the first place.

David Biello is the associate editor for environment and energy at Scientific American. This piece was originally published at OnEarth and was reprinted with permission.

7 Responses to In a Deep, Icy Norwegian Fjord, An Abandoned Mine May Help Solve The Energy Problems Of The Internet

  1. Doug Bostrom says:

    Years ago I had a conversation w/a business colleague connected w/HydroQuebec about HQ’s generating facilities in the northern part of the province being a great place to locate data centers. Low temperatures, reduced electrical transmission loss, isolated from social disruption and served by rail lines which make fiber installation a snap.

    Would still be a good plan…

  2. AA says:

    “Norway has a reputation as a world leader in clean energy”

    Norway’s clean energy development is minor compared to its massive oil exports.

  3. Merrelyn Emery says:

    Better the Arctic is used for servers than oil rigs, ME

  4. Merrelyn Emery says:

    Ideally it should be treated as a protected wilderness like Antarctica but I’ve just about given up hope for that, ME

  5. Ron Broberg says:

    Things like this will be why we can’t kill SkyNet.

  6. ozajh says:

    This is a good idea, BUT

    We need to keep in mind that what is being minimised by this sort of design is the energy used in removing waste heat. There is only a tiny effect on the energy actually used by the servers and other equipment.

    Optimising the design of your facility, and if you are a sufficiently large user the design of the individual servers etc., will save FAR more energy. All the virtualisation vendors were citing cases 5 or more years ago where a redesign could reduce overall energy usage by 80%+.

    (And yes, I am fully aware that some, maybe most, of this low-hanging fruit has largely been harvested; certainly so in the case of users at the scale of Google.)

  7. Derek says:

    Very interesting, and in keeping with a lot of fascinating recent conversation.

    As I assert in a post inspired by this story, the climate is one of several ways that the physical infrastructure of the internet has physical effects we need to keep in mind.

    The internet doesn’t exist in the ether, and so has environmental impacts. And when we use it, our bodies and experiences don’t disappear, which has serious cultural and social-justice implications.