iTzSnypah - Tuesday, February 11, 2014 - link
I wonder why nobody has tried geothermal liquid cooling. You could do it two ways: either with a geothermal heat pump setup, or by cutting out the middleman and just using the earth like you would a radiator in a liquid cooling loop. The only problem would be how many wells you would have to drill to cool up to 100 MW (I'm thinking 20+ at a depth of at least 50 ft).
ShieTar - Tuesday, February 11, 2014 - link
It's kind of easier to just use a nearby river than to dig for and pump up groundwater. That's what power stations and big chemical factories do. For everybody else, air cooling is just easier and less expensive.
iTzSnypah - Tuesday, February 11, 2014 - link
You wouldn't be drilling for water. You drill a well so you can put pipe in it, fill it back up, and then pump water through the pipes, using the earth's constant temperature (~20°C) to cool your liquid, which is warmer (>~30°C).
looncraz - Tuesday, February 11, 2014 - link
I experimented with this (mathematically) and found that heat soak is a serious, variable concern. If new moisture is coming in from the surface, this is not as much of an issue, but if it isn't, you could have a problem in short order. Then there are the corrosion and maintenance issues...
The net result is that it is cheaper and easier to just install a few ten-thousand-gallon coolant holding tanks, keep them cool (but above ambient), and use them to cool the air in the server room(s). These tanks can be put inside a hill or in the ground for extra insulation, and a surface radiator system could allow using cold outside air to save energy.
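To put rough numbers on the well count discussed at the top of this thread: a vertical ground loop can typically only reject on the order of a few tens of watts per metre of bore. The sketch below assumes 50 W/m, which is a ballpark assumption and not a figure from the article or the comments:

```python
# Back-of-the-envelope check of "20+ wells at a depth of at least 50 ft"
# against a 100 MW heat load. The 50 W per metre of bore is an assumed
# rule-of-thumb for vertical ground loops; real capacity depends on soil
# conductivity, moisture, and how quickly the ground heat-soaks.
W_PER_METRE = 50                      # assumed steady-state rejection per metre of bore
wells, depth_m = 20, 50 * 0.3048      # 20 wells, 50 ft converted to metres

capacity_kW = wells * depth_m * W_PER_METRE / 1000
print(f"~{capacity_kW:.0f} kW of rejection capacity")            # ~15 kW
print(f"shortfall vs 100 MW: ~{100_000 / capacity_kW:,.0f}x")    # thousands of times too small
```

Which is the same conclusion the follow-up comments reach: a borehole field for a 100 MW facility would have to be several orders of magnitude larger than twenty shallow wells.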
superflex - Wednesday, February 12, 2014 - link
You obviously don't have a clue about drilling costs.
For a 2,000 sq. ft. home, a geothermal driller needs between 200 and 300 lineal feet of well bore to cool the house. In unconsolidated material, drilling costs range from $15 to $30 per foot, depending on the rig. For drilling in rock, up the cost to $45/foot.
For something that uses 80,000x more power than a typical home, what do you think the drilling costs would be?
Go back to heating up Hot Pockets.
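For scale, a naive linear extrapolation of the numbers quoted in this comment (200-300 lineal feet per 2,000 sq. ft. home, $15-$45 per foot, and the 80,000x power multiplier) gives:

```python
# Naive linear scaling of residential ground-loop drilling to a facility that
# draws 80,000x the power of a typical home, using only the figures quoted above.
bore_ft = (200, 300)         # lineal feet of bore per 2,000 sq. ft. home
usd_per_ft = (15, 45)        # unconsolidated material ... rock
scale = 80_000               # "80,000x more power than a typical home"

low = bore_ft[0] * scale * usd_per_ft[0]
high = bore_ft[1] * scale * usd_per_ft[1]
print(f"${low/1e6:.0f} million to ${high/1e9:.1f} billion of drilling")  # $240M to $1.1B
```

Treat it as an illustration of the scaling argument, not as a real drilling estimate.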
chadwilson - Wednesday, February 19, 2014 - link
That last statement was totally unnecessary. Your perfectly valid point was tarnished by your awful attitude.
nathanddrews - Tuesday, February 11, 2014 - link
Small scale, but really cool. Use PV to power your pumps...
http://www.overclockers.com/forums/showthread.php?...
Sivar - Tuesday, February 11, 2014 - link
Geothermal heat pumps are only moderately more efficient than standard air conditioning and require an enormous amount of area. 20 holes at a depth of 50 ft would handle the cooling requirements for a large residential home, but wouldn't even approach the requirements for a data center.
One related possibility is to drill to a nearby aquifer and draw cool water, run it through a heat exchanger, then exhaust warm water into the same aquifer. Unfortunately, water overuse has drained aquifers to the point that even the pumping costs would be substantial, and the aquifers will eventually be drained to the point that vacuum-based pumps can no longer draw water.
rkcth - Tuesday, February 11, 2014 - link
They are a lot more efficient at heating, but only mildly more efficient at cooling. They are also really storing heat in the ground in the summer and taking it back in the winter, so if you only ever store heat you can actually have a problem long-term. You're essentially using the ground as a long-term heat storage device, since the ground is between 50 and 60 degrees depending on your area of the country, but use of the geothermal system changes that temperature. An air source makes much more sense, since you share the air with everyone else and the heat essentially just blows away.
biohazard918 - Tuesday, February 11, 2014 - link
Wells don't use vacuum-based pumps; most aquifers are much too deep for that. Instead, you stick the pump in the bottom of the well and push the water to the surface.
lwatcdr - Thursday, February 20, 2014 - link
Here in South Florida it would probably be cheaper. The water table is very high and many wells are only 35 feet deep.
rrinker - Tuesday, February 11, 2014 - link
It's been done already. I know I've seen it in an article on new data centers in one industry publication or another.
A museum near me recently drilled dozens of wells under their parking lot for geothermal cooling of the building. Being large with lots of glass area, it got unbearably hot during the summer months. Now, while it isn't as cool as you might set your home air conditioning, it is quite comfortable even on the hottest days, and the only energy used is for the water pumps and fans. Plus it's better for the exhibits, reducing the yearly variation in temperature and humidity. Definitely a feasible approach for a data center.
noeldillabough - Tuesday, February 11, 2014 - link
I was actually talking about this today; the big cost for our data centers is air conditioning. What if we had a building up north (Arctic) where the ground is always frozen, even in summer? Geothermal cooling for free, by pumping water through your "radiator".
Not sure about the environmental impact this would have, but the emptiness that is the Arctic might like a few data centers!
superflex - Wednesday, February 12, 2014 - link
The enviroweenies would scream about you defrosting the permafrost.
Some slug or bacteria might become endangered.
evonitzer - Sunday, February 23, 2014 - link
Unfortunately, the cold areas are also devoid of people, and therefore of internet connections. You'll have to figure in the cost of running fiber to your remote location, as well as how the distance might affect latency. If you go into a permafrost area there are additional complications, as constructing on permafrost is a challenge. A data center high in the mountains but close to population centers would seem a good compromise.
fluxtatic - Wednesday, February 12, 2014 - link
I proposed this at work, but management stopped listening somewhere between me saying we'd need to put a trench through the warehouse floor to outside the building, and that I'd need a large, deep hole dug right next to the building, where I would bury several hundred feet of copper pipe.
I also considered using the river that's 20' from the office, but I'm not sure the city would like me pumping warm water into their river.
Varno - Tuesday, February 11, 2014 - link
You seem to be reporting the junction temperature, which is reported by most measurement programs, rather than the cast temperature, which is impossible to measure directly without interfering with the results. How have you accounted for this in your testing?
JohanAnandtech - Tuesday, February 11, 2014 - link
Do you mean case temperature? We did measure the outlet temperature, but it was significantly lower than the junction temperature. For the Xeon 2697 v2, it was 39-40°C at 35°C inlet and 45°C at 40°C inlet.
Kristian Vättö - Tuesday, February 11, 2014 - link
Google's usage of raw seawater for cooling at their data center in Hamina, Finland is pretty cool IMO. Given that the specific heat capacity of water is much higher than air's, it's more efficient for cooling, especially in our climate where seawater is always relatively cold.
JohanAnandtech - Tuesday, February 11, 2014 - link
I admit, I somewhat ignored the Scandinavian data centers, as "free cooling" is a bit obvious there. :-)
I thought some readers would be surprised to find out that even in sunny California, free cooling is available most of the year.
ShieTar - Tuesday, February 11, 2014 - link
I think you oversimplify if you judge the efficiency of the cooling method just by the heat capacity of the medium. The medium is not a heat battery that only absorbs the heat; it also has to be moved in order to transport the energy. And moving air is much easier and much more efficient than moving water.
So I think in the case of Finland the driving factor is that they will get air temperatures of up to 30°C in some summers, but the water temperature in the bottom regions of the Gulf of Finland stays below 4°C throughout the year. If you were to consider a data center near the river Nile, which is usually just 5°C below air temperature, and frequently warmer than the air at night, then your efficiency equation would look entirely different.
Naturally, building the center in Finland instead of Egypt in the first place is a pretty good decision considering cooling efficiency.
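The trade-off being debated here follows from Q = m_dot * c_p * dT. A small sketch of the flows needed to carry away an assumed 1 MW load with an assumed 10 K coolant temperature rise:

```python
# Mass and volume flow needed to remove 1 MW with a 10 K temperature rise.
# The 1 MW load and the 10 K rise are illustrative assumptions.
Q, dT = 1_000_000, 10.0                    # W, K
media = {"water": (4186.0, 998.0),         # c_p [J/(kg*K)], density [kg/m^3], ~20 C
         "air":   (1005.0, 1.2)}           # c_p [J/(kg*K)], density [kg/m^3], ~20 C, sea level

for name, (cp, rho) in media.items():
    m_dot = Q / (cp * dT)                  # required mass flow, kg/s
    v_dot = m_dot / rho                    # required volume flow, m^3/s
    print(f"{name:5s}: {m_dot:6.1f} kg/s  ({v_dot:7.3f} m^3/s)")
# water:   23.9 kg/s (  0.024 m^3/s)    air:   99.5 kg/s ( 82.917 m^3/s)
```

Water needs a few thousand times less volume flow for the same load, which is the heat-capacity argument; the counter-argument above is about the pressure drop and pumping effort of pushing that water through narrow tubing.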
icrf - Tuesday, February 11, 2014 - link
Isn't moving water significantly more efficient than moving air, because a significant amount of the energy spent trying to move air goes into compressing it rather than moving it, whereas water is largely incompressible?
ShieTar - Thursday, February 13, 2014 - link
For the initial acceleration this might be an effect, though energy used for compression isn't necessarily lost, as the pressure difference will decay via motion of the air again (though maybe not in the preferred direction). But if you look at the entire equation for a cooling system, the hard part is not getting the medium accelerated, but keeping it moving against the resistance of the coolers, tubes and radiators. And water has much stronger interactions with any reasonably used material (metal, mostly) than air does. And you usually run water through smaller and longer tubes than air, which can quickly be moved from the electronics case to a large air vent. Also, the viscosity of water itself is significantly higher than that of air, specifically if we are talking about cool water not too far above the freezing point, i.e. 5°C to 10°C.
easp - Saturday, February 15, 2014 - link
Below Mach 0.3, air flows can be treated as incompressible. I doubt bulk movement of air in data centers hits 200+ mph.
juhatus - Tuesday, February 11, 2014 - link
Sir, I can assure you the Nordic Sea hits ~20°C in the summers. But still, that temperature is good enough for cooling.
In Helsinki they are now collecting the excess heat from a data center to warm up the houses in the city area. So that too should be considered. I think many countries could use some "free" heating.
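To put a rough number on the district-heating point: essentially all of the electricity a data center draws leaves it as low-grade heat, so even a modest facility is a large year-round heat source. The 10 MW figure below is an assumed example, not a number from the article:

```python
# Annual low-grade heat from an assumed 10 MW facility, treating all input
# electricity as heat that could in principle be captured for district heating.
P_MW, hours_per_year = 10, 8760
print(f"~{P_MW * hours_per_year / 1000:.0f} GWh of heat per year")   # ~88 GWh
```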
Penti - Tuesday, February 11, 2014 - link
Surface temp does, but below the surface it's cooler, even in small lakes and rivers; otherwise our drinking water would be unusable and come out of the tap at 25°C. You would get legionella and stuff then. In Sweden the water is not allowed to be, or is not considered to be, usable over 20 degrees at the inlet, or out of the tap for that matter. Lakes, rivers and oceans can hold 2-15°C at the inlet year-round here in Scandinavia if the inlet is appropriately placed. Certainly good enough if you allow temps over the old 20-22°C.
Guspaz - Tuesday, February 11, 2014 - link
OVH's datacentre here in Montreal cools using a centralized watercooling system and relies on convection to remove the heat from the server stacks, IIRC. They claim a PUE of 1.09.
iwod - Tuesday, February 11, 2014 - link
Exactly what I was about to post. Why haven't Facebook, Microsoft and even Google managed to outpace them? A PUE of 1.09 is still, as far as I know, an industry record. Correct me if I am wrong.
I wonder if they could get it down to 1.05.
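Since PUE is simply total facility power divided by IT power, the claimed 1.09 (and the hoped-for 1.05) translate directly into overhead; the 1 MW IT load below is an assumed example:

```python
# Overhead power implied by a PUE figure, for an assumed 1 MW IT load.
def overhead_kw(pue, it_kw=1_000):
    # PUE = total facility power / IT power  =>  overhead = (PUE - 1) * IT
    return (pue - 1) * it_kw

for pue in (1.09, 1.05):
    print(f"PUE {pue}: ~{overhead_kw(pue):.0f} kW of cooling and power-delivery overhead")
# PUE 1.09: ~90 kW per MW of IT;  PUE 1.05: ~50 kW per MW of IT
```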
Flunk - Tuesday, February 11, 2014 - link
This entire idea seems so obvious it's surprising they haven't been doing this the whole time. Oh well, it's hard to beat an idea that cheap and efficient.
drexnx - Tuesday, February 11, 2014 - link
There's a lot of work being done on the UPS side of the power-consumption coin too - FB uses both Delta DC UPSes that power their equipment directly with DC from the batteries, instead of the wasteful invert-to-480 VAC three-phase and then rectify-again-at-the-server-PSU approach, and Eaton equipment with ESS that bypasses the UPS until there's an actual power loss (for about a 10% efficiency pickup when running on mains power).
extide - Tuesday, February 11, 2014 - link
Yeah, there is a lot of movement in this these days, but the hard part of doing it is that at the low voltages used in servers (<=24 V) you need a massive amount of current to feed several racks of servers, so you need massive power bars, and of course you can lose a lot of efficiency on that side as well.
drexnx - Tuesday, February 11, 2014 - link
AFAIK, the Delta DC stuff is all 48 V, so a lot of the old telecom CO equipment is already tailor-made for use there.
But yes, you get to see some pretty amazing buswork as a result!
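The "massive amount of current" point is straightforward Ohm's-law arithmetic; the rack power, comparison voltage and busbar resistance below are assumed illustrative values, not figures from the thread:

```python
# Current drawn and resistive loss for an assumed 10 kW rack, fed either at
# 48 V DC or at 400 V, across an assumed 1 milliohm of bus/cable resistance.
P_W, R_ohm = 10_000, 0.001

for V in (48, 400):
    I = P_W / V                       # amps at this distribution voltage
    loss_W = I**2 * R_ohm             # I^2 * R loss in the bus
    print(f"{V:>3} V: {I:6.1f} A, ~{loss_W:4.1f} W lost per milliohm of bus")
# 48 V: ~208 A and ~43 W;  400 V: 25 A and ~0.6 W
```

Hence the heavy buswork: the win from DC plants is mostly in skipping conversion stages, while the distribution itself gets harder as the voltage drops.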
Ikefu - Tuesday, February 11, 2014 - link
Microsoft is building a massive data center in my home state just outside Cheyenne, WY. I wonder why more companies haven't done this yet? It's very dry, and days above 90°F are few and far between in the summer. Seems like an easy cooling solution versus all the data centers in places like Dallas.
rrinker - Tuesday, February 11, 2014 - link
Building in the cooler climes is great - but you also need the networking infrastructure to support said big data center. Heck, for free cooling, build the data centers in the far frozen reaches of Northern Canada, or in Antarctica. Only, how will you get the data to the data center?
Ikefu - Tuesday, February 11, 2014 - link
It's actually right along the I-80 corridor that connects Chicago and San Francisco. Several major backbones run along that route, and it's why many mega data centers in Iowa are also built along I-80. Microsoft and the NCAR Yellowstone supercomputer are there, so the large pipe is definitely accessible.
We've used free cooling in our small datacenter since 2007. It's very effective from September to April here in Denmark.
beginner99 - Tuesday, February 11, 2014 - link
That map of Europe is certainly plain wrong. Spain especially, but also Greece and Italy, easily have some days above 35. It also happens a couple of days per year where I live, a lot further north than any of those.
ShieTar - Thursday, February 13, 2014 - link
Do you really get 35°C, in the shade, outside, for more than 260 hours a year? I'm sure it happens for a few hours a day in the two hottest months, but the map does cap out at 8500 out of 8760 hours.
juhatus - Tuesday, February 11, 2014 - link
What about wear & tear from running the equipment at hotter temperatures? I remember seeing a chart where higher temperature = shorter life span. I would imagine the OEMs have engineered a bit of margin for this, and warranties aside, it should be basic physics?
zodiacfml - Wednesday, February 12, 2014 - link
You just need constant temperature and equipment that works at that temperature. Wear and tear happens significantly at temperature changes.
bobbozzo - Tuesday, February 11, 2014 - link
"The main energy gobblers are the CRACs"Actually, the IT equipment (servers & networking) use more power than the cooling equipment.
ref: http://www.electronics-cooling.com/2010/12/energy-...
"The IT equipment usually consumes about 45-55% of the total electricity, and total cooling energy consumption is roughly 30-40% of the total energy use"
Thanks for the article though.
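Those quoted fractions convert directly into PUE, since PUE is total facility energy divided by IT energy:

```python
# PUE implied by "IT equipment usually consumes about 45-55% of the total electricity".
for it_share in (0.45, 0.50, 0.55):
    print(f"IT at {it_share:.0%} of total -> PUE ~ {1 / it_share:.2f}")
# 45% -> 2.22, 50% -> 2.00, 55% -> 1.82
```

That is the typical-facility picture; the 1.1-1.15 class figures discussed elsewhere in the comments are what aggressive free cooling gets you.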
JohanAnandtech - Wednesday, February 12, 2014 - link
That is the whole point, isn't it? IT equipment uses power to be productive; everything else is supporting the IT equipment and is thus overhead that you have to minimize. Of the facility power, the CRACs are the most important power gobblers.
bobbozzo - Tuesday, February 11, 2014 - link
So, who is volunteering to work in a datacenter with 35-40°C cool aisles and 40-45°C hot aisles?
Thud2 - Wednesday, February 12, 2014 - link
80,000x, that sounds like a lot.
CharonPDX - Monday, February 17, 2014 - link
See also Intel's long-term research into it, at their New Mexico data center: http://www.intel.com/content/www/us/en/data-center...
puffpio - Tuesday, February 18, 2014 - link
On the first page you mention: "The "single-tenant" data centers of Facebook, Google, Microsoft and Yahoo that use "free cooling" to its full potential are able to achieve an astonishing PUE of 1.15-1."
This article says that Facebook has achieved a PUE of 1.07 (https://www.facebook.com/note.php?note_id=10150148...
lwatcdr - Thursday, February 20, 2014 - link
So I wonder when Google will build a data center in, say, North Dakota. Combine the ample wind power with the cold and it looks like a perfect place for a green data center.
Kranthi Ranadheer - Monday, April 17, 2017 - link
Hi guys,
Does anyone by chance have recorded data of temperature and processor speed in a server room? Or can someone give me the high-end and low-end values measured in any server room, with respect to the relationship between temperature and processor speed?
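No such dataset is published in the article or in this thread, so it would have to be logged on the machines in question. As a minimal sketch only (it assumes a Linux host with the third-party psutil package installed, and sensor names vary by platform), something like this could record temperature against CPU frequency:

```python
# Minimal logger: CPU temperature vs. current CPU frequency, sampled periodically.
# Assumes Linux with the third-party psutil package; adapt the sensor lookup
# to whatever sensors_temperatures() reports on the machine in question.
import time
import psutil

def sample():
    temps = psutil.sensors_temperatures()                 # e.g. {'coretemp': [...]}
    temp = next(iter(temps.values()))[0].current if temps else float("nan")
    freq = psutil.cpu_freq()                              # aggregate frequency in MHz
    return temp, freq.current if freq else float("nan")

if __name__ == "__main__":
    for _ in range(12):                                   # roughly one minute of samples
        t, f = sample()
        print(f"{time.strftime('%H:%M:%S')}  temp={t:.1f}C  freq={f:.0f}MHz")
        time.sleep(5)
```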