Cooling for a small server room

Solution 1:

I would use a portable air conditioning unit that consumes its own condensation and keeps the room at 50% relative humidity. Some people argue that dew point is a better metric, but I haven't looked into it deeply enough and am willing to be corrected. Some A/C units push their condensation through a small hose that snakes inside the unit's exhaust hose, so the water evaporates into the hot, dry output air and is carried out through the plenum and exhaust system. You could also get an A/C unit with condensation tanks and just remember to empty them every day. Annoying, yes. But sometimes an admin's gotta do what an admin's gotta do.

Either way, you must have exhaust. You can run a duct hose from the A/C's output into the plenum space or near the outtake vent. That might not be sufficient, though. IMO, the A/C's hot exhaust is your biggest problem here. In winter it can even be a perk: you can barter with different departments over which cubicle the exhaust hose runs to each week. :)

Plenty of companies make portable cooling units designed for permanent use in server rooms. Some of them include:

  • Atlas
  • Topaz
  • MovinCool (I think Atlas bought MovinCool, or the other way around, or something else. There seems to be some relationship between the two companies that I haven't pinned down yet)

I blogged about portable A/C units a little while back on my old blog here: http://thenonapeptide.blogspot.com/2009/12/list-of-portable-cooling-units.html

Solution 2:

You really don't want to be stuffing about with this sort of thing or relying on guesswork. Just for starters I suggest you get that ceiling finished off and fitted with an exhaust fan or two.

For sizing the A/C, I suggest talking to the A/C people, who know a lot more about this stuff than we do. Tell them the load you're currently running, plus a bit more than the maximum you can reasonably expect it to ever reach. Don't forget to mention that it needs to be running 24/7. I actually prefer to have multiple units, each of which can handle the normal load on its own, even if it's straining to do so. That way you can more easily deal with a failed unit.

In simple terms, the electricity your gear consumes exits as heat. Work out, or measure if you have the gear, how much you're using. If in doubt get an electrician to measure it for you. It's a small investment that can save you a lot of money later.
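
To put a rough number on that, here's a minimal sizing sketch in Python. The 1 W ≈ 3.412 BTU/hr conversion is standard; the 25% headroom figure and the function name are illustrative assumptions on my part, not anything from this thread, and your A/C contractor will have their own rules of thumb.

    # Rough A/C sizing: IT load in watts -> cooling capacity to ask for, in BTU/hr.
    # Assumption: essentially all power the gear draws ends up as heat in the room.

    WATTS_TO_BTU_HR = 3.412  # standard conversion factor

    def cooling_needed(measured_watts: float, headroom: float = 0.25) -> float:
        """Return the cooling capacity (BTU/hr) to quote to the A/C people."""
        heat_btu_hr = measured_watts * WATTS_TO_BTU_HR   # consumed power exits as heat
        return heat_btu_hr * (1 + headroom)              # pad for growth and hot days

    if __name__ == "__main__":
        # Example: a rack measured at a steady 2 kW draw.
        print(f"{cooling_needed(2000):,.0f} BTU/hr")     # roughly 8,500 BTU/hr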


Solution 3:

I've used one of the portable units in the past, and during peak temperatures we were able to keep things about 20 degrees cooler. We also put some insulation on top of the drop ceiling, which seemed to help quite a bit. On a side note, you're lucky to have no windows - this "server room" had an entire wall of them, which we went to great lengths to block off, as they were a huge source of additional heat.


Solution 4:

Interesting points to consider:

(I'm not sure why, but I seem to be the counterpoint to Chris S on many things. Nothing personal, Chris; it's just that my experience seems to run counter to yours. I'm a fellow old guy, but man, we must come from opposite universes.)

  • A 750 VA UPS, even if fully loaded with an army of toasters, does NOT require a ton of cooling. Even by the most aggressive estimate you might need 2800 BTU/hr to dump the heat load (see the quick check after this list). Now, if you're running your data center in a solar collector, YMMV, but most folks don't do that. Wanna spend money on the assumption that you'll go back to circa-1999 data centers? Do you secretly hope that your firm will need to become a protein-folding rendering farm and Viennese pastry bakery? Go ahead and sink a fortune into cooling.

  • People running old-school data centers (I have several that date back to the 1980s) often discover, after the expensive 5-ton cooling system fails, that they can actually cool their data center with a couple of ~5000 W wall-mounted A/C units and pay substantially less for cooling, because EER ratings have grown quite a bit - even switching to 100 VAC from 208.

  • You're better off with two A/C units than one in most cases if you really care about reliability. The life span of consumer equipment is often 70% or more of that of commercial equipment. I'm a big, BIG fan of split wall mounts; if you have outside access, those wall mounts do a fair job - especially if they have economy-cool options.

  • You might take into account the power of in-house cooling/ventilation. In many cases, having adequate ventilation dramatically lowers your cost of cooling. A big storage room with a rack of gear is better than a small room, because convection will dump your heat load. However, assisting with building A/C or circulating air is a good idea, if possible. Even fans can make a huge difference.

  • You don't have to run at -34F. In fact, most equipment does just fine at 68 to 74F. There are reasons to run an ice cream shop/data center, but 90% of commercial users don't need it. In fact, as long as you stay below ~79F, you'll probably be in good shape. That is adequate for the typical 5-10 box Windows clusterz that most small-to-medium businesses run. If you have an army of undead MCSEs and your volume licensing costs approach parity with the IT director's salary, you might as well waste money on providing overkill HVAC.
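
For the UPS bullet above, here's a quick back-of-envelope check in Python. The 3.412 BTU/hr-per-watt conversion is standard; the power factor and UPS efficiency values are assumptions of mine for illustration, so treat the output as a ballpark rather than a spec.

    # Sanity check of the "fully loaded 750 VA UPS needs roughly 2800 BTU/hr" claim.
    # Power factor and UPS efficiency below are assumed values, not measured ones.

    WATTS_TO_BTU_HR = 3.412

    def ups_heat_btu_hr(va_rating: float, power_factor: float = 1.0,
                        ups_efficiency: float = 0.92) -> float:
        """Worst-case heat load (BTU/hr) from a fully loaded UPS and its attached gear."""
        load_watts = va_rating * power_factor      # real power drawn by the load
        total_watts = load_watts / ups_efficiency  # plus the UPS's own conversion losses
        return total_watts * WATTS_TO_BTU_HR

    if __name__ == "__main__":
        print(f"{ups_heat_btu_hr(750):,.0f} BTU/hr")  # about 2,780 BTU/hr - close to the ~2800 figure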

The single exception: if your electrical wiring and panel aren't really cut out for your needs (as in you might need to arc weld or run an Exchange server, for instance), keeping the room below 85F is a good idea - breakers typically trip based on thermal expansion, and a breaker in a hot panel will trip sooner than it otherwise would.

I'm OK with 74F, but I wouldn't go much above 80F in a data center - poorly built electronic equipment (and that's basically everything built by every manufacturer in the last two decades) may fail or become flaky above 80 degrees. The old idea of keeping everything at -60F dates from even crappier equipment - things built back when tubes and selenium components were common. That gear would eat itself if it got a little hot. Nowadays, it just stops working correctly.


Solution 5:

Our server room is about twice as big and has about twice as many servers. When our A/C unit (which is also a humidifier) dies - which is getting fairly frequent, as it's 20 years old - we open the door between the server room and our adjacent workroom and either put a big fan in the doorway to blow cooler air in, or put a portable A/C unit in the room with its exhaust vented into the adjacent room.

Neither of those is a good long-term solution; they make me think of this classic article about flawed quick fixes. The best solution would be to hook the server room up to the building's cold-air supply, but you commented that you didn't think ductwork was possible.

Other than that, Wesley beat me to suggesting a portable unit; that's probably your only option, as long as you can duct the exhaust properly.