Dehumidification Design Using Data-Center Heat

March 1, 2008
Utilizing "free" heat from a server room was the concept behind an air-handling system in a college-campus building.

Considerable energy savings are possible with an innovative approach to HVAC dehumidification design that uses the typical network/server room or data center in a campus building as a source of “free” heat. Just a few racks in the network room of a campus building, such as those in the Bush Science Center at Rollins College in Winter Park, Fla., represent enough heat to provide substantial dehumidification. A secondary benefit is that more airflow and cooling are available for a data room than typically is provided, a bonus that delights even the most skeptical information-technology manager.

What's more, this dehumidification strategy may help earn credits in the Optimize Energy Performance, Thermal Comfort, and/or Innovation in Design categories of the U.S. Green Building Council's Leadership in Energy and Environmental Design (LEED) for New Construction (LEED-NC) and LEED for Existing Buildings (LEED-EB) Green Building Rating Systems.

CONCEPTUAL

The strategy for achieving dehumidification, energy savings, and LEED credits using this design concept is straightforward: Capture data-center equipment heat and utilize it for reheat. However, success depends on practical execution. The heat produced by data-processing equipment is fairly constant and predictable year-round. The heat from just one computer-server rack can be anywhere from 4 to more than 12 kw (roughly 14 to 41 mbh). Some experts estimate that the rise of high-density environments and designs for future equipment will result in an output of 30 to 50 kw per rack (100 to 170 mbh).

Reheat of just a few degrees may be all that is required for good humidity control in a mixed-air application. Every 10 kw of server heat (about one rack) is sufficient to raise the temperature of more than 5,000 cfm of airflow by 6°F, from a leaving-coil condition of 54°F/53°F dry bulb/wet bulb to 60°F/55.5°F dry bulb/wet bulb. In terms of dehumidification capability, this 10 kw of server heat per 5,000 cfm will lower the sensible-heat ratio (SHR) by about 10 percent, depending on space-air and entering-air conditions. With room air at 76°F/55-percent relative humidity, SHR is reduced from 0.69 to 0.62 with 6°F of reheat. As data-center equipment is upgraded, the kilowatt density likely will increase, making this strategy ever more effective.
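For readers who want to check the arithmetic, the short Python sketch below reproduces the figures above using the standard-air factors 1.08 and 0.68 at sea level. The humidity ratios are approximate psychrometric-chart readings for the stated conditions, not values taken from the project, so treat the results as illustrative.

```python
# Illustrative check of the reheat arithmetic above. Standard air at sea level
# is assumed; humidity ratios (grains/lb) are approximate chart readings.

KW_TO_BTUH = 3412        # Btu/h per kW
C_SENSIBLE = 1.08        # Btu/h per cfm per deg F (standard air)
C_LATENT = 0.68          # Btu/h per cfm per grain/lb of moisture (standard air)

def reheat_delta_t(server_kw, cfm):
    """Temperature rise (deg F) from rejecting server heat into an airstream."""
    return server_kw * KW_TO_BTUH / (C_SENSIBLE * cfm)

def delivered_shr(room_db, supply_db, room_w_gr, supply_w_gr):
    """Sensible-heat ratio of the cooling delivered to the space, per cfm."""
    sensible = C_SENSIBLE * (room_db - supply_db)
    latent = C_LATENT * (room_w_gr - supply_w_gr)
    return sensible / (sensible + latent)

dt = reheat_delta_t(10, 5000)                    # ~6.3 deg F of reheat
shr_cold = delivered_shr(76, 54, 74, 58)         # ~0.69 (no reheat)
shr_warm = delivered_shr(76, 54 + dt, 74, 58)    # ~0.61, close to the 0.62 cited
print(f"dT = {dt:.1f} F, SHR {shr_cold:.2f} -> {shr_warm:.2f}")
```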

Applications with higher occupant densities and outside-air fractions could require closer to 10°F of reheat. Each typical 10-kw server rack will provide enough reheat for 3,200 cfm of supply air and lower SHR by 20 percent. With room air at 76°F/55-percent relative humidity, SHR is reduced from 0.69 to 0.55 with 10°F of reheat. For the most demanding dehumidification loads with humid outside air, every 10 kw of rejected server-rack heat will provide nearly 16°F of reheat per 2,000 cfm. A nearly neutral supply-air condition of 70°F/59.3°F dry bulb/wet bulb is provided from the 54°F/53°F dry bulb/wet bulb leaving-coil condition. With room air at 76°F/55-percent relative humidity, SHR is reduced by 45 percent, from 0.69 to 0.38, with this 16°F of reheat.
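The same relationship can be turned around to estimate how much supply air one rack can reheat to a chosen temperature rise. The sketch below simply inverts the sensible-heat equation under the same standard-air assumption; the three cases correspond to the 6°F, 10°F, and roughly 16°F scenarios discussed above.

```python
# Sketch: supply airflow one 10-kw rack can reheat to a chosen temperature rise.

KW_TO_BTUH = 3412      # Btu/h per kW
C_SENSIBLE = 1.08      # Btu/h per cfm per deg F (standard air)

def cfm_per_rack(rack_kw, delta_t_f):
    """Supply airflow (cfm) that rack_kw of reheat will raise by delta_t_f."""
    return rack_kw * KW_TO_BTUH / (C_SENSIBLE * delta_t_f)

for dt in (6, 10, 16):
    print(f"{dt:>2} F of reheat: ~{cfm_per_rack(10, dt):,.0f} cfm per 10-kw rack")
# Prints roughly 5,270, 3,160, and 1,980 cfm -- in line with the 5,000+,
# 3,200, and 2,000 cfm figures discussed above.
```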

Therefore, a project's server room or data center — preferably located near an occupied space or air handler serving the space that needs added dehumidification — might be a good application for this concept. If the size of the conditioned space is approximately 2,500 to 6,500 sq ft per 10-kw server rack, take a closer look at how the air handler and ductwork could be configured.
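As a quick screening aid, that rule of thumb can be written as a simple check of conditioned floor area per 10 kw of server heat. The function below is purely illustrative; the 2,500- and 6,500-sq-ft limits come from the guideline above, and the example inputs are hypothetical.

```python
# Screening check for the rule of thumb above: roughly 2,500 to 6,500 sq ft of
# conditioned space per 10 kW of server heat suggests the concept is worth
# a closer look. Purely illustrative.

def worth_a_closer_look(conditioned_sqft, total_server_kw,
                        low=2500.0, high=6500.0):
    """Flag whether the space-to-server-heat ratio falls in the suggested range."""
    sqft_per_10kw = conditioned_sqft / (total_server_kw / 10.0)
    return low <= sqft_per_10kw <= high

# Hypothetical example: 20,000 sq ft of conditioned space served alongside
# 50 kW of server load works out to 4,000 sq ft per 10 kW -- within the range.
print(worth_a_closer_look(20000, 50))   # True
```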

IMPLEMENTATION

Capturing and utilizing “free” heat, rather than rejecting it to the outdoors, was the design basis for the air-handling system at the Bush Science Center. The building houses a high-tech classroom auditorium complete with dual rear-projection screens and several wall-hung liquid-crystal-display monitors. One floor below is a data center that has six server racks with a total power rating of 50.2 kw and future plans for five additional racks. The data center originally was cooled with a 15-ton direct-expansion (DX) system that now is used for backup. Problems in the building included excess humidity in the auditorium and an ever-increasing heat load in the data-server room.

Rollins College's facilities director, Scott Bitikofer, asked for a simple, energy-efficient solution that would handle both the cooling and dehumidification loads. The new air-handler modules are in a traditional single-fan hot-deck/cold-deck configuration: mixed air from the data center flows through the hot deck and blends with chilled-water-coil leaving air as needed to achieve comfort conditions. The auditorium is served by a 10,500-cfm modular air-handling unit (AHU) with a variable-frequency drive (VFD). The system is supplied with chilled water from a campus central plant.
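The blending itself is simple energy-balance arithmetic. The sketch below estimates what fraction of the supply air must come from the hot deck to hit a target supply-air temperature; the 95°F hot-aisle temperature is an assumed value for illustration, not a figure from the Rollins College project.

```python
# Sketch of the hot-deck/cold-deck blend: what fraction of supply air must come
# from the hot deck (data-center return) to reach a target supply temperature?
# The 95 F hot-aisle temperature is an assumption for illustration only.

def hot_deck_fraction(t_target, t_cold, t_hot):
    """Fraction of supply air drawn from the hot deck, from a simple energy balance."""
    frac = (t_target - t_cold) / (t_hot - t_cold)
    return min(max(frac, 0.0), 1.0)   # clamp to the physically possible range

# 54 F air off the chilled-water coil, an assumed 95 F hot aisle, 60 F target:
print(f"{hot_deck_fraction(60, 54, 95):.0%} of supply air from the hot deck")  # ~15%
```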

If static-pressure relationships had not allowed use of a single fan, a dual-fan configuration would have been used. The dual-fan configuration could have provided more dehumidification capability and airflow flexibility, but the two fans, associated VFDs, and controls would have increased the installation costs. For the Rollins College application, value engineering indicated that a single-fan configuration provided adequate performance and was a better choice economically.

CONTROLS

Temperature control in the auditorium of the Bush Science Center is provided by a thermostatic sensor that first modulates the variable-air-volume (VAV) box, as in a conventional system. Under low loads, the VAV box tends to close, so a carbon-dioxide (CO2) sensor in the auditorium overrides the damper position to ensure enough fresh air is delivered. The VAV box modulates open and closed to maintain a differential CO2 level of less than 700 ppm between the indoors and outdoors. If the ventilation override opens the VAV box far enough to overcool the auditorium, the hot-deck and cold-deck dampers in the AHU modulate to meet the auditorium set-point temperature by mixing data-center air with the supply air.
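That sequence can be summarized in a few lines of code. The sketch below is not the project's actual building-automation program; only the 700-ppm differential comes from the sequence above, and the gains, limits, and example values are illustrative assumptions (a real sequence would modulate the box toward the CO2 set point rather than drive it fully open).

```python
# Simplified sketch of the auditorium VAV/CO2 sequence described above.
# Not the project's BAS program; values other than 700 ppm are assumed.

CO2_DIFF_LIMIT = 700   # ppm above outdoor level (from the sequence above)

def vav_damper_command(cooling_demand, room_co2, outdoor_co2, min_pos=0.2):
    """Return a damper position (0-1): cooling demand drives the box,
    but a high CO2 differential overrides it open for ventilation."""
    position = max(min_pos, min(cooling_demand, 1.0))   # normal cooling control
    if room_co2 - outdoor_co2 > CO2_DIFF_LIMIT:
        position = 1.0                                  # ventilation override
    return position

# Low cooling demand but a 750-ppm CO2 differential drives the box open:
print(vav_damper_command(cooling_demand=0.1, room_co2=1150, outdoor_co2=400))  # 1.0
```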

Humidity control is activated when the auditorium's relative humidity exceeds 60 percent. Excess humidity causes the hot-deck damper to modulate open, increasing the temperature and lowering the relative humidity of the auditorium's supply air. The temperature of the supply air flowing to the auditorium can range from 54°F (with no mixing of warm and chilled air) to higher than 80°F (with full chilled-water-coil bypass). This is achieved without the use of “new” energy for reheat, using only the heat collected from the computer-server equipment. The supply-fan VFD speed is varied to achieve the duct-static-pressure set point, and the supply-air leaving temperature is set according to a reset schedule, as in a conventional configuration. Smoke detectors in the supply-air ducts, data-server room, and underfloor air plenum de-energize the supply fan and close the hot-aisle ceiling return fire dampers.
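In the same spirit, the humidity override can be sketched as a supply-air-temperature reset. The 60-percent trigger and the 54°F-to-80°F supply-air range come from the description above; the proportional gain is an illustrative assumption.

```python
# Simplified sketch of the humidity override described above: when auditorium
# relative humidity exceeds 60 percent, the supply-air temperature is reset
# upward by opening the hot-deck damper. The gain value is assumed.

RH_LIMIT = 60.0                  # percent (from the sequence above)
SAT_MIN, SAT_MAX = 54.0, 80.0    # deg F supply-air temperature range (from above)
GAIN = 2.5                       # deg F of reset per percent RH above the limit (assumed)

def supply_air_setpoint(room_rh, base_setpoint=SAT_MIN):
    """Reset the supply-air temperature upward when the space is too humid."""
    if room_rh <= RH_LIMIT:
        return base_setpoint
    return min(base_setpoint + GAIN * (room_rh - RH_LIMIT), SAT_MAX)

print(supply_air_setpoint(58))   # 54.0 -- humidity satisfied, no reheat
print(supply_air_setpoint(64))   # 64.0 -- 10 F of "free" reheat requested
```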

Temperature in the data-center office is controlled via a plenum-mounted fan-powered mixing box that mixes cold supply air from the third AHU (AHU-3) with return air from the data-server hot aisle to meet the office temperature set point. To reduce the number of potential failure points, there is no temperature control in the data-server area — fully chilled air is delivered at all times directly from the air handler to the underfloor plenum — and the VFD is equipped with an external bypass in case of failure. The original 15-ton DX system is used as a backup controlled by a simple wall thermostat set to 78°F. A humidity sensor in the data-server area is used to alarm the facility manager via pager and e-mail if humidity is outside of the 30- to 70-percent range, which would indicate an under- or overcooling situation.

The data-center return-air plenum connects to the auditorium return duct via the air-handler filter-mixing module, so potential sound pathways from server fans and power supplies had to be addressed. Sound levels in the server room reach 79 dB when the fans are at full speed, while the auditorium classroom has a design criterion of Noise Criteria (NC) 30. An acoustic baffle and a 2-in.-thick sound liner with a noise-reduction coefficient (NRC) of 0.95 in the return plenum achieve noise control in the auditorium return-air path, and two sound attenuators isolate the auditorium ductwork from data-server noise.

CONCLUSION

Exhaust from the hot aisle of a server room can be utilized effectively for reheat when required, with the available capacity depending on the number and size of the servers in a project. It is important to study the airflow requirements and SHR of each space conditioned by a common air-conditioning system under various load conditions to evaluate alternatives and arrive at an optimal design. Higher supply-airflow rates in a server room result in more uniform cooling of the equipment, and the heat picked up by that air can then be used effectively for humidity control in an adjacent space. Even though humidity control can be achieved by varying airflow rates and maintaining a constant supply-air temperature, the minimum supply airflow rates dictated by ventilation codes demand some form of reheat for comfort control, especially in hot and humid climates.

Based on the cost, space limitations, and ductwork constraints of the existing systems, mixed-air bypass was selected for the Rollins College project. Return-air bypass with an additional fan also can be used effectively with this approach for better humidity control in new designs. The mixed-air bypass system has been in successful operation for more than a year, meeting the temperature- and humidity-control requirements of the auditorium and data center. Utilizing a well-defined sequence of operations and a dedicated control system is key to achieving success in such designs.

A member of HPAC Engineering's Editorial Advisory Board, Mike West, PhD, PE, has 12 years of experience as a principal with Advantek Consulting Inc. Kannan Rengarajan, PE, has more than 25 years of design experience, including 10 years as a principal and founding senior mechanical engineer with Cape Design Engineering Co.