Outdoor air in data centers

Buildings leak air. Occasionally this leakage produces an unintended benefit, such as providing additional ventilation air for occupants. More often, however, uncontrolled leakage degrades indoor temperature and humidity control and must be accounted for in the design process.

01/01/2008


Contents:
What is NEBS?
Location plays a key role
Using outside air (Web exclusive)


Engineers who design HVAC systems for data centers understand that computers require an environment in which temperature and humidity are maintained in accordance with the computer manufacturers’ recommendations, ASHRAE guidelines (“Thermal Guidelines for Data Processing Environments” developed by the Mission Critical Facilities Technical Committee 9.9), and the Telcordia Network Equipment-Building System requirements.

 

Modern data center facilities typically are designed to provide air to the inlet of the computer equipment at 68 F to 77 F and 40% to 55% RH. Because maintaining these temperature and humidity tolerances for 8,760 h/yr is very energy-intensive, much current attention and research is aimed at HVAC system control strategies and system efficiencies that reduce energy use.
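As a simple illustration of how such a window can be applied to operating data, the following Python sketch counts the hours in a set of inlet-air measurements that fall outside the 68 F to 77 F and 40% to 55% RH range. The sample data and function name are hypothetical, not part of the analysis described in this article.

RECOMMENDED_TEMP_F = (68.0, 77.0)   # recommended inlet dry bulb range, deg F
RECOMMENDED_RH_PCT = (40.0, 55.0)   # recommended inlet relative humidity range, %

def hours_out_of_range(hourly_samples):
    """hourly_samples: iterable of (temp_F, rh_pct) tuples, one per hour."""
    out = 0
    for temp_f, rh in hourly_samples:
        in_temp = RECOMMENDED_TEMP_F[0] <= temp_f <= RECOMMENDED_TEMP_F[1]
        in_rh = RECOMMENDED_RH_PCT[0] <= rh <= RECOMMENDED_RH_PCT[1]
        if not (in_temp and in_rh):
            out += 1
    return out

# Hypothetical hourly samples of (dry bulb F, RH %)
samples = [(72.0, 45.0), (78.5, 38.0), (70.0, 52.0)]
print(hours_out_of_range(samples), "of", len(samples), "hours out of range")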

 

Other than standards related to fire resistance and brief mentions of minimizing moisture migration out of humidified areas, most of the current documentation on data center design does not address how the building envelope that surrounds the computers affects the temperature, humidity, and energy use. To address what role the envelope plays, the following questions need to be answered:

 

• Does the amount of leakage across the building envelope correlate to indoor humidity levels and energy use?

 

• How does the location (climate) of the data center affect the indoor temperature and humidity levels? Are certain climates more favorable for an outside air economizer, one that neither requires humidification when the outdoor air is dry nor introduces an unnecessary latent load when it is humid?

 

• When using an outside air economizer cycle, how dry will the air in the data center actually get without adding humidification? How much energy can be saved at this extreme?

 

• Will expanding the humidity tolerances required by the data center equipment yield worthwhile energy savings?

 

Performance simulations

Typical analysis techniques look at peak demands or steady-state conditions that are just representative “snapshots” of data center performance. These techniques, while very important for certain aspects of data center design such as equipment sizing, tell the engineer nothing about the dynamics of indoor temperature and humidity, which are among the most crucial elements of successful data center operation.

 

For the purposes of demonstrating these interdependencies, a simulation model was developed using EnergyPlus version 2.1, an energy analysis and thermal load simulation program. The simplified model has 20,000 sq ft. of raised floor area at 150 W/sq ft. The systems were modeled as chilled water with centrifugal water-cooled chillers and variable air volume air handling units. The leaving air temperature from the coil is 55 F with a maximum return air temperature of 75 F.
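As a rough check on the scale of these model parameters, the following Python sketch estimates the design IT load, cooling tonnage, and supply airflow implied by a 20,000 sq ft floor at 150 W/sq ft with a 55 F supply and 75 F return. It is a back-of-the-envelope calculation, not part of the EnergyPlus model; the 1.08 factor is the standard sea-level sensible-heat constant (Btu/hr per cfm per deg F).

FLOOR_AREA_SQFT = 20_000
IT_DENSITY_W_PER_SQFT = 150
SUPPLY_AIR_F = 55.0
RETURN_AIR_F = 75.0

it_load_w = FLOOR_AREA_SQFT * IT_DENSITY_W_PER_SQFT      # 3,000,000 W
it_load_btuh = it_load_w * 3.412                         # about 10.2 million Btu/hr
delta_t = RETURN_AIR_F - SUPPLY_AIR_F                    # 20 F air-side temperature rise

supply_cfm = it_load_btuh / (1.08 * delta_t)             # required supply airflow
cooling_tons = it_load_btuh / 12_000                     # refrigeration tons

print(f"IT load: {it_load_w / 1e6:.1f} MW ({cooling_tons:,.0f} tons)")
print(f"Supply airflow at a {delta_t:.0f} F rise: {supply_cfm:,.0f} cfm")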

 

The building envelope comprises the roof, exterior walls, floors, underground walls in contact with the earth, windows, and doors. Many data center facilities have minimal areas of windows and doors, so the roof, walls, and floor are the primary elements for envelope modeling. The parameters to be considered in the analysis of these elements are thermal resistance (insulation), thermal mass (heavy construction such as concrete versus lightweight steel), air-tightness, and moisture permeability.

 

When a large data center is running at full capacity, the effects of the building envelope on energy use (as a percent of the total) are relatively minimal. However, because many facilities never reach their full build-out potential, or reach it only over an extended period of time, defining the insulation and sealing requirements of the building envelope needs to be an integral part of the design process.

 

When analyzed over time as the facility load increases, the envelope losses start out as a significant component of the overall cooling load, but their share decreases as the computer load becomes a greater portion of the total, as shown in Table 1.
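The trend in Table 1 can be reproduced qualitatively with a very simple model. The following Python sketch assumes a fixed envelope load while the computer load scales with build-out; the envelope load value is illustrative and not taken from the article's simulation.

ENVELOPE_LOAD_KW = 55.0      # hypothetical envelope cooling load, roughly constant
FULL_IT_LOAD_KW = 3_000.0    # 20,000 sq ft x 150 W/sq ft at full build-out

for pct in (20, 40, 60, 80, 100):
    it_kw = FULL_IT_LOAD_KW * pct / 100
    share = ENVELOPE_LOAD_KW / (ENVELOPE_LOAD_KW + it_kw)
    print(f"{pct:3d}% of computer equipment running -> envelope is {share:5.1%} of cooling load")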

 

ASHRAE Energy Standard 90.1-2007 has very specific information on building envelope alternatives that can be used to meet the minimum energy performance requirements. Additionally, ASHRAE’s “Advanced Energy Design Guide for Small Office Buildings” goes into great detail on the most effective strategies for building envelope design by climate zone. Finally, another good source is the Chartered Institution of Building Services Engineers (CIBSE) Guide A: Environmental Design.

 

Building envelope leakage

Higher than expected heat and moisture migration across the building envelope will impact the internal temperature and RH by increasing the enthalpy of the indoor air. The primary driving mechanism is outside air infiltration; moisture migration driven by vapor diffusion (from areas of high to low vapor pressure) accounts for only a small fraction. Depending on the climate, building leakage can increase the energy use of the facility and can push the indoor moisture content of the air too high or too low.
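To put infiltration in load terms, the following Python sketch applies the standard air-side equations: sensible load = 1.08 x cfm x dry bulb difference, and latent load = 0.68 x cfm x humidity ratio difference (in grains per lb of dry air), both at sea level. The leakage airflow and the indoor/outdoor conditions shown are hypothetical.

def infiltration_loads(cfm, outdoor_db_f, indoor_db_f, outdoor_w_gr, indoor_w_gr):
    """Return (sensible, latent) infiltration loads in Btu/hr."""
    sensible = 1.08 * cfm * (outdoor_db_f - indoor_db_f)
    latent = 0.68 * cfm * (outdoor_w_gr - indoor_w_gr)
    return sensible, latent

# Hypothetical summer hour: 2,000 cfm of leakage, 92 F / 105 gr/lb outside,
# 72 F / 64 gr/lb inside the data center.
qs, ql = infiltration_loads(2_000, 92, 72, 105, 64)
print(f"Sensible: {qs:,.0f} Btu/hr, latent: {ql:,.0f} Btu/hr")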

 

National Institute of Standards and Technology, CIBSE, and ASHRAE studies investigating leakage in building envelope components show that building leakage is commonly underestimated by a significant amount. In addition, there is no consistent standard on which to base building air leakage assumptions.

 

In the absence of a well-controlled building pressurization system, air will pass into the interior of the building through cracks and other unsealed openings in the building envelope. The primary areas of building leakage tend to be gaps around windows and doors, joints in building façade elements, wall/roof junctures, and other areas where it is difficult to develop an air-tight seal. Because the amount of air that actually enters (or, in some cases, exits) the building depends mainly on pressure differences driven by wind, the following items need to be taken into consideration (a simple conversion from leakage airflow to air change rate is sketched after the list):

 

  • There is a high correlation between leakage rates and fluctuations in indoor RH—the greater the leakage rates, the greater the fluctuations.

  • There is a high correlation between leakage rates and indoor RH in the winter months—the greater the leakage rates, the lower the indoor RH.

  • There is low correlation between leakage rates and indoor RH in the summer months—the indoor RH levels remain relatively unchanged even at greater leakage rates.

  • There is a high correlation between building leakage rates and air change rate—the greater the leakage rates, the greater the number of air changes due to infiltration.
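The last point can be made concrete with a unit conversion: air changes per hour equal the leakage airflow in cfm times 60, divided by the room volume in cubic feet. The Python sketch below uses hypothetical values for ceiling height, envelope area, and leakage rate; none of these come from the article's model.

FLOOR_AREA_SQFT = 20_000
CEILING_HEIGHT_FT = 14                 # hypothetical clear height
ENVELOPE_AREA_SQFT = 32_000            # hypothetical above-grade wall plus roof area
LEAKAGE_CFM_PER_SQFT = 0.10            # hypothetical leakage rate at the operating pressure difference

room_volume_cuft = FLOOR_AREA_SQFT * CEILING_HEIGHT_FT
leakage_cfm = ENVELOPE_AREA_SQFT * LEAKAGE_CFM_PER_SQFT
ach = leakage_cfm * 60 / room_volume_cuft

print(f"Leakage: {leakage_cfm:,.0f} cfm -> {ach:.2f} air changes per hour from infiltration")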

Expanding the range

So what if the humidity range were expanded to allow higher and lower moisture levels in the data center? Would this capture even more savings? To answer this, engineers performed an analysis with Chicago as a representative climate. This type of climate can yield large energy savings, but also results in many hours that fall below the ASHRAE recommended minimum humidity levels.

 

To gain a better understanding of the energy savings potential, a parametric analysis was conducted to evaluate how different indoor design temperatures affect both energy use and indoor humidity levels; see Figure 1 for details.

 

Based on this analysis:

 

  • Designing for higher discharge air temperatures (cold-aisle) and higher return-air temperatures (hot-aisle) will result in significant annual electricity savings.

  • To achieve the maximum projected savings, the indoor humidity levels must be allowed to fall far below current recommendations of ASHRAE and other industry organizations.

  • Using traditional isothermal humidification or more energy-efficient adiabatic humidification processes, the lower humidity limit can be raised to an acceptable level, but the potential energy savings will be reduced (a rough comparison of the two approaches is sketched below).
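To illustrate the scale of that trade-off, the following Python sketch estimates the moisture a humidifier must add to a given airflow and the electric input an isothermal (steam) humidifier would need for it, using the latent heat of vaporization (about 1,061 Btu/lb) and ignoring makeup-water and standby losses. An adiabatic humidifier draws most of that latent heat from the air stream itself, so its electric draw (pumps or atomizers) is far smaller, at the cost of cooling the supply air. The airflow and humidity-ratio lift shown are hypothetical.

SUPPLY_CFM = 100_000        # hypothetical economizer airflow
W_LIFT_GRAINS = 15          # hypothetical humidity-ratio increase, grains per lb of dry air

# lb/hr of water = cfm x 60 min/hr x 0.075 lb dry air per cu ft x (grains / 7,000 grains per lb)
water_lb_per_hr = SUPPLY_CFM * 60 * 0.075 * (W_LIFT_GRAINS / 7000)

# Approximate isothermal (electric steam) humidifier input
isothermal_kw = water_lb_per_hr * 1_061 / 3_412

print(f"Moisture required: {water_lb_per_hr:,.0f} lb/hr")
print(f"Approximate isothermal humidifier load: {isothermal_kw:,.0f} kW")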

One conclusion drawn from the modeling described in this article is that there needs to be further research on the leakage of data center facilities. From state-of-the-art facilities to data-center spaces built within existing office buildings, and data centers constructed within converted warehouses and industrial buildings, there is likely a spectrum of leakage rates and causes.

 

Compared to state-of-the-art facilities, leakage is probably worse in converted manufacturing or warehouse spaces, where envelope integrity may not have been a priority during the design and construction of the original facility. These facilities could be the most challenging in which to maintain indoor temperature and humidity levels. Knowing more about the scope and scale of data center leakage could encourage owners to tighten up their facilities and to budget more for better envelopes on new construction and conversions.

 

Another important area of required research falls to the computer manufacturers. Building owners, CFOs, and CIOs need accurate data on the trade-offs of expanding temperature and humidity tolerances: the possible reduction in computer equipment life weighed against the annual electric utility savings. With this data, it will be possible to develop accurate lifecycle cost analyses and reliability studies.

 

Table 1: This example shows how a building’s envelope cooling changes as a percent of total cooling load.
Source: EYP Mission Critical Facilities

Percent of computer equipment running | Envelope losses as a percent of total cooling requirements
20% | 8.2%
40% | 4.1%
60% | 2.8%
80% | 2.1%
100% | 1.7%

What is NEBS? Understanding data center requirements.

Network Equipment-Building System, or NEBS, is a set of technical requirements and objectives originally developed by AT&T’s Bell Labs, under the title of Technical Publications, with the purpose of making network switches robust and reliable. After the 1984 divestiture of AT&T, NEBS ownership passed to Bellcore (now Telcordia), the research arm of the Regional Bell Operating Companies (RBOCs), for maintenance and upgrades. Since then, Telcordia has renamed the publications Generic Requirements (GR) and published many new ones.

 

The NEBS requirements and test methods are specified in the Telcordia GR-63-CORE, GR-1089-CORE, GR-3109, and GR-3208 documents. Table 2 lists the general topics of each document. Compliance with NEBS is demonstrated by testing products in a Nationally Recognized Testing Laboratory (NRTL) sanctioned by the RBOCs, or in the manufacturer’s test facility supervised and witnessed by an NRTL. Depending on the application and the customer, compliance can be demonstrated on a tiered system defined in the Telcordia SR-3580 document as NEBS levels 1, 2, and 3, where level 3 is for central office-grade equipment. NEBS level 1 is mandated by the Federal Communications Commission for co-location of Competitive Local Exchange Carrier equipment in Incumbent Local Exchange Carrier spaces.

 

GR-63-CORE Issue 2

• Fire resistance
• Temperature and humidity (operating, storage, and transportation)
• Shock, vibration, and earthquake (operating, storage, and transportation)
• Airborne contaminants (corrosive and hygroscopic dust)
• Acoustic
• Altitude
• Lighting
• Floor loading, physical and spatial requirements

GR-1089-CORE Issue 3

• Electrical safety
• EMI emissions
• EMI immunity
• Lightning immunity
• ESD (operation and installation)
• Bonding and grounding
• AC power fault

GR-3109

• Fire resistance
• Temperature and humidity (operating, storage, and transportation)
• Shock, vibration, and earthquake (operating, storage, and transportation)
• Airborne contaminants (corrosive and hygroscopic dust)
• Salt fog exposure
• Acoustic
• Altitude
• Lighting
• Applicable GR-1089 requirements

GR-3208

• Equipment cooling classifications
• Room cooling classification
• Environmental criteria
• Heat release targets
• Equipment specifications

Table 2: General topics of each Telcordia NEBS document. Source: “Thermal Design and NEBS Compliance,” Majid Safavi, Lucent Technologies, Electronics Cooling, February 2006.

 

Location plays a key role  

Climate and indoor environmental requirements heavily influence a facility’s energy use and, in turn, its energy bills.

The data center’s location has a significant impact on annual electricity usage. This comes primarily from location-to-location differences in outside dry bulb temperature, outside air moisture content, solar heat gain, wind speed and direction, and daily temperature range.

 

While an outside air economizer can create significant energy reduction potential, if implemented incorrectly it can negatively affect the indoor air moisture content. The strategy is to find the optimal mix that reduces energy use as much as possible while avoiding moisture levels within the data center that are too low or too high. Studies suggest that data centers could be allowed annual swings in RH of 30% to 70%, rather than the 40% to 55% RH suggested by ASHRAE, resulting in more energy-efficient operation while still providing an acceptable environment for the computer equipment. While this range might be possible for some data center users, other users may have much tighter tolerance requirements based on preference or existing service level agreements.
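One way to reason about that optimal mix is a simple hour-by-hour mode decision. The Python sketch below uses a dry bulb ceiling plus a dew point floor and ceiling as a stand-in for the expanded humidity band; it is not the article's control sequence, and every setpoint shown is hypothetical and would be tuned per site and per service level agreement.

def economizer_mode(outdoor_db_f, outdoor_dp_f, db_max=75.0, dp_min=35.0, dp_max=60.0):
    """Classify an hour of weather for a simplified outside air economizer."""
    if outdoor_db_f > db_max:
        return "mechanical cooling"
    if outdoor_dp_f < dp_min:
        return "economizer + humidification (or let RH drift low)"
    if outdoor_dp_f > dp_max:
        return "economizer + dehumidification"
    return "full economizer"

# Hypothetical hours: (dry bulb F, dew point F)
for db, dp in [(50, 40), (50, 20), (68, 64), (95, 70)]:
    print(f"{db} F db / {dp} F dp -> {economizer_mode(db, dp)}")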

 

Normal air conditioning systems generally will keep the upper end of the humidity range in check; the lower end becomes problematic, especially in cold, dry climates, which otherwise offer great potential to minimize the number of hours that mechanical cooling is required.

 

The other important aspect of this topic is using humidity metrics such as humidity ratio to describe the humidity requirements. RH is just that: relative. It is tied to a specific dry bulb temperature, whereas humidity ratio (unit mass of moisture per unit mass of dry air) is an absolute value that does not change with dry bulb temperature.
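The distinction is easy to see numerically. The following Python sketch converts dry bulb temperature and RH into humidity ratio using the Magnus approximation for saturation vapor pressure; the same 45% RH corresponds to noticeably different absolute moisture contents at different dry bulb temperatures. The temperatures shown are illustrative.

import math

SEA_LEVEL_PA = 101_325.0

def humidity_ratio_grains(dry_bulb_f, rh_pct, pressure_pa=SEA_LEVEL_PA):
    """Humidity ratio in grains of moisture per lb of dry air."""
    t_c = (dry_bulb_f - 32.0) * 5.0 / 9.0
    p_ws = 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))   # saturation vapor pressure, Pa
    p_w = (rh_pct / 100.0) * p_ws                              # partial pressure of water vapor
    w = 0.622 * p_w / (pressure_pa - p_w)                      # lb moisture per lb dry air
    return w * 7_000.0                                         # 7,000 grains per lb

for db in (68, 72, 77):
    print(f"{db} F at 45% RH -> {humidity_ratio_grains(db, 45):.1f} gr/lb dry air")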

 

The resulting data from the analysis demonstrate some key points about climate and the use of an outside air economizer:
* In climates similar to Singapore, there will be some energy savings from using an outside air economizer, and the resulting indoor RH will be fairly consistent.

* In climates similar to London, an outside air economizer results in significant energy savings and produces relatively few hours that fall below the ASHRAE recommended humidity levels.

* In climates similar to San Jose, Calif., an outside air economizer results in an optimal mix of energy savings and produces very few hours that fall below the ASHRAE recommended minimum humidity levels.

* In climates similar to Bangalore, India, an outside air economizer results in an optimal mix of energy savings and produces very few hours that fall below the ASHRAE recommended minimum humidity levels.

* In climates similar to Chicago, an outside air economizer will yield significant energy savings but will result in many hours below the ASHRAE recommended minimum humidity levels.

* In climates similar to Reykjavik, Iceland, an outside air economizer will yield significant energy savings but will result in the greatest number of hours below the ASHRAE recommended minimum humidity levels.

 

 

 

Author Information

Bill Kosik is Chicago managing principal with CEM 


