Founded in 1981 and acquired by Siemens in 2017, Mentor Graphics Corporation (MGC) is a leader in Electronic Design Automation (EDA) technology. To improve efficiency and meet growing demand, MGC decided to consolidate several small data centers into a single, larger data center. This new, state-of-the-art data center was built at company headquarters, located in Wilsonville, Oregon, roughly 18 miles south of downtown Portland.
Maximize efficiency, minimize cost
With many data centers operating all over the world, MGC facilities engineers had worked with a variety of energy-efficient design strategies. They knew they needed to focus on three key design aspects to maximize efficiency and minimize heat rejection cost: fan energy, hot-aisle containment, and minimizing refrigeration cooling. “In the past, data center equipment airflow and temperature requirements were a good match for air-conditioning equipment, so air-conditioning was chosen to fill the need. Today, server needs are very different from the needs of people, so air-conditioning equipment struggles to operate correctly in a data center environment,” said John Wozniak, MGC’s Critical Infrastructure Technician.
MGC uses FloVENT modeling software
“While the temperature requirement of the ‘intake’ of servers is currently similar to that of the ‘intake’ of humans, the exhaust temperatures from servers are radically different (much hotter). The approach we took was to implement a design tailored around servers, but also comfortable for humans. Since we couldn't find precedent for this approach, we relied heavily on Mentor Graphics’ FloVENT computational fluid dynamics (CFD) modeling software to validate the design,” concluded Wozniak.
Chimney cabinets selected
MGC engineers used their own CFD software and in-house expertise to model and optimize the air distribution system. Mentor Graphics selected chimney cabinets with integral electronically commutated (EC) variable-speed exhaust fans to house their servers. These cabinets, along with return air ductwork, provide excellent containment of the hot server exhaust all the way to the rooftop air handlers. Although active chimney cabinets cost more than some other types of hot-aisle containment, they often permit a cost-efficient supply air duct design. For this project, the supply duct from the rooftop air handlers terminated high in the room, with only minimal diffusers under each duct. The variable-speed chimney cabinet fans respond to server loading, speeding up or slowing down to maintain a slight negative pressure at the cabinet exhaust plenum (server outlet) and keep the hot exhaust contained. This outstanding containment of hot server exhaust allowed Mentor Graphics to raise the design room temperature to 72–74°F (22.2–23.3°C), resulting in better operating efficiency of the cooling equipment and downsizing of the supplemental mechanical cooling.
Munters Oasis air handling units selected
After evaluating numerous cooling system methods, layouts, and manufacturers, MGC decided to use rooftop-mounted air-handling units manufactured by Munters. Rooftop units maximized the indoor floor space available for the new IT equipment, and they were better suited to the flooded-room air delivery system that was ultimately chosen. Munters’ Oasis air handlers operate on the principle of Indirect Air-Side Economization (IASE), in which outdoor air is used to reject heat from a recirculating data center airstream by way of an air-to-air heat exchanger. With this approach, there are two separate airstreams. The first is the recirculating air from the data hall; it enters the air handlers warm, after server heat pickup, and must be cooled before delivery back to the room. The second is outdoor air, referred to as scavenger air; it is drawn over the opposite side of the air-to-air heat exchanger by separate variable-speed scavenger fans to extract heat from the warmer recirculating data hall air. With IASE systems, only a small amount of make-up air, as required for proper ventilation and space pressurization, is introduced into the data hall. Because the cooling units simply extract heat from a recirculating airstream, they do not affect room humidity levels, and the risk of ambient pollutants reaching the servers is greatly reduced compared with direct air-side economizer designs.
Munters solutions
Munters’ polymer tube heat exchanger was selected for the job because of its high efficiency, resistance to corrosion, and the inherent scale resistance of its flexible polymer tubes.
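As a rough illustration of the indirect air-side economization described above, the sketch below estimates the temperature of the recirculating air leaving the air-to-air heat exchanger using a simple effectiveness model. The effectiveness value, example temperatures, and function name are illustrative assumptions, not figures from the MGC installation.

```python
# Minimal sketch of an indirect air-side economizer (IASE) estimate.
# Assumptions (not from the MGC project): a fixed sensible heat-exchanger
# effectiveness and example temperatures. In dry mode the scavenger-side
# reference is the outdoor dry-bulb; in wet (IEC) mode the evaporatively
# cooled tube surface approaches the outdoor wet-bulb instead.

def iase_leaving_temp_f(return_air_f: float,
                        scavenger_ref_f: float,
                        effectiveness: float = 0.75) -> float:
    """Estimate recirculating-air temperature leaving the air-to-air HX.

    return_air_f    -- hot air returning from the chimney cabinets (deg F)
    scavenger_ref_f -- outdoor dry-bulb (dry mode) or wet-bulb (IEC mode)
    effectiveness   -- assumed sensible effectiveness of the exchanger (0..1)
    """
    return return_air_f - effectiveness * (return_air_f - scavenger_ref_f)


if __name__ == "__main__":
    # Example: 95 F return air, 55 F outdoor dry-bulb, dry-mode operation.
    supply_f = iase_leaving_temp_f(95.0, 55.0)
    print(f"Estimated supply temperature: {supply_f:.1f} F")  # 65.0 F
```

When an estimate like this comes out above the 72–74°F delivery target, the mechanical “trim” cooling described next makes up the remaining few degrees.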
The Oasis data center units typically operate in dry mode when ambient conditions are about 40°F (4.4°C) and lower, and in wet indirect evaporative cooling (IEC) mode in warmer ambient conditions. During IEC mode, water from a welded stainless-steel sump located beneath the heat exchanger is circulated through piping to the top of the heat exchanger, where it falls by gravity over the exterior of the polymer tubes. Outdoor scavenger air drawn over the exterior of the tubes evaporates the water, enhancing heat removal from the warmer data center air flowing inside the tubes. Only at higher ambient wet-bulb conditions, when the wet-bulb temperature exceeds 68–70°F (20–21.1°C), is mechanical “trim” cooling required to meet the target air delivery temperature of 72–74°F (22.2–23.3°C). The trim cooling coils typically provide only a few degrees of cooling to reach set point. For the mechanical trim cooling, MGC selected an air-cooled chiller, with chilled-water coils installed in the Munters air handlers downstream of the polymer heat exchanger.
Operating efficiencies exceed expectations
The Munters air-handling units were customized to provide optimal fan efficiency and heat-rejection components. For the recirculating (supply) airstream, a fan array of direct-drive plenum fans with variable-speed EC motors was selected, each with an inlet back-draft damper. The fan array was configured to provide N+1 redundancy at the fan level. The supply fan motors are controlled to maintain a slight negative pressure in the return duct, so that the flow of the supply fans in the air handlers precisely matches the flow of air exhausted from the chimney cabinets. The result is optimal efficiency in circulating the cooling air, which for economizer-cooled data centers is the single greatest power consumer. The air handlers were delivered to the site and commissioned shortly thereafter. Designed to meet a seismic importance factor of Ip = 1.5, the air handlers are anchored to concrete roof curbs. During the first year of operation, the system maintained server inlet temperatures throughout the room within ±1°F. Operating efficiencies have exceeded Mentor Graphics’ expectations, especially in the early stages of operation, when data centers are typically least efficient. Initially, the team’s target was 5,000 hours per year (57%) of economizer cooling using no compressors. The Munters air-handling units were able to economize for nearly 8,000 hours (>90%) during the first year of operation. “By using quality equipment and having a detailed design process, we have been efficient from day one, and from this point on, we are saving money, energy, and everything is working correctly,” said Wozniak.
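To make the mode thresholds and economizer-hour figures above concrete, here is a small, hedged sketch of the kind of mode-selection logic described: dry mode at low ambient temperatures, wet IEC mode in warmer weather, and mechanical trim only when the outdoor wet-bulb climbs past roughly 68–70°F. The threshold constants, function name, and control structure are assumptions for illustration, not the actual Munters control sequence.

```python
# Illustrative mode selection for an Oasis-style IASE unit.
# Thresholds follow the figures quoted in the case study; the control
# structure itself is an assumption, not the vendor's actual sequence.

DRY_MODE_MAX_DB_F = 40.0      # dry mode at or below ~40 F ambient dry-bulb
TRIM_MIN_WB_F = 68.0          # trim cooling needed above ~68-70 F wet-bulb

def select_cooling_mode(outdoor_db_f: float, outdoor_wb_f: float) -> str:
    """Return the expected operating mode for given outdoor conditions."""
    if outdoor_db_f <= DRY_MODE_MAX_DB_F:
        return "dry"                      # air-to-air exchange only
    if outdoor_wb_f <= TRIM_MIN_WB_F:
        return "wet (IEC)"                # evaporatively assisted exchange
    return "wet (IEC) + mechanical trim"  # chilled-water coil tops off


# Economizer-hours arithmetic from the case study: the 5,000-hour target
# and the ~8,000 hours achieved, as fractions of the 8,760 hours in a year.
HOURS_PER_YEAR = 8760
print(f"Target: {5000 / HOURS_PER_YEAR:.0%}")    # ~57%
print(f"Achieved: {8000 / HOURS_PER_YEAR:.0%}")  # ~91%
print(select_cooling_mode(85.0, 72.0))           # wet (IEC) + mechanical trim
```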