Cooling a data center requires careful analysis and proper design. Climate-sensitive environments such as data centers, computer rooms, medical facilities and telecommunications hubs must be properly cooled to ensure ongoing operation and equipment maintenance. Without establishing and maintaining specific temperature levels, equipment will fail. Ensuring continuous data center availability means optimizing the right components. Cooling a data center is not just a matter of installing an air conditioning system. The project is much more involved and must account for considerations such as equipment options, airflow dynamics, facility structure, floor construction, hot- and cold-aisle containment options and environmental factors.
The appropriate cooling design should be based on a thorough analysis, company objectives and budget constraints. Numerous options exist for computer-grade environmental control; the best fit depends on the individual circumstances of the company and its data center operations. Data center cooling systems come in a variety of manufacturer-packaged and non-packaged forms. Typical installations include process cooling, humidity control systems, cooling towers, centralized pump stations, chillers, and system monitoring and controls. Cooling a data center doesn't stop with installation, however. Just as important is ongoing maintenance of the cooling equipment: preventing failures is the key to keeping systems available and avoiding the costly consequences of downtime.

How to Increase Cooling Efficiencies

Industry experts regularly point to the rising cost of electricity and growing power consumption in the data center. For cost containment and green-initiative purposes, company executives and data center managers are placing a premium on efficiency, and data center cooling is a prime target for reducing energy costs. Some industry estimates put cooling at approximately 37 percent of power usage; if the data center is not designed properly, that percentage, along with the associated costs, can rise drastically. Data center operators and designers can adopt certain strategies to help control cooling-related energy consumption, including the following:

Sealing the environment. Just as in a home, cool air escapes through vulnerabilities in floors, windows, doors, walls and ceilings. The same challenge exists in a data center: the facility must be properly sealed and protected from the outside environment to avoid cooling losses. Keeping doors closed is a starting point.
Data center designers and operators can then create a "vapor seal," a covering of materials such as plastic film, paint and vinyl applied to walls, floors, doors and any facility opening. Moisture cannot be allowed to migrate to or from the data center environment.

Promoting Cooler Air Movement. Data center designers and operators must make sure air flows efficiently throughout the facility; optimal airflow goes a long way toward cooling a data center. Where components such as racks, air conditioners and cables are placed is an initial step. The basic concept is to move as much heat as possible away from equipment, and to do so using the least amount of energy. One way data centers reduce energy usage is with technology called "economizers": during colder weather, outside air is brought inside to cool the data center, saving significant energy and cost.
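The economizer decision described above amounts to a simple control check: open the outside-air path only when the outdoor air is cold enough to do the cooling work. The sketch below illustrates that logic; the setpoint, the 3 °C approach margin and the function name are illustrative assumptions, not values from any particular control standard.

```python
def use_economizer(outside_temp_c: float, supply_setpoint_c: float,
                   approach_margin_c: float = 3.0) -> bool:
    """Decide whether outside air can cool the data center directly.

    Returns True when the outdoor temperature sits comfortably below
    the supply-air setpoint, so mechanical chilling can be bypassed.
    The 3 degree C margin is an assumed safety buffer, not a standard.
    """
    return outside_temp_c <= supply_setpoint_c - approach_margin_c


# Hypothetical example: 18 C supply setpoint
cold_day = use_economizer(outside_temp_c=10.0, supply_setpoint_c=18.0)
mild_day = use_economizer(outside_temp_c=17.0, supply_setpoint_c=18.0)
```

In practice a real economizer controller would also check humidity and air quality before admitting outside air; this sketch covers only the temperature test.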
Another way to improve cooling is to ensure the air conditioning equipment is operating at maximum efficiency. Technological advances, such as variable capacity and better controls, continue to optimize air conditioning equipment operation.

Installing Supplemental Cooling Systems. To counter heat at its source, additional cooling systems can be installed nearby. By cooling the data center at its heat sources, less overall energy is required. This approach is especially useful for high rack densities: equipment at the bottom of a rack consumes more of the cold air, sometimes leaving equipment at the top insufficiently cooled. In fact, an Uptime Institute study found that equipment located in the top third of a rack failed twice as often.

Energy costs continue to be a major concern for data center designers and operators. With cooling comprising a significant portion of those energy costs, special attention must be paid to the capacities and efficiencies of cooling equipment. Effective cooling strategies also help keep equipment running and systems available.
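The cost stakes above can be made concrete with a back-of-the-envelope calculation using the roughly 37 percent cooling share cited earlier. Every number in this sketch other than that share (facility load, electricity rate, hours) is a hypothetical assumption chosen purely for illustration.

```python
def annual_cooling_cost(total_kw: float, cooling_share: float,
                        hours_per_year: float, rate_per_kwh: float) -> float:
    """Rough annual cooling cost from total facility power draw.

    cooling_share reflects the ~37% industry estimate cited in the
    article; the remaining inputs are illustrative assumptions.
    """
    return total_kw * cooling_share * hours_per_year * rate_per_kwh


# Hypothetical 500 kW facility, 37% cooling share, running all year
# (8,760 hours) at an assumed $0.10 per kWh:
cost = annual_cooling_cost(500.0, 0.37, 8760.0, 0.10)  # 162,060.0 dollars
```

Even under these modest assumptions, cooling alone runs into six figures per year, which is why efficiency gains of even a few percentage points matter.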