
The Datacenter – Part 5: Cooling and Energy Efficiency

Tier 4 datacenters have at least two of everything. From servers, switches, and firewalls to generators, hard drives, and power outlets, there can be no single point of failure in a Tier 4 datacenter. All that equipment creates a lot of heat, the enemy of electronics. Those spinning disks need fans to pull cool air across them and carry the heat away from the equipment. Cooling is one of a datacenter's biggest expenses. A Tier 4 datacenter isn't exactly the most energy-efficient building out there, but you've already heard again and again that you can't cut corners to achieve the highest tier rating.

The Uptime Institute doesn’t have any specifications or criteria for power usage or environmental requirements. They don’t care how much power you use; they’re only concerned with full redundancy, power capacity, and (in Tier 4 cases) fault tolerance. There’s simply no getting around the need for massive electric power. There are, however, some pretty simple ways to make datacenters cooler without burning up the power grid. A Computer Weekly article from 2010 has one simple suggestion that’s still a pretty good idea: Build a datacenter in a cold location. “It costs a lot more to cool a facility in Dubai than in the UK.” The cooler climate allows for natural cooling inside the building, requiring fewer fans and air conditioners—and lower monthly expenses!

Heat is the enemy of the datacenter. (Source: Stulz)

Using less electricity means spending less money, and those savings can be passed on to clients. Most articles about green datacenter design suggest grouping like things together (cables and even server racks, for example) so cooling can be delivered directly and heat removal centralized. Servers spread out across a room need more fans and air conditioners; keeping them close together lets you target your cooling efforts, with fans directed at the hottest spots.

The average datacenter runs around 70 degrees Fahrenheit; Google's datacenters run as warm as 80°F. There's a delicate balance, because overcooling can cause moisture buildup in high-humidity areas. Of course, too hot is even worse, because overheated equipment eventually fails. CPUs commonly run at around 122°F, and larger datacenters may have thousands of CPUs buzzing away all the time! So keeping the room at just the right temperature is a big challenge: too cold and you get condensation; too hot and you fry your hardware.
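As a rough illustration (not something from the original article), that "just right" band can be thought of as a simple monitoring check. The 64°F and 81°F thresholds below are assumptions loosely based on ASHRAE's recommended envelope for datacenters (roughly 18–27°C), not numbers specific to any particular facility:

```python
# Hypothetical sketch: classify server-room temperature readings against
# an assumed safe operating band. Real limits depend on your hardware,
# humidity, and the guidelines your facility follows.
SAFE_LOW_F = 64.0   # below this, condensation risk rises in humid air
SAFE_HIGH_F = 81.0  # above this, equipment failure rates climb

def check_temp(reading_f: float) -> str:
    """Classify a temperature reading given in degrees Fahrenheit."""
    if reading_f < SAFE_LOW_F:
        return "too cold: condensation risk"
    if reading_f > SAFE_HIGH_F:
        return "too hot: hardware at risk"
    return "ok"

# A 70°F room and Google's 80°F both pass; extremes get flagged.
for reading in (70.0, 80.0, 95.0, 60.0):
    print(f"{reading}°F -> {check_temp(reading)}")
```

In practice a datacenter would feed sensor telemetry into alerting software rather than a loop like this, but the logic is the same: alarm on both ends of the band, not just the hot one.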

Whether it's a local technique like "hot and cold aisles," where rows of racks are arranged so that server exhausts face a shared aisle and the hot air can be removed in one place, or natural cooling from a cold geographic location, there are many approaches to cooling a building cost-efficiently. Save yourself the hassle of planning data storage and the high electricity bill that comes with it: let Network Alliance and our Tier 4 datacenter handle it for you! Give us a call to see what we can do for you.








