Data center cooling systems at many small and midsized companies are pushed to the limit, or even beyond. Sometimes the problem is only a peak period of the day, or simply a stretch of hot weather.
Here are a few tips, tricks and techniques that may not solve any long-term problems, but may help enough to get you through the peaks. When the heat load of the equipment does not severely exceed the capacity of the cooling system, optimizing the airflow can often improve the situation until a new or additional cooling system is installed.
- Take temperature measurements at the front of the servers. This is where the servers draw in the cool air, so it is probably the most important reading. Take readings at the top, middle and bottom of the front of the racks (assuming that you have a hot aisle – cold aisle layout). The top of the rack is usually the warmest. If the bottom areas of the racks are cooler, and you have open rack space, try to relocate the servers nearer the bottom (or coolest area) of the racks.
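The front-of-rack survey above can be sketched as a short script. The rack names and readings here are hypothetical, and the 80.6°F threshold is an assumption based on ASHRAE's commonly cited recommended inlet maximum of 27°C; substitute whatever limit your equipment vendors specify.

```python
# Minimal sketch of a front-of-rack inlet temperature survey.
# Rack names and readings are hypothetical examples; the threshold
# assumes ASHRAE's recommended inlet maximum of about 27°C (80.6°F).

INLET_MAX_F = 80.6

def survey(readings):
    """readings: {rack: {"top": F, "middle": F, "bottom": F}}.
    Returns (rack, warmest_inlet) pairs for racks whose warmest
    front inlet exceeds the threshold, sorted hottest first."""
    hot = []
    for rack, temps in readings.items():
        worst = max(temps.values())
        if worst > INLET_MAX_F:
            hot.append((rack, worst))
    return sorted(hot, key=lambda pair: pair[1], reverse=True)

readings = {
    "rack-01": {"top": 84.0, "middle": 78.5, "bottom": 72.0},
    "rack-02": {"top": 79.0, "middle": 75.0, "bottom": 70.5},
    "rack-03": {"top": 88.5, "middle": 81.0, "bottom": 74.0},
}

for rack, worst in survey(readings):
    print(f"{rack}: hottest front inlet {worst}°F - move load lower or fix airflow")
```

Note that only the warmest of the three readings per rack matters for flagging: a cool bottom does not help the server at the top of the rack.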
- Don’t worry about rear temperatures, even if they are 100°F or more! This is not unusual. Do not place fans blowing at the rear of racks to “cool them down” – this just causes more mixing of warm air into the cold aisles – we see this mistake so many times.
- If you have a raised floor, make sure that the floor grates or perforated tiles are properly located in front of where the hottest racks are. If necessary, re-arrange or change to different floor grates to match the airflow to the heat load. Be careful not to locate floor grates too close to the computer room air conditioning, as this will “short circuit” the cool air flow immediately back into the air conditioning and deprive the rest of the room/row of sufficient cool air.
- Check the raised floor for openings inside the cabinets. Cable openings in the floor allow air to escape the raised floor plenum where it is not needed, which lowers the cold air available to the floor vents in the cold aisles. Use brush-type air containment collar kits to minimize this problem.
- If possible, try to re-distribute and evenly spread the heat loads into every rack to avoid or minimize “hot spots.” Remember to check the temperature in the racks at the top, middle and bottom, before you move the servers. Only relocate the warmer servers (based on the front temperatures) to a cooler area. Then use blanking panels to fill any gaps. Always recheck all the rack temperatures to make sure that you have not just created new hot areas.
- Check the rear of racks for cables blocking exhaust airflow. This will cause excessive backpressure for the IT equipment fans and can cause the equipment to overheat, even when there is enough cool air in front. This is especially true of racks full of 1U servers with a lot of long power cords and network cabling. Consider using shorter network cables and purchasing shorter power cords to replace the original longer cords shipped with most servers. Use cable management to unclutter the rear of the rack so that the airflow is not impeded.
- If you have an overhead ducted cooling system, make sure that the cool air outlets are directly over the front of the racks and the return ducts are over the hot aisles. We have seen sites where the room is very hot even though the capacity of the cooling system has not been exceeded, simply because poorly located ceiling vents and returns keep the cool air from getting directly to the front of the racks, or the hot air from being properly extracted.
- The most important issue is to make sure the hot air from the rear of the cabinets can get directly back to the computer room air conditioning return, without mixing with the cold air. If you have a plenum ceiling, consider using it to capture the warm air and add a ducted collar going into the ceiling from your computer room air conditioning top return air intake. Some basic ductwork will have an immediate impact on the room temperature. In fact, the warmer the return air, the higher the efficiency and actual cooling capacity of the computer room air conditioning.
- Consider adding temporary portable cooling units only if you can exhaust the heat into an external area. Running the exhaust ducts into a ceiling that goes back to the computer room air conditioning does not work. The heat exhaust ducts of the portable cooling units must exhaust into an area outside of the controlled space.
- When the room is not occupied, turn off the lights. This can save 1% to 3% of the electrical and heat load, which in a marginal cooling situation may lower the temperature 1°F to 2°F.
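A rough back-of-the-envelope check on the lighting tip above; the wattages are hypothetical examples, not measurements from any real room.

```python
# Rough arithmetic for the lighting savings; all figures are
# hypothetical. Every watt of lighting becomes heat the cooling
# system must remove, so turning lights off removes that load.

it_load_kw = 40.0    # assumed IT equipment load
lighting_kw = 1.0    # assumed lighting load in the room

total_heat_kw = it_load_kw + lighting_kw
lighting_share = lighting_kw / total_heat_kw

print(f"Lighting is {lighting_share:.1%} of the room heat load")
```

With these assumed numbers the lighting works out to roughly 2.4% of the heat load, which is consistent with the 1% to 3% range above.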
- Check to see if there is any equipment that is still plugged in and powered up, but is no longer being used. This is a fairly common situation in many IT departments and has an easy fix – just shut it down!
While there is no true quick fix when your heat load totally exceeds your cooling system’s capacity, sometimes just improving the airflow may increase the overall efficiency by 5% to 20%. This may get you through those peak days until you can upgrade your cooling systems. In any event, it will lower your energy costs, which is always a good thing.
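A quick sanity check on what that efficiency gain buys you in headroom; the capacity and load figures are hypothetical, and the 10% figure is simply a mid-range pick from the 5% to 20% span above.

```python
# Hypothetical headroom arithmetic: how much breathing room a
# mid-range (10%) airflow improvement buys in a marginal room.

cooling_capacity_kw = 50.0   # assumed rated cooling capacity
heat_load_kw = 48.0          # assumed current equipment heat load

effective_capacity_kw = cooling_capacity_kw * 1.10  # 10% airflow gain
headroom_kw = effective_capacity_kw - heat_load_kw

print(f"Headroom after airflow fixes: {headroom_kw:.1f} kW")
```

In this example the room goes from 2 kW of headroom to 7 kW, which is often the difference between riding out a heat wave and an emergency shutdown.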
Plan ahead, and if all else fails, determine which systems are least critical and plan to shut them down so that the more critical servers can remain operational. Make sure to locate the most critical systems in the coolest area. This is better than having the most critical systems shut down unexpectedly from overheating.