Day by day this major online game provider just keeps getting more advanced... oh wait, that's not right, it's Facebook. Here's a look at the mechanisms behind the massive cuts to the electricity bill of Facebook's DC (data center).
CFD Modeling: The first step in Facebook’s efficiency project was building a thermal model of the data center with computational fluid dynamics (CFD) software, which produces a 3D picture of how cold air moves through the facility. The model identifies potential “hot spots” as well as areas that receive more cold air than they need, wasting cooling and energy. The CFD study revealed that some of the cold air entering the room through the raised floor was bypassing the servers, cooling the room rather than the IT equipment, while warm exhaust air from the hot aisle was mixing with cold air in key areas.
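To make the "hot spot" idea concrete, here is a toy sketch (not Facebook's actual tooling) of the kind of post-processing pass a CFD study enables: given a grid of simulated rack-inlet temperatures, flag the cells that exceed a chosen threshold. The grid values and the 85 °F threshold below are illustrative assumptions, not figures from the article.

```python
# Hypothetical 3x4 grid of rack-inlet temperatures (°F) from a CFD run.
TEMPS_F = [
    [72, 74, 90, 73],
    [71, 75, 92, 74],
    [70, 73, 76, 72],
]

HOT_SPOT_F = 85  # assumed alert threshold, for illustration only


def find_hot_spots(grid, threshold):
    """Return (row, col, temp) for every cell hotter than the threshold."""
    return [(r, c, t)
            for r, row in enumerate(grid)
            for c, t in enumerate(row)
            if t > threshold]


print(find_hot_spots(TEMPS_F, HOT_SPOT_F))  # -> [(0, 2, 90), (1, 2, 92)]
```

A real CFD package works on a much finer mesh and solves the airflow physics itself; the point here is only the final step of turning a temperature field into a list of locations that need attention.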
Cold Aisle Containment: Facebook took several steps to address the airflow problems identified in the CFD modeling. It began by installing a cold aisle containment system to isolate the hot and cold air in the data center. Roof panels were installed over the cold aisles, with fusible links to allow for adequate overhead fire suppression, and doors at each end of each aisle gave tech staff access. Facebook also sealed every area where cold air could escape, using blanking plates and skirts for PDUs (power distribution units), and sealing cut-outs for cabling.
Reducing the Number of CRAH Units: Once the cold aisle was encapsulated, less airflow was required to cool the equipment. This allowed Facebook to turn off 15 computer room air handlers (CRAHs), saving the energy required to operate those excess units.
Reducing Server Fan Energy: Further savings were gained through adjustments to the server fans. “These fans are PWM fans – pulse width modulation,” Park explained. “They’re typically pre-set by the manufacturer to run at higher speeds. You modulate the fans to a lower speed and you bring less air through the servers. You can set this through software. Intel can tell you how to do this.”
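Why does slowing fans save so much? A standard engineering rule of thumb (the fan affinity laws, not a figure from the article) says fan power scales roughly with the cube of fan speed, so even a modest reduction in PWM duty cycle yields an outsized energy saving. A minimal sketch of that arithmetic:

```python
def fan_power_fraction(speed_fraction):
    """Approximate power draw relative to full speed.

    Fan affinity law: power is roughly proportional to speed cubed (P ∝ N³).
    This ignores fixed losses, so it is a rule-of-thumb estimate only.
    """
    return speed_fraction ** 3


# Example: modulating server fans down to 70% of full speed.
frac = fan_power_fraction(0.70)
print(f"power draw: {frac:.3f} of full-speed power "
      f"({1 - frac:.0%} saving)")  # -> power draw: 0.343 ... (66% saving)
```

The cubic relationship is why the containment work in the previous steps matters: once less airflow is needed, the fan savings compound rather than add.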
Raising the Air Temperature: Facebook next concentrated on raising the rack inlet temperature as high as it could without triggering additional fan activity. Optimizing the cold aisle and server fan speed allowed Facebook to raise the temperature at the CRAH return from 72 degrees F to 81 degrees F.
Raising the Water Temperature: The higher air temperature then allowed Facebook to raise the temperature of the supply water coming from its chillers, requiring less energy for refrigeration. The temperature of chiller water supply was raised by 8 degrees, from 44 degrees F to 52 degrees F.
Source: http://www.datacenterknowledge.com/archives/2010/10/14/facebook-saves-big-by-retooling-its-cooling/