Monday, 14 June 2010

New data centres chase PUE ratings

eBay has opened its new corporate data centre, code-named Topaz.  With a budget of $287m it is the single largest infrastructure investment the company has made.  In 2009, the total worth of goods sold on eBay was over $60bn, around $2,000 a second, so the data centre pretty much is the business.

The data centre, located outside Salt Lake City, Utah, was designed to be concurrently maintainable and fault tolerant - Tier IV level.  It’s apparently 50% less expensive to operate than the average of all eBay’s other data centres and 30% more efficient than the best.  It’s designed to a PUE of 1.4, with rain water, collected in a 400,000 gallon cistern, the primary cooling source.  It uses a water-side economiser, which means that outside air cools the data centre for more than half the year.

But being fault tolerant is a big issue.  In a blog post, Dean Nelson, eBay’s senior director of Global Data Centre Strategy, says “Now, I don’t want to go into the religious debate of who has the lowest PUE, but I do want to point one thing out.  In the business of on-line commerce, we do not have a choice but to build a highly available data centre to support our customers.  From my perspective, achieving a 1.4 PUE with a hard requirement to meet this level of redundancy is quite an accomplishment”.

Meanwhile, UK-based Keysource has built what it claims to be the UK's most efficient data centre in Weybridge, Surrey for Petroleum Geo-Services (PGS).  PGS was primarily looking for a design that would drive down energy usage and minimise running costs.  The data centre is the first facility to adopt Keysource’s ecofris technology, which it claims can reduce the energy cost of operating data centres by more than 45%.

Ecofris seems to be about combining cooling solutions with the use of Computational Fluid Dynamics (CFD) software to ensure that the rack layout and server airflow requirements are optimally matched with the air handling equipment.  Anyway, the result is an annualised Power Usage Effectiveness (PUE) measurement of 1.17, ‘significantly below the UK average of 2.2’.  Put another way, the facility has a DCiE of 85%, which means that 85% of the total power being consumed by the data centre is powering the IT equipment.  After 12 months of operation, the data centre has achieved a 6.7 million kWh reduction in annual power consumption and a 2.9 million kg reduction in annual CO2 emissions compared to PGS’s previous facility.
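For readers unfamiliar with the two metrics: PUE is total facility power divided by the power reaching the IT equipment, and DCiE is simply its reciprocal expressed as a percentage.  A minimal sketch of the arithmetic (the 1,170 kW / 1,000 kW figures are illustrative values chosen to match the numbers quoted above, not data from PGS):

```python
# Illustrative PUE/DCiE calculation.  The kW figures are assumed for the
# example; they are not measurements from the PGS facility.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data Centre infrastructure Efficiency: reciprocal of PUE, as a %."""
    return 100 * it_equipment_kw / total_facility_kw

# A facility drawing 1,170 kW overall with 1,000 kW reaching the IT load:
print(round(pue(1170, 1000), 2))   # 1.17
print(round(dcie(1170, 1000)))     # 85 (%)
```

Note why the two figures in the article agree: a PUE of 1.17 implies a DCiE of 1/1.17 ≈ 85%, so they are two views of the same measurement, not independent results.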


So there you are.  I make the comparison to highlight the issues around data centre efficiency.  For eBay, fault tolerance is high on the agenda, as you would expect, so you might also expect some overhead of energy use to maintain reliability.  For Keysource/PGS, efficient running is the core requirement, and some of the efficiency gains seem to be down to attention to data centre airflow management, over and above the cooling used.

So as well as a debate about how fault-tolerant a data centre can be and needs to be, there are many variables related to cooling sources, internal layout, use of waste heat, use of renewable energy, etc. that make comparisons between data centres, in terms of how environmentally-friendly they are, very problematic. 

Nevertheless, however crude the metrics, they do provide a green IT measure against which data centre operators at least have to defend themselves, and which at best drives innovation.

© The Green IT Review
