
Just how hot is green nowadays?

Clive Longbottom, Service Director

Quocirca Comment
It may seem that the drive for a green, sustainable data centre has died off somewhat from a few years back, when the push was on to be seen to be doing everything possible to save the planet. However, even when it is not wrapped up in a sustainability message, many organisations are implementing green policies, but for the very reason of saving money through optimising energy usage, not out of any real desire to be seen to be green.

One of the main drivers behind the need to cut back on energy usage is the obvious one: energy prices are unlikely to come down in the foreseeable future. Indeed, they are trending ever upwards, and data centres use a lot of energy.

The other driver in the UK that is beginning to have an impact is the Carbon Reduction Commitment Energy Efficiency Scheme, commonly known as the CRC. The CRC was originally conceived as a net cash-neutral scheme, where those who could demonstrate that their usage of energy was being optimised would gain money, and those who showed little to no improvement would lose money. However, the CRC has changed to being a straight tax on all organisations caught in its web, based on how much energy they use. Currently, the CRC applies only to those organisations that use large amounts of electricity, but as the government searches for new tax revenues, it is likely to be expanded in scope to cover more organisations over time. The drive for energy efficiency in the data centre should therefore be even stronger.

Many existing data centres are run against old-style environmental designs, where the approach to cooling is based around ensuring that input cooling air is at such a low temperature that outlet air does not exceed a set temperature. In many cases, the aim has been to keep the average volumetric temperature in the data centre at around 20°C or lower, with some running at between 15°C and 17°C.
A data centre with a floor area of 1,000 m² and a floor-to-ceiling height of 3 m will require the cooling of 3,000 m³ of air. To ensure that the average temperature remains within limits, the air needs to be flowing, which leads to a measure called the air change rate (ACR). Many data centres operate at an ACR of around 100-200 per hour, leading to a need for up to 600,000 m³ of air to be cooled per hour. The costs of cooling so much air to well below standard air temperature can be enormous, but is it really required?

The first step is to assess the existing data centre. The best way to do this is to implement temperature monitoring systems around the data centre. By this, Quocirca does not mean just using thermometers: the use of infra-red heat cameras will help in identifying where the data centre has existing hot spots that need addressing.

Once the existing environment is mapped out and heat issues identified, it becomes possible to start to re-plan the data centre. It may be that a rack has been filled with 1U or 2U servers and the density of hot CPUs is causing problems. It may be possible to spread these servers over two or more racks, or to mix the servers with items that generate less heat, such as solid-state storage or network termination boards, so that fewer hot spots are created.

It may also become apparent that certain items of equipment have very high thermal issues. In most cases, this will be because the equipment is old (i.e. more than three years old). It will often be cost effective to replace such equipment with a more modern equivalent: firstly, the newer equipment will be more energy efficient due to improvements in design and engineering; secondly, it will have fewer thermal problems, again down to improvements in chip and engineering design.

The next step is to look at how cooling can best be implemented.
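The air volumes above follow directly from the figures given; a quick back-of-envelope check, using the article's numbers (1,000 m² floor, 3 m height, ACR at the upper end of the 100-200 range):

```python
# Back-of-envelope check of the cooling volumes described in the text.
floor_area_m2 = 1_000
ceiling_height_m = 3
air_change_rate_per_hour = 200  # upper end of the quoted 100-200 range

# Total volume of air in the room.
room_volume_m3 = floor_area_m2 * ceiling_height_m

# Volume of air that must pass through the cooling system each hour.
air_cooled_per_hour_m3 = room_volume_m3 * air_change_rate_per_hour

print(room_volume_m3)          # 3000
print(air_cooled_per_hour_m3)  # 600000
```

Halving the ACR, or containing the cooled volume (as discussed later), scales this figure down linearly, which is where the cost savings come from.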
In the example of the 1,000 m² data centre, there is a lot of air being cooled that is doing little in the way of cooling the IT equipment. By containing the cooling air flows and directing them where they are most needed, much higher efficiencies can be obtained. The use of hot and cold aisles makes this possible, and it does not necessarily require large investment in technology. In many cases, it is possible to take existing racks and use polycarbonate sheeting over the aisles, with plastic doors at the end of the rows, to create a sufficiently contained environment to bring the volume that needs to be temperature controlled down by an order of magnitude. If the racks have been rearranged to ensure that there are fewer hot spots, the use of blanking plates and flow-direction plates can ensure that cooling air is directed exactly where it is most needed, lowering the ACR and saving further energy.

This is where computational fluid dynamics (CFD) comes in useful. Being able to map air flows and carry out 'what-if?' scenarios enables air flows to be optimised, ensuring that cooling air hits the hotter areas of IT equipment effectively and is not wasted by flowing over other areas. CFD software is increasingly included in data centre infrastructure management (DCIM) suites from the likes of nlyte, Emerson Network Power, Eaton and others.

Further savings can be made by reassessing the thermal envelope in which the data centre works. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) provides guidelines on best practice for data centre thermal operation, and these guidelines have been changing over the past few years. ASHRAE's original 2004 guidelines aimed for a maximum allowable temperature of around 25°C. By 2008, this had risen to around 27°C. ASHRAE's 2011 guidelines created a range of different approaches depending on the type of data centre and acceptable equipment failure rates, but moved the upper acceptable temperature as high as 45°C for less controlled environments, with more controlled data centre environments able to be run at 35°C.
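The widened envelope can be expressed as a simple check of measured inlet temperature against the upper limit for the environment in question. The sketch below uses only the two 2011 upper limits cited above; the environment names are illustrative shorthand, not ASHRAE's official class labels:

```python
# Illustrative upper temperature limits from the 2011 guidance cited in
# the text. The keys are shorthand, not ASHRAE's formal class names.
ALLOWABLE_MAX_C = {
    "controlled": 35.0,       # more controlled data centre environment
    "less_controlled": 45.0,  # less controlled environment
}

def within_envelope(inlet_temp_c: float, environment: str) -> bool:
    """True if a measured inlet temperature sits within the allowable limit."""
    return inlet_temp_c <= ALLOWABLE_MAX_C[environment]

print(within_envelope(27.0, "controlled"))       # True
print(within_envelope(40.0, "controlled"))       # False
print(within_envelope(40.0, "less_controlled"))  # True
```

The point of the wider limits is that ambient outside air will sit inside the envelope for most of the year in temperate climates, which is what makes free air cooling viable.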
At these temperatures, the amount of air mechanically cooled through the use of computer room air conditioning (CRAC) units is massively reduced. In many cases, the need for CRAC units is completely obviated, and free air cooling can be used instead.

The cost of CRAC units should not be underestimated. A measure of the overall energy effectiveness of a data centre is power usage effectiveness (PUE). This is a number derived by taking the total energy used by a data centre and dividing it by the amount of energy used for enabling a useful workload, that is, the energy provided to the IT equipment within the data centre. For many existing data centres, the PUE will be between 2 and 2.5: for every watt of energy provided to the IT equipment, between 1 and 1.5 watts are used by peripheral equipment, on the whole lighting, uninterruptible power supplies (UPSs) and cooling. Lighting is a small part of this, and a modern data centre should be run in a 'lights out' state anyway. A modern UPS should be greater than 95% energy efficient (many are now 98%+ efficient), leaving the cooling system as the biggest energy drain for calculating PUE.

If the CRAC units can be decommissioned and replaced with free air cooling, or with lower-energy systems such as adiabatic cooling, the data centre's PUE will improve dramatically and the energy requirements for the data centre overall will fall. For example, if an existing facility has a PUE of 2.5 and, through the use of advanced cooling design, this can drop to 1.5, 40% of the total energy used by the data centre can be saved.

Although the green data centre is probably not at the top of anyone's priority list at the moment, energy optimisation probably is. Through a few relatively simple steps such as those above, a step change in data centre energy usage can be gained. Through saving tens of percentage points on the data centre energy cost, the business will gain. Not only will it be able to show this saving directly against the bottom line, but it will also be able to use the saving to put the tick-mark in the green and sustainable boxes in its corporate social responsibility (CSR) statement.

This article first appeared on http://www.computerweekly.com
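The 40% figure follows from the definition of PUE: for a fixed IT load, total facility energy scales linearly with PUE. A quick sketch of the arithmetic, using an illustrative 100 kW IT load (the load itself is not from the article; the saving fraction is independent of it):

```python
def total_energy_kw(it_load_kw: float, pue: float) -> float:
    """Total facility draw: PUE = total energy / IT energy, rearranged."""
    return it_load_kw * pue

it_load_kw = 100.0  # illustrative IT load; the percentage saved does not depend on it

before = total_energy_kw(it_load_kw, 2.5)  # 250.0 kW total
after = total_energy_kw(it_load_kw, 1.5)   # 150.0 kW total

saving_fraction = (before - after) / before
print(f"{saving_fraction:.0%}")  # 40%
```

Note that the saving is expressed against the *total* energy bill; against the non-IT overhead alone, the reduction is steeper still (1.5 W of overhead per IT watt down to 0.5 W).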


About Quocirca
Quocirca is a primary research and analysis company specialising in the business impact of information technology and communications (ITC). With worldwide, native-language reach, Quocirca provides in-depth insights into the views of buyers and influencers in large, mid-sized and small organisations. Its analyst team is made up of real-world practitioners with first-hand experience of ITC delivery who continuously research and track the industry and its real usage in the markets.

Through researching perceptions, Quocirca uncovers the real hurdles to technology adoption: the personal and political aspects of an organisation's environment, and the pressures of the need for demonstrable business value in any implementation. This capability to uncover and report back on end-user perceptions in the market enables Quocirca to advise on the realities of technology adoption, not the promises.

Quocirca research is always pragmatic, business orientated and conducted in the context of the bigger picture. ITC has the ability to transform businesses and the processes that drive them, but often fails to do so. Quocirca's mission is to help organisations improve their success rate in process enablement through better levels of understanding and the adoption of the correct technologies at the correct time.

Quocirca has a pro-active primary research programme, regularly surveying users, purchasers and resellers of ITC products and services on emerging, evolving and maturing technologies. Over time, Quocirca has built a picture of long-term investment trends, providing invaluable information for the whole of the ITC community.

Quocirca works with global and local providers of ITC products and services to help them deliver on the promise that ITC holds for business. Quocirca's clients include Oracle, IBM, CA, O2, T-Mobile, HP, Xerox, Ricoh and Symantec, along with other large and medium-sized vendors, service providers and more specialist firms.

Full access to all of Quocirca's public output (reports, articles, presentations, blogs and videos) is available at http://www.quocirca.com


http://www.quocirca.com

2013 Quocirca Ltd
