Roger Keenan, MD of City Lifeline

Member Article

Keep your data cool in the city

Modern business is totally dependent on computer and data communications technology. From email, to IP-based telephony, to electronic document storage, to contact management through Facebook or cloud-based CRM systems, we all live in a world dependent on modern IT systems. They, in turn, depend on the physical environment in which they operate. That environment is essentially two things: reliable power supplies and a constant temperature.

Many organisations, in fact probably most, operate their own IT and communications systems on their own premises. In the old days, when IT was an adjunct to the business, that worked just fine. In today’s world, it can also be just fine, but only if it is properly and fully specified, installed, operated and maintained.

Of the two, temperature is the more difficult to deal with and the more common cause of failures. Equipment in a company's computer area will often work well for most of the year, but as summer arrives and temperatures climb, unexplained and erratic problems start to appear wherever the cooling system is not matched to the thermal load and begins to run out of capacity. In extreme cases, such problems can lead to lost data and corrupted databases that may take months to sort out. The hotter it gets outside, the harder the cooling system has to work to keep pace, and the weakness may not become evident until there is a really hot day.
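
The arithmetic behind this is worth a moment. As a rough sketch (hypothetical figures, not measurements from any real installation), suppose the cooling plant loses a couple of per cent of its rated capacity for every degree the outside air climbs past its design point:

def cooling_headroom_kw(it_load_kw, rated_cooling_kw, outside_temp_c,
                        design_temp_c=25.0, derate_per_deg=0.02):
    # Spare cooling capacity in kW at a given outside temperature.
    # Assumes the plant loses roughly 2% of rated capacity per degree
    # above its design point -- a simplification, but it shows how
    # headroom quietly disappears as the weather warms.
    excess = max(0.0, outside_temp_c - design_temp_c)
    effective = rated_cooling_kw * (1.0 - derate_per_deg * excess)
    return effective - it_load_kw

# A room producing 40 kW of heat, cooled by plant rated at 45 kW:
for temp in (20, 30, 35, 38):
    print(f"{temp} deg C outside: {cooling_headroom_kw(40, 45, temp):+.1f} kW headroom")

The same room that has five kilowatts of spare cooling on a mild day is several kilowatts short on the hottest day of the year, which is exactly when the erratic faults appear.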

London has a benign and temperate climate, without the major temperature swings seen in, say, New York. For most of the year, the outside temperature is below the ideal operating temperature of most electronic equipment, which is around 22 to 24 degrees C. Most of the time, that is: there can be significant exceptions. The highest temperature ever recorded in London was 37.6 degrees C, on 10 August 2003. Although London is typically 2 degrees C hotter than its surrounding environment, the highest temperature ever recorded in the UK came on that same day, at Gravesend in Kent: 38.5 degrees C. In that scorching hot summer, over 2,000 people, mostly elderly, died from the heat. In France, the same conditions caused 14,800 deaths, again mostly of elderly people.

When such extremes occur, faults, failures and overloads compound and cascade. Air conditioning is a good example. Most modern office buildings now have air conditioning, a big increase on ten, twenty or thirty years ago. An air conditioning unit does little work and consumes little power while the outside air is cooler than the office, but as the outside temperature climbs it has to work harder and harder, and its electrical consumption climbs with it. When a new air conditioner is installed, there is no obligation to tell the electricity supply company, so no-one knows about the extra load until a really hot day, when the power company's breakers overload and trip out. One example was the failure in July 2006 that shut down Oxford Street, much to the rage of the retail community.

That has a huge effect on companies whose IT equipment sits on their own premises without diesel generators or duplicated cooling. If there is too much demand and not enough supply, the electricity company will selectively switch off areas to bring the two back into balance. Unless you happen to be on a protected supply, such as the one serving the Olympic Park while the Olympics are running, that puts everyone at risk.
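
Why does a hot day multiply the electrical load so sharply? An air conditioner's coefficient of performance (COP), the kilowatts of heat it removes per kilowatt of electricity it consumes, falls as the gap between inside and outside temperatures widens. A small illustration, again with hypothetical figures:

def ac_electrical_draw_kw(heat_load_kw, cop):
    # Electrical power needed to remove a given heat load at a given COP.
    return heat_load_kw / cop

# The same 100 kW office heat load on progressively hotter days:
for label, cop in (("mild day", 4.0), ("warm day", 3.0), ("heatwave", 2.0)):
    print(f"{label}: {ac_electrical_draw_kw(100, cop):.0f} kW drawn from the grid")

The building's heat load barely changes, but its draw on the grid doubles, and so does everyone else's at the same moment.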

So what is to be done? There are two main paths to resilience and reliability. One is to build a complete on-site data centre area to modern standards. That means uninterruptible power supply (UPS) systems, duplicated for reliability; on-site diesel generators, also duplicated; and a complete cooling system, specified to cool the maximum load expected during the lifetime of the equipment, installed and maintained in duplicate so that cooling can continue without interruption through faults or maintenance.
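
Duplication sounds extravagant until you put numbers on it. If each unit is independently available a fraction a of the time, a duplicated pair fails only when both fail at once, with probability (1 - a) squared. A quick sketch, assuming independent failures (which real designs work hard to approach):

HOURS_PER_YEAR = 8760

def duplicated_availability(a):
    # Availability of a pair of independent, duplicated units.
    return 1.0 - (1.0 - a) ** 2

for a in (0.99, 0.999):
    single = (1 - a) * HOURS_PER_YEAR
    pair = (1 - duplicated_availability(a)) * HOURS_PER_YEAR
    print(f"availability {a}: single unit down {single:.1f} h/yr, "
          f"duplicated pair down {pair:.3f} h/yr")

A unit that is down nearly nine hours a year becomes, duplicated, a pair that is down for about half a minute.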

The other is to move the equipment to a professional colocation data centre, where all of this is handled daily, where duplicated diesel generators are the norm, and where duplicated cooling and connectivity are everyday matters. That is what most businesses that have done the analysis are doing, and it is why the data centre industry continues to grow, even during the current economic downturn.

This was posted in Bdaily's Members' News section by Roger Keenan.
