Sun 18 Jan 2009 04:00 AM

Back to basics

Getting the fundamentals right when setting up a datacentre is crucial. However, many regional enterprises continue to miss the mark with the process, which demands both knowledge and dedication.

It would be difficult to overstate the need for careful planning before starting on a datacentre. Getting the basics right when setting up a datacentre can help enterprises avoid nearly 70% of the troubles that come later.

"It is imperative to build it right the first time to avoid costly future changes. Like any infrastructure project, once the foundation is deployed, it is very difficult and often disruptive to make improvements. It is quite important to build the proper datacentre architecture from the beginning to ensure the environment can grow and change with the needs of the business in the least disruptive, lowest risk way. Building it right is especially important in markets where technical talent is costly and scarce," says Mario Blandini, director of product marketing at Brocade.

"It is absolutely critical to get the planning stage right when building a datacentre. It is about pounds and pennies. Mistakes at this stage could cost the company a lot of money. It is crucial to have the right environment, right datacentre and right power and cooling," states Tony Ward, GM for Hitachi Data Systems (HDS), MENA and Turkey.

While most agree that working hard in the initial phases to get the details right is important, many enterprises remain in the dark about where to begin the entire process, and what it should include.

"Everything from the level of security, the type of flexibility needed, and the tier level of the datacentre should be considered during this phase. Very often, some enterprises try to skip straight to product selection without having a view of their needs. This can only lead to increased costs in the long run," says Gautier Humbert, sales manager Gulf countries for structured cabling products at Ortronics Legrand.

He adds: "Enterprises need to realise that datacentres are a mix of multiple elements - software, hardware, cabling, electrical, environmental, fire protection etc. The only way to design the datacentre properly is to have specialists in each domain on the team. Then, following all the relevant standards will ensure that everyone is aware of the constraints and limits."

Getting the right team in place to plan the basics of a datacentre is the first step for an enterprise that wants to get it right.

"The ideal team to plan a datacentre within an enterprise should include IT operations, IT planning and external designers. The team should include the architect and have access to civil engineering and the internal facilities team. The typical team should provide a view of the delivery of the datacentre - what and when. Datacentres involve a high upfront investment, so business decision makers need to understand the criticality of these costs and the high return on investment they can deliver," points out Ward.

"If Sun were to be asked to assist a customer in creating a new datacentre, we would look at the following: real estate agents to find suitable locations; power available in the right capacity and with the right stability; suitable fibre connectivity; locations in stable areas with opportunities for some natural cooling; architects familiar with datacentre requirements; power distribution within the datacentre; some form of independent consultant; and project management capabilities. Teams should assess existing systems and plan a move that ensures old equipment is removed as part of the relocation process," says Gary Dowzall, service development manager at Sun Microsystems.

Many believe that most enterprises in the region need an external consultant to guide them through the initial planning process, since there continues to be a lack of trained personnel in the Middle East.

"External consultants are essential. Otherwise, enterprises might not get the basics right. These consultants are very experienced in this particular area, and they know how to handle it from both the IT and facilities sides. Leaving it to the IT team alone, which may not have much collective experience with datacentres, can prove to be a failure," states Herbert Radlinger, regional GM of Schnabel.

Rolling the ball

Once the team is in place, the next step would involve the selection of the right location and room for the datacentre.

"The room is critical. A room that is too large or too small can lead to disorganisation very quickly. In this part of the world it is also essential that the room has no exposure to the outside environment, since the external temperatures in summer will definitely play a role with regard to heat within the datacentre. The ideal scenario is a datacentre surrounded on all sides by corridors and offices. Alternatively, a fully independent facility which is double lined," says Kenneth Neil, team leader, enterprise sales for Middle East, Turkey and Pakistan at APC.

"The location of the room is quite critical. We need to take into account the ease of access, and the level of electromagnetic interference. Let's not forget that the room must have absolutely no windows, for both security and heat reasons, and must be well protected against flooding," adds Ortronics' Humbert.

Most experts also recommend that enterprises choose rooms and locations where there is an increased possibility of natural cooling to reduce potential costs for the organisation.

"Select a space with access to plenty of power and cooling, and one that is free of humidity. Many other factors, including load weight, back-up power generation, and access to telecommunications services are applicable, but the most limiting factors are the availability of electricity and the cooling capacity to operate the datacentre," says Brocade's Blandini.

In the Middle East, raised flooring has become almost the de facto option. However, while older datacentres were built with a six-inch raised floor, newer datacentres have up to 12 inches. Moreover, the way cables are handled with a raised floor has also changed significantly.

"Traditionally, the cabling has been provided from below the raised floor. This method concealed the cables to keep the room nice and tidy, but brought huge weaknesses to the datacentre: the cables block the airflow inside the raised floor, and they create additional openings in the tiles, lowering efficiency. Furthermore, making changes to the cabling system was near impossible. Today, it is widely accepted that the structured cabling should be laid above the racks, in either cable ladder or wiremesh cable tray. This greatly improves the moves, adds and changes, and no longer impedes the cooling system," points out Humbert.

Power and cooling is arguably the most difficult part of a datacentre, not only to design but also to maintain and add on to at later stages.

"Computing is becoming more and more dense. Computing power and memory requirements demand more power and generate more heat. At the same time, IT staff are demanding space-saving servers, creating more density and heat. Heat shortens equipment lifecycles within the datacentre. For every 8°C increase above 20°C, the mean time to failure (MTTF) for active equipment is reduced by 50%," explains Tarek Houbballah, senior engineering manager at Cisco in the region.
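
Houbballah's halving rule can be written as a simple derating formula: MTTF at temperature T equals the rated MTTF multiplied by 0.5 raised to (T − 20) / 8, for inlet temperatures above 20°C. A minimal sketch in Python (the 100,000-hour base figure and the function name are illustrative assumptions, not from the article):

```python
def mttf_derating(base_mttf_hours: float, inlet_temp_c: float) -> float:
    """Derate MTTF per the rule of thumb quoted above:
    MTTF halves for every 8 degC of inlet temperature above 20 degC."""
    excess = max(0.0, inlet_temp_c - 20.0)
    return base_mttf_hours * 0.5 ** (excess / 8.0)

# A hypothetical server rated for 100,000 hours at a 20 degC inlet:
print(mttf_derating(100_000, 28))  # 50000.0 (one 8 degC step -> halved)
print(mttf_derating(100_000, 36))  # 25000.0 (two steps -> quartered)
```

The exponential form makes the quote's point vividly: a rack running 16°C too hot does not lose a little life expectancy, it loses three quarters of it.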

"Cooling is one of the top two areas of concern for any datacentre manager today. The solution is to make the cooling or heat extraction as predictable and flexible as possible. Placing the air conditioning units as close to the heat generating load as possible makes sense because it reduces the distance that the air has to travel and will also alleviate any problems caused by having obstacles in the way. The room must have a 'bowling alley' layout with specific hot aisles and cold aisles, and they must use blanking panels within the racks to avoid mixing of hot and cold air," says Neil.

"Effective power and cooling considerations vary from one datacentre to another, depending on several factors. There are a number of companies where one of the big problems is not power distribution, but the actual delivery of power," says Ward.

Distribution of power within the datacentre should also be considered and planned for properly.

"Enterprises should ensure that there is enough capacity for growth. They have to consider high density requirements, and make sure of the quality of the power coming in. For cooling efficiency, there should be enough provisioning for air or water cooling. A lot of customers have basic air cooling for most of their datacentre, and prefer water for their high density clusters in the same datacentre. Apart from this, there are other methods of liquid cooling," says Radlinger.

He adds: "The market shows requirements for specific rack cooling, which provides increased capacity for air cooling. Earlier there used to be 5 to 6 kilowatts of capacity within racks, now this can go up to 10 kilowatts. We are finding more of these solutions in the market."

Puddles and pitfalls

The planning process can appear deceptively simple. In truth, enterprises can, and often do, make many mistakes, one of the most common being over- or under-specifying power and cooling capacity. To avoid this, most industry players recommend that datacentres be built with different density points within the same room, with power and cooling assigned appropriately.

"It is often recommended that new datacentres create power, server and storage pods, or areas where these resources are located. In such configurations, IT professionals may find it easier to service and grow datacentre infrastructure. Special attention needs to be paid to air filtering in areas where dust is a potential problem," says Blandini.

"Avoid designs that are built around technologies rather than data, the real asset in the datacentre. Data should be the centre, as the term implies, and above that the applications that use the data to deliver business services," he adds.

Cisco's Houbballah, on the other hand, stresses the importance of planning a datacentre to grow and scale around its networking and storage elements.

"With the levels of changes we are seeing in modern datacentres, it is important to have an open mind with regards to the new design. Many of the traditional methods are no longer appropriate," adds Sun's Dowzall.

Radlinger states, "Many enterprises do not consider the bearing load of floors. They do not plan for enough power either, not understanding the amount of power that can be used by datacentre equipment."

With raised flooring, enterprises need to remember that there is a potential for dust and rubbish to accumulate under the floor, especially if there are cables lining it. Proper maintenance should be in place to ensure that the datacentre works the way it should.

Security should also be taken into consideration, both physical and digital, with the location of the datacentre well defended from potentially malicious intruders.

"Another key element to keep in mind is to prevent any disconnect or potential problem areas between the IT and facilities teams. In order to avoid this, there should be corporate buy-in on the importance of the datacentre, and a corporate hand guiding the combined management of the datacentre by IT and facility teams," says Ward.

Experts also recommend the appointment of one central datacentre manager, who will co-ordinate equally between the IT and facilities teams, but report to a slightly higher level management, to ensure that everything runs smoothly in the datacentre. All of this must, ideally, be planned during the initial stages.

The final word

Though there are no hard and fast rules on how enterprises can plan and deploy their datacentres, there is enough documentation and experience available in the Middle East, and externally, to help any organisation on its path to setting up an efficient and cost-effective datacentre.

The only thing an organisation needs to do is pay attention and put in the effort.

Some best practices for datacentres

1. Plug holes in the raised floor

Most raised-floor environments have cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10% of the energy used for datacentre cooling.

2. Install blanking panels

When the panels are used effectively, supply air temperatures are lowered by as much as 22°F (12.2°C), greatly reducing the electricity consumed by fans in the IT equipment, and alleviating hot spots in the datacentre.

3. Co-ordinate CRAC units

Older computer room air-conditioning units (CRACs) operate independently in cooling and dehumidifying the air. These units should be tied together with newer technologies so efforts are co-ordinated.

4. Improve underfloor airflow

Older datacentres typically have constrained space underneath the raised floor that is not only used for the distribution of cold air, but also has served as a place for data cables and power cables. Many old datacentres have accumulated such a tangle of these cables that airflow is restricted, so the underfloor should be cleaned out to improve airflow.

5. Implement hot aisles and cold aisles

Newer rack layout practices instituted in the past ten years demonstrate that organising rows into hot aisles and cold aisles is better at controlling the flow of air in the datacentre.

6. Install sensors

A small number of individual sensors can be placed in areas where temperature problems are suspected.

7. Implement cold-aisle or hot-aisle containment

Once a datacentre has been organised around hot aisles and cold aisles, dramatically improved separation of cold air supply and hot exhaust air through containment becomes an option.

8. Raise the temperature in the datacentre

Many datacentres are run colder than an efficient standard. The American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) has increased the top end of allowable supply-side air temperatures from 77°F (25°C) to 80°F (26.7°C). Not all datacentres should be run at the top end of this temperature range, but a step-by-step increase, even to the 75°F (23.9°C) to 76°F (24.4°C) range, would have a beneficial effect on datacentre electrical use.

9. Install variable speed fans and pumps

Traditional CRAC units contain fans that run at a single speed. Emerging best practice suggests variable speed fans be used whenever possible. A reduction of 10% in fan speed yields a reduction in the fan's electrical use of around 27%. A 20% speed reduction yields electrical savings of around 49%.
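
The savings figures quoted above follow from the cube-law fan affinity relationship, under which a fan's electrical power scales with the cube of its speed. A quick sketch that reproduces the numbers (the function name is illustrative):

```python
def fan_power_savings(speed_reduction_pct: float) -> float:
    """Approximate electrical savings (%) from slowing a fan, using the
    cube-law fan affinity relationship: power scales with speed cubed."""
    remaining_speed = 1.0 - speed_reduction_pct / 100.0
    remaining_power = remaining_speed ** 3
    return (1.0 - remaining_power) * 100.0

print(round(fan_power_savings(10), 1))  # 27.1 -> the "around 27%" figure
print(round(fan_power_savings(20), 1))  # 48.8 -> the "around 49%" figure
```

This is why variable-speed fans pay off so quickly: even modest speed reductions during low-load periods yield disproportionately large electrical savings.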

10. Exploit "free cooling"

"Free cooling" is the general name given to any technique that cools air without the use of chillers or refrigeration units. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.

11. Design datacentres using modular cooling

Traditional raised-floor-perimeter air distribution systems have long been the method used to cool datacentres. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as more energy-efficient.

Source: Gartner
