In a perfect world, a company's primary data center would sit in a nondescript suburban location that attracts no notice -- in an office park, perhaps. It would be purpose-built, meaning constructed specifically to house the company's computer and communications gear. It would be fed by two power lines running off separate utility grids, with automated failover between them, and there would be plenty of room and air conditioning to accommodate today's smaller but far hotter-running servers.
Also, the secondary or backup data center would be as far away from the primary one as possible, and have its own staff.
Ah, perfection. As anyone who has been around this industry for more than 10 minutes knows, life in IT is a series of compromises. So instead of the ideal, the corporate data center often occupies a floor or three of the headquarters building in a big city -- wonderful for attracting younger, more recently schooled staffers, but lousy for just about everything else related to IT.
"We encourage clients to get data centers out of headquarter buildings," said Michael Bell, a research vice president at Gartner Inc., in Stamford, Conn. "The space and infrastructure needs for people are different from those for a data center," he explained. Typically, the data center needs a more controlled environment -- think factory, not office -- so the costs for data center space may be 10 times that of a regular office.
"You don't want to burden the office environment with the strict requirements needed for data centers," he said. Plus, security concerns are best met if the data center is in a building without windows and in a location that only staffers know where it is.
Even so, the reality is often a far cry from this scenario. Generally, top-level corporate executives will decide where to move -- across town, across the street or to a different state or city entirely -- based on business factors, including real estate costs, tax and other incentives, and the issues and costs of relocating people. In this situation, IT moves as part of the overall company, but that doesn't always work best.
"We've seen it on a variety of levels, where the corporate executives thought they'd made a sweetheart real estate deal but it will cost millions to make the building work," said Robert McFarlane, president of Interport Financial, a division of Shen Milsom & Wilke Inc. in New York. "You don't necessarily not do the deal, but you have to budget it right, and you might have some work to do in that building that's much more significant than you ever thought about."
To avoid this problem, it's critical to bring IT-savvy architects and designers, as well as internal IT management, into the process as soon as possible, McFarlane said.
That said, increasing numbers of companies are moving just their data centers -- because they're out of room, or because the power and cooling requirements of today's servers just can't be met by an older facility.
"The biggest reason a data center moves today is because the newer technologies are pushing the envelope of what can be powered and cooled without special considerations and designs," McFarlane said. "Any data center over five years old wasn't designed with this kind of power and heat load in mind."
Today's servers, including blades and anything 1U or 2U in height, are smaller than legacy systems but produce much more heat for the space they occupy. As older systems are replaced by racks of these smaller servers, the data center often suffers from overheating.
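A rough heat-load calculation shows why dense racks overwhelm older rooms. The sketch below uses illustrative assumptions -- a 350-watt 1U server, 40 of them in a rack -- that are not figures from this article; only the watts-to-BTU conversion (3.412 BTU/hr per watt) and the definition of a ton of cooling (12,000 BTU/hr) are standard.

```python
# Back-of-the-envelope rack heat-load estimate.
# Server count and per-server wattage are illustrative assumptions.

WATTS_PER_1U_SERVER = 350      # assumed typical draw for a 1U server
SERVERS_PER_RACK = 40          # assumed near-full 42U rack
BTU_PER_WATT_HOUR = 3.412      # standard conversion: 1 W = 3.412 BTU/hr
BTU_PER_COOLING_TON = 12_000   # standard definition of a ton of cooling

rack_watts = WATTS_PER_1U_SERVER * SERVERS_PER_RACK
rack_btu_hr = rack_watts * BTU_PER_WATT_HOUR
cooling_tons = rack_btu_hr / BTU_PER_COOLING_TON

print(f"Rack load: {rack_watts / 1000:.1f} kW")          # 14.0 kW
print(f"Heat output: {rack_btu_hr:,.0f} BTU/hr")         # ~47,800 BTU/hr
print(f"Cooling needed: {cooling_tons:.1f} tons")        # ~4 tons, one rack
```

Under these assumptions, a single full rack demands roughly four tons of cooling -- more than many older rooms were designed to deliver to an entire row.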
In an informal poll at Gartner's most recent Data Center Conference, held in December 2004, 30% of respondents named excessive heat as their biggest facilities problem, Bell said. Another 30% cited insufficient power as their top nightmare, and 21% named lack of space. In contrast, only 3% said cost was their biggest issue.
Out of all the tasks associated with moving a data center, or constructing a new one, planning is the most difficult, experts agree. "Planning involves thinking, and it means getting groups of people together and working through the scenarios: Do we turn this off and carry it, or try to do a clever hot-switch? What if the power fails? What if we drop a server?" said Paul Friday, author of a book called Move IT. Last year, he helped his employer -- a U.K.-based concern he wouldn't name -- move the primary data center out of a crumbling 20-year-old building and into a brand-new building on the outskirts of town.
In that case, planning focused on the types of applications and the options for moving them, Friday explained. It involved "lots of work on the network structure" and on addressing, plus "lots of planning of the backup cycles so that we built up working versions of each application on the new site." There was also time spent with the users of each application to understand the optimal timing for moving each system.
When it came to the actual move, which happened in stages over six weeks or so, "we paid as many suppliers as possible to move their kit for us," Friday said. A communications link between the old and new data centers made moving the data easier.
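One common way to build up working copies at the new site over such a link -- not necessarily what Friday's team did -- is repeated incremental syncs during the weeks before the move, with one short final pass at cutover. Here is a minimal sketch, assuming rsync over SSH; the host and path names are hypothetical.

```python
#!/usr/bin/env python3
"""Staged data-migration sketch: incremental syncs, then a final
catch-up pass at cutover. Hosts and paths are hypothetical; assumes
rsync and SSH access to the new site over the inter-site link."""

import subprocess

OLD_PATH = "/data/app1/"           # hypothetical source directory
NEW_HOST = "newsite.example.com"   # hypothetical destination host
NEW_PATH = "/data/app1/"           # hypothetical destination path

def sync(delete: bool = False) -> None:
    """Run one rsync pass. Pass --delete only at cutover, so the
    destination exactly mirrors the source at the moment of the move."""
    cmd = ["rsync", "-az", "--partial", OLD_PATH, f"{NEW_HOST}:{NEW_PATH}"]
    if delete:
        cmd.insert(1, "--delete")
    subprocess.run(cmd, check=True)

# In the weeks before the move: run nightly, so each pass carries only
# that day's changes and a warm copy stays ready at the new site.
sync()

# At cutover: stop the application, run one short final pass, then
# bring the application up at the new site.
sync(delete=True)
```

The point of the repeated passes is that the final, application-down sync is short, which keeps the outage window at cutover to minutes rather than hours.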
Paying attention to the details really does matter. Friday recalled one of his war stories, from a long-ago move: "A new office had a small server room in the middle, with windows and a door. The servers were installed first, then the fire and smoke detectors, and finally the glass and the door. The first person to enter the server room found that the door opened inward -- for about two inches." Then the door hit the fire detector on the ceiling. "We had to take the door off to get into the
server room."