When either selecting a data center as a co-location facility or designing a new one from scratch, what would your ideal specification be?

Fundamentally, diversified power sources, multiple ISPs, redundant generators, UPSs, cooling, and physical security are all desirable.

What are the additional key requirements that someone might not consider on the first pass?

What are the functional details someone might not consider during the initial high-level design?

I'd like to approach this from the perspective of designing a large data center or seeking a facility that was designed perfectly from an infrastructure perspective. This question has already been addressed with smaller facilities and workspace considerations here.


Solution 1:

I've been lucky enough to have built a few over the years; I'd certainly look to the following points:

  • I'd always go for multiple sites, even if that meant each individual site being smaller.
  • You're right about multiple diversely routed/sourced power supplies and good UPSs, physical security etc.
  • I won't be buying any more AC units for my data centres, choosing instead to partially filter the ambient air and use semi-sealed extraction tunnels/pipes/channels to pull the hotter air out, possibly with some form of heat exchanger to recoup energy from the heated exhaust. This approach saves a fortune, is 'greener', can support higher watts per rack, and is much more reliable/available.
  • I'd use a solid concrete floor, not raised flooring; this will obviously support higher loads (i.e. fuller racks), with overhead caging carrying mostly OM3 fibre and a few cat5/5e/6 copper runs for where they're absolutely required.
  • I'd go for fewer, faster, trunked but resilient links to my servers/switches/blades etc. than the old-school waterfall of lower-speed links.
  • With the cost of disk-based CCTV solutions getting cheaper and cheaper I'd cover every row or position in the place and record everything.
  • Every site needs two non-equipment areas: one inside the server room, fenced off from the racks, with a desk, a chair, storage for kit and tools, power, networking, etc.; and a second area outside the server room to make calls and get away from the noise.
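The fresh-air cooling point above can be roughed out with basic thermodynamics. This is a hedged back-of-the-envelope sketch, not an HVAC design: the rack load and temperature-rise figures below are illustrative assumptions, and real designs must also account for filtration pressure drop, humidity, and correlated hot spots.

```python
# Volumetric airflow needed to carry heat away with ambient air:
#   Q = P / (rho * cp * dT)
# rho: air density, ~1.2 kg/m^3 at sea level
# cp:  specific heat of air, ~1005 J/(kg*K)

def airflow_m3_per_s(watts, delta_t_kelvin, rho=1.2, cp=1005.0):
    """Airflow (m^3/s) required to remove `watts` of heat with an
    inlet-to-exhaust air temperature rise of `delta_t_kelvin`."""
    return watts / (rho * cp * delta_t_kelvin)

# Illustrative: a 10 kW rack with a 12 K temperature rise
flow = airflow_m3_per_s(10_000, 12)
print(f"{flow:.2f} m^3/s (~{flow * 2118.88:.0f} CFM)")
```

Even at these assumed figures, a 10 kW rack needs on the order of 0.7 m³/s of air, which is why the extraction channels have to be semi-sealed: leakage and recirculation quickly erode the usable temperature rise.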

I hope this is of help, I might add some more later.

Solution 2:

One thing that I don't see already posted is the budget to build a very good team of people.

I recently went cage shopping and found that pretty much all of the facilities peered with multiple tier-1 providers, had multiple diesel generators, etc.

What made me pick the one I did was that everyone there was sharp and dedicated, there were plenty of people on location, and the sales managers and project managers were also great. All the generators and peerings in the world won't help if the facility plugs all of your feeds into the same generator, or the remote hands don't respond when you really need them to.

So this may not fall under infrastructure, but in the end it can matter more than four vs. two redundant generators, two vs. three Internet peers, etc.
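The closing comparison can be put in rough numbers. This is a hedged sketch assuming independent failures and an illustrative 99% per-generator availability (real-world failures are often correlated, which is exactly the "same generator" scenario above):

```python
# Availability of N redundant, independent components where any
# one surviving is enough: 1 - (per-unit unavailability)^N

def parallel_availability(unit_availability, n):
    """Probability that at least one of n independent units is up."""
    return 1 - (1 - unit_availability) ** n

# Illustrative: each generator independently available 99% of the time
for n in (2, 3, 4):
    print(f"{n} generators: {parallel_availability(0.99, n):.8f}")
```

The diminishing returns are visible: going from two to four generators adds far fewer nines than making sure your two feeds aren't wired to the same unit, which supports the point that sharp staff can matter more than extra redundant hardware.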