What were your server room pain points, big wins and must haves? [closed]

Like many people on here I suspect, our server room has evolved over time.

At the beginning we had a single server running Microsoft Small Business Server 2000 and it sat in the corner plugged in and happily did all our emailing and firewalling.

Over time we have deployed a few more servers and they all sat on a shelf until we took the leap to using a few rack mounted servers (why do we need those when they are so expensive, asked the bean counters!) in a little mini soundproofed cabinet.

Fast-forward a few years and we now have 2 racks with lots of storage, switches, servers etc., and we are about to move it all to a brand new rack in the room. I was wondering if there are any words of wisdom, e.g. do we need to earth the rack? Opinion here is divided: some think that simply having PDUs screwed onto the rack means it is therefore earthed.

I know that proper, full-on server rooms will have all sorts of fancy kit like environmental monitoring, PDUs with power-draw displays etc., and I'm not looking for a "best practice" answer about how to run a datacentre, as that would be beyond the scope of Server Fault.

However, I suspect there are quite a few sysadmins whose server rooms have evolved in a similar way, so I'd be interested to hear the main pain points people have experienced and their "big win" and "must have" features and equipment.

My personal ones are

  • get cabinets 800mm wide rather than 600mm so you have space down the side for cables
  • use cable management arms to avoid spaghetti wiring
  • colour coded network cables can be good to see things at a glance
  • get a monitor/keyboard/mouse tray

EDIT: I removed "get rid of side panels and front/rear doors unless you really need the security" from the list above after the answer below pointed out that they are needed for good airflow.


Solution 1:

This question is primarily opinion-based, IMO. There are undoubtedly some best practices and standard guidelines for certain things, but much of this boils down to "what do you like?". Having said that, I'll keep my answer brief. As to your points:

get cabinets 800mm wide rather than 600mm so you have space down the side for cables

More room for cables is always good.

use cable management arms to avoid spaghetti wiring

I've never been a fan of cable management arms. I think they get in the way more than they help. You can achieve a clean, organized cable layout without the use of them. It takes planning and work but it can be done.

colour coded network cables can be good to see things at a glance

Sounds like a good idea. We do this. It makes it easy to know at a glance whether an interface is plugged into a back end switch, front end switch, load balancer, switch to switch connection, etc. Just don't go crazy. Having 15 different colors for cables will quickly defeat the benefit. Stick to as few colors as possible to convey the information you need to convey.

get rid of side panels and front/rear doors unless you really need the security

I disagree. Airflow through rack-mounted hardware is engineered for cabinets that have sides. Taking the sides off will probably mess up your cold aisle/hot aisle airflow. Unless you have a very compelling reason to take the sides off, my advice is to leave them on.

get a monitor/keyboard/mouse tray

If your servers have integrated out-of-band management (iLO, iDRAC, etc.) you might be able to get by without a KVM. If you do need a KVM, my advice is to get an IP-based KVM that gives you access to it over the network.
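
For illustration, here's a rough Python sketch of what "getting by without a KVM" can look like, wrapping ipmitool to reach a server's BMC over the network. The hostname and credentials are placeholders, and it assumes ipmitool is installed on your workstation and Serial-over-LAN is enabled on the BMC (iLO/iDRAC/generic IPMI).

    #!/usr/bin/env python3
    """Rough sketch: use IPMI Serial-over-LAN instead of a physical KVM tray.

    Assumes the BMC is reachable on a management network, SOL is enabled,
    and ipmitool is installed locally. Host, user and password are placeholders.
    """
    import subprocess

    BMC_HOST = "ilo-web01.example.internal"   # placeholder BMC address
    BMC_USER = "admin"                        # placeholder credentials
    BMC_PASS = "changeme"

    def ipmi(*args: str) -> None:
        """Run one ipmitool command against the BMC over the LAN."""
        subprocess.run(
            ["ipmitool", "-I", "lanplus",
             "-H", BMC_HOST, "-U", BMC_USER, "-P", BMC_PASS, *args],
            check=True,
        )

    if __name__ == "__main__":
        ipmi("chassis", "power", "status")   # is the box even powered on?
        ipmi("sol", "activate")              # attach a text console (~. to exit)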


One more thing: if you can afford it, look into managed PDUs. They're invaluable for power cycling a hung piece of equipment over the network, possibly saving you a trip to the server room at 3 AM (or whenever).
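
As a hedged illustration of what that looks like in practice, the sketch below power-cycles one outlet on a managed PDU via SNMP using net-snmp's snmpset. The OID and the "reboot" value follow APC's PowerNet MIB (sPDUOutletCtl); treat that as an assumption and check your own vendor's MIB, since other PDUs use different OIDs. Host, community string and outlet number are placeholders.

    #!/usr/bin/env python3
    """Rough sketch: remotely power-cycle one outlet on a managed PDU via SNMP.

    Assumes an APC-style PDU with an SNMP write community configured and
    net-snmp's snmpset installed locally. Verify the OID against your
    vendor's MIB before using anything like this.
    """
    import subprocess

    PDU_HOST = "pdu-rack1.example.internal"   # placeholder PDU address
    COMMUNITY = "private"                     # placeholder write community
    OUTLET = 4                                # placeholder outlet number

    # sPDUOutletCtl.<outlet>: 1 = on, 2 = off, 3 = reboot (APC PowerNet MIB)
    OID = f".1.3.6.1.4.1.318.1.1.4.4.2.1.3.{OUTLET}"

    subprocess.run(
        ["snmpset", "-v1", "-c", COMMUNITY, PDU_HOST, OID, "i", "3"],
        check=True,
    )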

Solution 2:

There are differing views on just about every bullet point.

  • You can also mount your equipment so you don't need to run cables along the side of the cabinet. For example, network switches can be mounted "in the back" of the cabinet, so you no longer need such long cables and all the cabling is done right at the back. This also reduces the need for wider cabinets.
  • Use cable management arms only if they actually provide a real benefit. If you only connect one or two power cords along with one or two Ethernet cables, there isn't much cabling to manage. Some cable management arms tend to eat cable, are inflexible and provide no benefit beyond "looks clean, but is a pain to handle".
  • Do label your cables, at both ends. At the very least label your network cables, but the same goes for all other cables as well (e.g. power cords). Usually you're looking for "the other end" of some cable, and labeling them BEFORE plugging them in makes that much easier. Having to search for "one of the 20 yellow cables", or slowly pulling on a cable ("hey John, which cable is moving?"), is not much of an improvement.
  • Install side panels and front/rear doors if your aircon/HVAC system is designed around them. Raised floors aren't limited to "real" data centers; there are also aircon systems that mount on top of a rack and require a closed rack to operate efficiently.
  • Use a monitor/keyboard/mouse tray if you really need physical console access and only have a very few racks. Some monitor/keyboard/mouse switches tend to get stuck, interfere with some keyboard layouts or require special, very expensive cabling, so I'd rather avoid them. Another option for physical console access is to mount a small LCD monitor and a small USB keyboard on a trolley or movable desk. Many recent servers offer both a front and a rear VGA connector, so you just move the trolley or desk to that rack, attach the power cord to a free socket or pre-wired cable, and you're done. Of course, you can also install a monitor/keyboard tray and run the cables to the front, so you can attach them to any box without remembering weird key combinations.

  • I do recommend setting up remote management interfaces (IPMI, HP iLO, Dell DRAC, IBM ASM, serial consoles, ...); see the sketch after this list. Even if you only use them as a replacement for "serial console"-style access rather than, say, HP's "Advanced License" features (VGA redirection), in the long run they ensure you don't have to "go to the office" at the weekend just to watch a reboot sequence or power-cycle a stuck box.

  • Another very serious recommendation is the book "The Practice of System and Network Administration" and the authors' blog http://everythingsysadmin.com/. Tom Limoncelli and Christine Hogan have gathered A LOT of real essentials you've probably never thought about.
  • After reading "the book", you can also visit http://www.opsreportcard.com/ and check how many of its questions you can honestly answer with a clear "yes, I do". Everything else is either an opportunity for a big improvement or simply not necessary at your specific site.
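
As a rough sketch of the kind of thing those remote management interfaces make possible from home, the following Python wrapper around ipmitool sweeps the power status of several BMCs and can power-cycle a stuck box. Hostnames and credentials are placeholders, and it assumes generic IPMI-over-LAN access with ipmitool installed locally; in practice you'd keep the credentials in a vault, not in the script.

    #!/usr/bin/env python3
    """Rough sketch: check and power-cycle boxes via their BMCs over the network.

    Assumes each server's IPMI/iLO/DRAC interface is reachable on a management
    network and ipmitool is installed. All names and credentials are placeholders.
    """
    import subprocess
    import sys

    BMC_USER = "admin"       # placeholder credentials
    BMC_PASS = "changeme"
    BMCS = [
        "ipmi-web01.example.internal",   # placeholder management hostnames
        "ipmi-db01.example.internal",
    ]

    def ipmi(host: str, *args: str) -> str:
        """Run one ipmitool command against a BMC and return its output."""
        result = subprocess.run(
            ["ipmitool", "-I", "lanplus", "-H", host,
             "-U", BMC_USER, "-P", BMC_PASS, *args],
            check=True, capture_output=True, text=True,
        )
        return result.stdout.strip()

    if __name__ == "__main__":
        if len(sys.argv) == 2:
            # e.g. ./bmc.py ipmi-db01.example.internal  -> power-cycle a stuck box
            print(ipmi(sys.argv[1], "chassis", "power", "cycle"))
        else:
            for host in BMCS:             # quick status sweep of every BMC
                print(host, "->", ipmi(host, "chassis", "power", "status"))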