Local timezones on servers considered harmful? [closed]
I'm curious about other administrators' experiences with timezones in the context of remotely administered servers. In my career I've come across several conventions:
- Always, always, always use UTC.
- Always, always, always use the timezone of wherever base HQ is.
- Use the local time of the people who happen to be administering.
- Use the local time of the server location.
In some places, I've come across multiple, conflicting conventions. My own preference has been to use UTC, always, with no daylight saving time. But for one reason or another, it seems that most people prefer some notion of local time, with daylight saving. Although it seems like a straightforward technical matter, discussions around changing conventions always seem to trend towards religious schism.
What are you using? What do you consider to be the advantages and disadvantages of each approach?
Solution 1:
- Hardware clock should always be UTC. Always.
- Timezone as a setting can be whatever is convenient. Usually. Sometimes it should also be UTC.
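To make the distinction concrete, here is a minimal Python sketch of the idea: if the underlying clock is UTC, the configured timezone is purely a presentation choice, and the same instant can be rendered in whatever zone is convenient. The zone names are just examples.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

# One canonical clock reading, in UTC:
now_utc = datetime.now(timezone.utc)

# The "timezone setting" only changes how that instant is displayed:
for tz_name in ("UTC", "America/New_York", "Europe/Berlin", "Asia/Tokyo"):
    local = now_utc.astimezone(ZoneInfo(tz_name))
    print(f"{tz_name:20} {local.isoformat()}")
```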
Some reasons why UTC is nice:
- Daylight saving rules change, and timezone-database updates do not always happen in a timely fashion. UTC makes this problem go away.
- When logs from servers in different locations need to be compared, UTC makes a great common standard.
- Typically, when servers are in different locations, people or applications (or both) must deal with time conversions when performing, say, database inserts. A single conversion (to UTC) is much easier to get right than a conversion from one timezone to another, where the source timezone varies by server.
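A sketch of that "convert once, to UTC" rule, assuming each server reports a naive local timestamp plus its own IANA zone name (the servers, times, and messages here are hypothetical):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_utc(naive_local: datetime, tz_name: str) -> datetime:
    """Attach the server's zone to its naive local timestamp, then normalize to UTC."""
    return naive_local.replace(tzinfo=ZoneInfo(tz_name)).astimezone(timezone.utc)

# Log entries from two servers, each stamped in its own local time:
events = [
    (datetime(2023, 3, 12, 1, 30), "America/Chicago", "db failover started"),
    (datetime(2023, 3, 12, 8, 29), "Europe/London", "replica lag alarm"),
]

# After one conversion each, sorting and comparison are trivial:
for utc_ts, tz_name, msg in sorted(
    (to_utc(ts, tz), tz, msg) for ts, tz, msg in events
):
    print(utc_ts.isoformat(), msg)
```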
Solution 2:
I prefer the fourth option: the local time of the server's location. It's the responsibility of the applications running on a server to decide whether or not to store DateTime values in UTC.
Also, when a server records system event logs, it's nice to be able to correlate local events with log entries. For example, if a data center reports a network disruption in local time, you can easily identify any issues that occurred without having to convert time values in your head.
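Whichever zone the logs end up in, doing that correlation in code rather than in your head is straightforward as long as the timestamps are zone-aware. A hedged sketch, with a made-up facility, outage window, and log entries:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

dc_zone = ZoneInfo("America/Denver")  # hypothetical data-center location

# Outage window as reported by the facility, in their local time:
start = datetime(2023, 11, 6, 1, 15, tzinfo=dc_zone)
end = datetime(2023, 11, 6, 1, 45, tzinfo=dc_zone)

# UTC-stamped log entries (hypothetical):
log = [
    (datetime(2023, 11, 6, 8, 20, tzinfo=timezone.utc), "link flap eth0"),
    (datetime(2023, 11, 6, 9, 10, tzinfo=timezone.utc), "cron: backup ok"),
]

# Aware datetimes compare correctly across zones, so correlation is one filter:
for ts, msg in log:
    if start <= ts <= end:
        print(ts.astimezone(dc_zone).isoformat(), msg)
```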
Solution 3:
No, no, a thousand times no.
There are two types of programmers... Those who understand that localtime should be used for display and formatting purposes only, and those who are painting themselves into a corner... and painting with kerosene.
All events should be recorded in UTC, and the results converted to localtime only to display them to users. Damned are those who fail to do this, and doubly-damned are those who use localtime in a format that discards the timezone information (I'm looking at you, Oracle DBAs).
Think of it like a taint check... If you convert a timespec to localtime and then do anything with it that isn't emitting it to STDOUT, your program should not only exit with a fatal error, but also delete your source to teach you a lesson.
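One way to make that discipline mechanical, sketched in Python (the function names are invented for illustration): store and compute exclusively in aware UTC, and make the only localtime exit point a function that returns a string, so the converted value can be printed but never fed back into date arithmetic.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def record_event() -> datetime:
    """Everything stored or computed with is aware UTC."""
    return datetime.now(timezone.utc)

def for_display(ts: datetime, tz_name: str) -> str:
    """The sole localtime exit point. Returning str (not datetime) is the
    "taint check": a string can be emitted but not used in date math, and
    isoformat() keeps the offset rather than discarding timezone information."""
    return ts.astimezone(ZoneInfo(tz_name)).isoformat()

event = record_event()
print(for_display(event, "Australia/Sydney"))  # e.g. 2023-...T...+11:00
```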