Metric time is one of those ideas I just naturally think about from time to time. Time is such a strange system: there are 60 seconds in a minute, 60 minutes in an hour, 24 hours in a day, and 365 days in most years, but 366 in every fourth year, except when the year is a multiple of 100, unless it's also a multiple of 400. Both days and years are astronomically defensible, even though they aren't easy to reason about. But just about everything else about time is arbitrary.
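To see just how convoluted that leap-year rule is, here's a minimal sketch of it in Python (my choice of language for illustration; nothing here is specific to any one language):

```python
def is_leap_year(year: int) -> bool:
    # Divisible by 4, except centuries, unless also divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(2000), is_leap_year(1900), is_leap_year(2024))  # True False True
```

Even compressed into one line of boolean logic, it takes three clauses to say how long a year is.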
And don’t even get this programmer started about time zones and, even worse, daylight saving time. Both took a system that was kind of comprehensible, if you squinted hard enough, and made it utterly incomprehensible and hard to systematize. Add in all the different ways countries, and regions within them, handle those two things, and it’s just a mess.
The Unix Epoch
To try to add some sense of order to this, and to let computers deal with times in a way they’re better adapted to, a standard was created early in the life of computers: Unix time, which Wikipedia defines as:
the number of seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC), Thursday, 1 January 1970, not counting leap seconds.
The specifics of that definition aren’t trivial, but you can probably already see some of its advantages. First, all times after 1969 can be expressed as positive integers (whole numbers) counted from a known and fixed point in time. We don’t need to sweat the details of lunar or solar cycles; we can just count seconds from a known value. These times are inherently less recognizable to humans: my birthday next year will range between 1391126400 and 1391212799 on the Unix timeline. You do get some of my trademarked “Useless Points” if you figure out my birthday from that, but it’s obviously not easy to tell that those numbers span a calendar day. But they do. And a computer can dependably and easily tell us when we’re in that day, because integers are easy to reason about.
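That "easy to reason about" claim is literal: asking whether a moment falls inside a calendar day is just an integer comparison. A small sketch in Python, using the two timestamps above (the `now` value is a made-up moment for illustration):

```python
import datetime

# The two Unix timestamps from the text, bounding one calendar day (UTC).
day_start, day_end = 1391126400, 1391212799

# "Are we in that day?" is plain integer math.
now = 1391150000  # a hypothetical current moment, in Unix time
print(day_start <= now <= day_end)  # True

# And if you want to collect your Useless Points, the standard library
# will happily turn a timestamp back into a human-readable UTC date:
dt = datetime.datetime.fromtimestamp(day_start, tz=datetime.timezone.utc)
```

No calendars, no month lengths, no leap rules: just three integers and two comparisons.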
The Hard Part
Obviously time’s still not easy, and dealing with all these conversions between Unix time and human time isn’t easy either. For this reason, almost all programming languages have a function or class made to do this. Some of them are really good, and some are pretty hard to use. But all of them are meant to deal with the fuzzy issue of time, and all of them depend on and refer to the idea of Unix time. Understanding it is useful for any programming that relates to time, in any language.
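As one concrete instance of those built-in conversion tools, here's how Python's standard `datetime` module goes in both directions between Unix time and human time (other languages have their own equivalents, with varying ergonomics):

```python
import datetime

# Unix time -> human-readable, pinned to UTC so the result is unambiguous.
dt = datetime.datetime.fromtimestamp(0, tz=datetime.timezone.utc)
print(dt.isoformat())  # 1970-01-01T00:00:00+00:00

# Human-readable -> Unix time: one day after the epoch is exactly 86400 seconds.
moment = datetime.datetime(1970, 1, 2, tzinfo=datetime.timezone.utc)
print(int(moment.timestamp()))  # 86400
```

Note the explicit `tz=datetime.timezone.utc`: leave it off and the library interprets timestamps in your local time zone, which is exactly the kind of squint-inducing behavior the epoch was meant to route around.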
Image Credits: alancleaver