
Epochs don't tell you what time zone you're working with and they aren't very easy to read/debug at a glance.


Fair enough @pimlottc. But most of the time I'm far more concerned with accurately capturing a moment in time than with making it instantly readable. I've also helped companies that ran into serious datetime management issues when they worked with strings: some engineers assumed one timezone, others assumed another, and chaos ensued.

Epochs may not be instantly readable, but they do force everyone onto the same page.

In any case, I tend to view a timezone as a separate piece of data from the actual moment in time (though I know others have a different paradigm): datetime = the accurate moment in time; timezone = the zone in which it should be viewed, or was captured, etc.
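To make that split concrete, here's a quick Python sketch (the epoch value and the zone name are made up for illustration): the epoch is the instant, and the timezone is just presentation metadata stored alongside it.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# The instant itself: an epoch timestamp (hypothetical value)
epoch_seconds = 1700000000

# The timezone, stored as a separate piece of data
display_tz = "America/New_York"

# One unambiguous moment in time...
instant = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)

# ...rendered for whichever zone the metadata says to use
local_view = instant.astimezone(ZoneInfo(display_tz))

print(instant.isoformat())     # the moment, in UTC
print(local_view.isoformat())  # the same moment, viewed in one zone
```

Both lines print the same instant; only the rendering differs, which is the point: everyone agrees on the number, and the zone is an overlay.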


So by epoch, obviously you mean the number of seconds since midnight (00:00) on Jan 1 1970, right? Ah... but then... UTC? Or TAI? There's a 35-second difference, after all. There's a school of thought that the UNIX epoch counts from 1970-01-01 00:00:10 TAI...


POSIX specifies the Epoch to be UTC and has since at least 2001. People may have other opinions on how it should be specified, but if you're going to follow POSIX as it exists, you're not left with a choice in the matter.

http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_...


But POSIX also claims there are 86400 seconds in a day, which is not always true for UTC. There are two ways of dealing with that. POSIX says the correct way is to just count seconds, but reset at every UTC midnight to (number of days since 1970-01-01) * 86400, which means that when a leap second is inserted, some epoch numbers are ambiguous (or, if a leap second were ever deleted, skipped). NTP ignores POSIX and says the way to deal with this is to vary the length of a second during a day which contains a leap second.
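You can see the ambiguity with a small sketch of the POSIX formula (a simplified version, assuming whole days and a real inserted leap second, the one at the end of 2016-12-31):

```python
# POSIX "seconds since the Epoch": calendar arithmetic that assumes
# every day has exactly 86400 seconds.
def posix_timestamp(days_since_epoch: int, hh: int, mm: int, ss: int) -> int:
    return days_since_epoch * 86400 + hh * 3600 + mm * 60 + ss

# 2016-12-31 was day 17166 since 1970-01-01, and it ended with the
# inserted leap second 23:59:60. Under the POSIX formula, that leap
# second and the first second of 2017-01-01 collapse onto the same
# epoch number:
leap = posix_timestamp(17166, 23, 59, 60)   # 2016-12-31 23:59:60 UTC
next_day = posix_timestamp(17167, 0, 0, 0)  # 2017-01-01 00:00:00 UTC

print(leap == next_day)  # True: two distinct UTC instants, one timestamp
```

That collision is exactly the ambiguity described above: a POSIX timestamp near a leap second can name two different real-world instants.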

And we're talking about JSON here, so isn't the ECMA-262 standard for dates more relevant than the POSIX standard? ECMAScript has some very fuzzy ideas about dates.


POSIX specifies that there are 86400 seconds in a day. POSIX is not making claims about reality, it is specifying its own reality. That's what standards do.

ECMA-262 isn't really relevant at all, since it's not the (or even an) authority on JSON. JSON was simply derived from it -- in an incompatible way at that. It's doubly irrelevant since you were talking about Unix, so that's what I was addressing.

By the way, your phrasing is odd/confusing. You're talking about "epochs" in a strange way. In Unix/POSIX land, there is one epoch, "The time zero hours, zero minutes, zero seconds, on January 1, 1970 Coordinated Universal Time (UTC).". Unix timestamps are derived from the epoch, they do not define it.


I had not thought of that. Ha!

I would go for UTC, since that is what 99.99% of people think of (and so few even know about TAI, except for the smart ones like @jameshart! :-) )


Another annoyance is seconds since epoch (traditional Unix) vs milliseconds since epoch (e.g. Java).


And JavaScript is milliseconds as well. I think either would be fine, as long as it's an agreed standard. Personally, I use ms, because most programming languages can convert instantly, without any additional math. It's probably a bit wasteful, but then again, a factor of 1000 is only about 10 extra bits per timestamp...
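The seconds-vs-milliseconds conversion really is just a factor of 1000 either way. A quick Python sketch (the timestamp value is made up):

```python
from datetime import datetime, timezone

# A Java/JavaScript-style timestamp: milliseconds since the epoch
# (hypothetical value)
ms_epoch = 1700000000000

# Python's fromtimestamp() expects seconds, so scale down by 1000
dt = datetime.fromtimestamp(ms_epoch / 1000, tz=timezone.utc)

# And scale back up to get the Java/JS representation again
back_to_ms = int(dt.timestamp() * 1000)

print(back_to_ms == ms_epoch)  # True: a lossless round trip
```

The usual bugs here aren't in the math but in forgetting which unit a given timestamp is in; a 1970-era "date" showing up in a UI is the classic symptom of treating milliseconds as seconds.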



