
The article is wrong or misleading.

POSIX (and ISO C) time_t is not supposed to "see" the additional leap second at all. POSIX time_t is defined so that every day effectively has exactly 86400 seconds, with no fractional parts. Seconds as defined by POSIX therefore can't all last exactly as long as atomic seconds, even on days when a leap second occurs.

Wikipedia article confirms:

"Every day is treated as if it contains exactly 86400 seconds"
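This "86400 seconds per day" rule can be sanity-checked directly; a minimal sketch (the dates and arithmetic here are my illustration, not from the article):

```python
import calendar

# Because POSIX treats every day as exactly 86400 s, midnight UTC of
# any date lands on an exact multiple of 86400 -- even 2017-01-01,
# immediately after the 2016-12-31 leap second, which the POSIX
# timestamp simply does not count.
t = calendar.timegm((2017, 1, 1, 0, 0, 0))
print(t)                    # 1483228800
assert t == 17167 * 86400   # exactly 17167 whole days since the 1970 epoch
assert t % 86400 == 0
```

Had the 27 leap seconds inserted between 1970 and 2017 been counted, that timestamp would not be a round multiple of 86400.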

But the fact that these seconds don't all last the same, and therefore aren't "the same" as "atomic clock" seconds, shouldn't matter to normal users.

The graphs in the article with fractions of a second going backwards just show poor implementations in specific operating systems, libraries or programs. That is not something the POSIX standard prescribes.

The confusion of common programmers, like the author of the article or those who implemented the "backwards" behavior, comes from not understanding what they're working with. Most users of most computers don't have an atomic clock, so they can't count atomic-clock seconds either. What "normal" computers have are clocks that are much less precise. It's exactly for that kind of use that POSIX defines time_t with exactly 86400 seconds per day: the absolute error is at most one "atomic" second per roughly half a year, which is smaller than the error of all the clocks directly available to normal users in their normal computers.

So "normal" programs doing common human-related scheduling should not even try to care about the leap second. Use something like smeared time as a reference:

https://developers.google.com/time/smear

The SI seconds in that article are the "real atomic clock seconds" -- but caring about them isn't needed for normal human-related computing tasks. If you have a real atomic clock, by all means synchronize it with other atomic clocks. If you have a normal computer, use smeared time. Then there will be no "jumps" at all.
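The idea behind the smear can be sketched in a few lines; this is a toy linear ramp in the spirit of the Google page above, with illustrative constants (a 24-hour window), not their implementation:

```python
# Linear leap-second smear sketch (assumed 24-hour window): over the
# smear window each local second runs slightly long, so the extra SI
# second is absorbed gradually and the clock never steps backwards.
SMEAR_WINDOW = 86400.0  # window length in SI seconds (assumption)

def smeared_offset(t_into_window: float) -> float:
    """Seconds the smeared clock lags atomic time, t_into_window SI
    seconds into the window; ramps linearly from 0.0 to 1.0."""
    return min(max(t_into_window / SMEAR_WINDOW, 0.0), 1.0)

assert smeared_offset(0.0) == 0.0       # smear not yet started
assert smeared_offset(43200.0) == 0.5   # halfway: half a second absorbed
assert smeared_offset(86400.0) == 1.0   # full leap second absorbed
```

The point is the monotonic ramp: at no instant does the smeared timestamp decrease, unlike the "backwards" fraction-of-a-second graphs in the article.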

Leave the leap second to the astronomers and others doing the "hard" time tasks; they have to care, and they have their own software for that.



I really don't think POSIX intends seconds to vary in size. And while it's hard to measure the difference over a year with a typical clock, it's very easy to measure the skew over a single day. Skewing over a day is not something only atomic clocks need to worry about. And skewing over more than a day would imply you want to get dates wrong, which stretches the language quite a bit...


Agree. In other words, computers should use UT1 (where one day is a rotation of the earth and 86400 seconds; consequently a second is not an SI second).


Correct -- in practice the time_t second is already not an "SI second", the latter being defined by counting periods in "atomic" clocks:

"the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom" (at a temperature of 0 K) -- https://en.wikipedia.org/wiki/Second

That SI second is what I refer to when I mention an "atomic clock second."

The time_t second is effectively simply one 86400th of a day.
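How far off is such a second? A back-of-the-envelope sketch (my numbers, not from the thread), assuming a mean solar day that runs about 1 ms over 86400 SI seconds, a plausible excess length-of-day figure:

```python
# If the day is 86400.001 SI seconds long (assumption), then a "second"
# defined as 1/86400 of that day is fractionally longer than an SI second.
day_in_si_seconds = 86400.001            # assumed day length
ut1_style_second = day_in_si_seconds / 86400
assert ut1_style_second > 1.0
# Relative error ~1.16e-8 -- far below what an ordinary computer's
# quartz clock can hold anyway.
assert abs(ut1_style_second - 1.0) < 2e-8
```

That relative error is exactly why, as argued above, ordinary computers lose nothing by treating the day as 86400 equal seconds.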



