
Currently software has to be built to accommodate leap seconds. They happen frequently enough that you'll find out within a few years whether your software breaks when time suddenly skips forward or backward.

If we kick the can down the road such that eventually we'll need to add a leap minute, we're going to end up with software that was never written to expect time to change in such a way, that hasn't had a real-world test of the change for decades, and that will have no one working on it who ever had to deal with such a change.

It's going to be much worse for software reliability to have a leap minute on the order of once a century than a leap second every few years.
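A minimal sketch of the classic failure mode (Python here, but the principle is language-agnostic): measuring a duration with the wall clock breaks the moment the clock steps, while a monotonic clock does not.

    import time

    def do_work_for_a_while():
        sum(range(10**6))  # stand-in workload

    # Wall-clock timing: if the clock steps backwards between the two
    # reads (leap second, NTP correction), "elapsed" can come out negative.
    start = time.time()
    do_work_for_a_while()
    elapsed_wall = time.time() - start  # can be < 0 across a backwards step

    # Monotonic timing never goes backwards, so it is the right tool
    # for durations, timeouts and rate limiting.
    start = time.monotonic()
    do_work_for_a_while()
    elapsed_mono = time.monotonic() - start  # always >= 0

    print(elapsed_wall, elapsed_mono)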


Why would we ever need a leap minute? Just let time zones drift. Unless you happen to be at the exact longitude for which your time zone is correct, the sun's not at its highest point in the sky for you at exactly noon anyway.

Eventually, thousands of years from now when time has drifted by an hour or more (assuming modern technological civilization even still exists by then), each jurisdiction can just change their time zone's offset from UTC, without coordinating with anyone else. Jurisdictions making time zone changes is a well-understood situation that we already know how to deal with.
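This is also already encoded in software: the tz database records every historical offset change, so a jurisdiction moving its UTC offset is just another tzdata release. A sketch with Python's zoneinfo, using Europe/Moscow's 2014 move from UTC+4 to UTC+3:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    msk = ZoneInfo("Europe/Moscow")

    # The tz database knows the offset changed on 2014-10-26.
    before = datetime(2014, 10, 1, 12, 0, tzinfo=msk)
    after = datetime(2014, 11, 1, 12, 0, tzinfo=msk)

    print(before.utcoffset())  # 4:00:00
    print(after.utcoffset())   # 3:00:00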


The last part is right and is also why we have this problem.

Leap seconds are an architectural blunder that always belonged in the abstraction layer that lines up the sun with the rotation of earth (the time zone abstraction). It never belonged in the part that counts seconds.


That's a great way of putting it!


This is definitely the most interesting point in this whole comment chain.


You're right, but I'd argue this problem is already here.

Thanks to glaciers melting, Earth's rotation is (temporarily) accelerating. Because of that, positive leap seconds, once regular, haven't happened since 2017 - so there could very well be (recent) software out there with that code path broken, and nobody has noticed yet.

And due to the exact same geophysical effect we might see a negative leap second - something that has never happened before. What are the odds that every single piece of software gets that one right?


Interesting point. Just like a skater pulling their arms in, snow melting and running to lower ground would make the earth spin faster (and let's add in the extra erosion). Sea levels are rising though (due to both thermal expansion and runoff).

It'd be interesting to estimate the size of the various effects (no doubt I've missed plenty of others), but is it really true that the change in sign of Earth's angular acceleration is down to climate change?


I don’t think that’s right. I would think the effect of melting mountain glaciers would be negligible, while the poles melting and sending that water away toward the equator would slow down rotation, like an ice skater pushing their arms out.


All the ice isn't /exactly/ at the poles - plenty of it sits well down the sides too. Also, ice is less dense than water.


The rate of drift is so slow that people will never care.

Even over thousands of years when an hour of drift has accumulated there won’t be a manual adjustment - people will have just gotten used to different times of day having sunlight, with generations having been born and died with mean solar noon happening at 11am.

Eventually the rotation of the earth may change enough that drift accumulates too quickly and leap time needs to be added, but that’s only going to be true thousands to tens of thousands of years in the future.


I've always thought that the best way would be to just have leap seconds every six months, and have them be ±1 second no matter what - so sometimes you'd go back a second twice in a row, and then go forward a second to get back to correct.

That'd test all the software paths.


> That'd test all the software paths.

The daylight savings time bugs I've run into at pretty much every company I've ever worked at would beg to differ.


See also leap year bugs even though that concept has been around even longer than DST.


My favorite is comparing aggregated data across several time zones when DST changes happen.


Shouldn't you just aggregate based on UTC? Leap seconds still cause a problem here but daylight savings shouldn't matter.


People expect e.g. their "daily power consumption" to end at midnight local time. When they zoom in on the daily aggregates, they don't want a spike to suddenly disappear just because it belonged to the next local day and you only stored UTC-day aggregates. Even more so for billing.
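The usual fix is to key aggregates on local-day boundaries converted to UTC, and to accept that some local "days" are 23 or 25 hours long. A sketch with Python's zoneinfo, using the 2022 fall-back day in America/Los_Angeles:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    la = ZoneInfo("America/Los_Angeles")

    # Local midnights bracketing the 2022 fall-back day.
    day_start = datetime(2022, 11, 6, tzinfo=la)
    day_end = datetime(2022, 11, 7, tzinfo=la)

    # Subtracting aware datetimes gives true elapsed time: 25 hours,
    # because clocks were set back an hour during this local day.
    print(day_end - day_start)  # 1 day, 1:00:00

    # Query a UTC-keyed store with these instants:
    print(day_start.astimezone(ZoneInfo("UTC")))  # 2022-11-06 07:00:00+00:00
    print(day_end.astimezone(ZoneInfo("UTC")))    # 2022-11-07 08:00:00+00:00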


The problem is that if the originating system didn't record the time in utc, you've lost the information needed to make the time unambiguous. What's 2022-11-06 01:30 USA/San Francisco in UTC?
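(It's genuinely ambiguous: that wall-clock reading happens twice. Python models the choice with PEP 495's fold attribute; a sketch, substituting the real zone name America/Los_Angeles:)

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    la = ZoneInfo("America/Los_Angeles")

    # 01:30 occurs twice on 2022-11-06; fold picks which occurrence.
    first = datetime(2022, 11, 6, 1, 30, tzinfo=la, fold=0)   # still PDT
    second = datetime(2022, 11, 6, 1, 30, tzinfo=la, fold=1)  # already PST

    print(first.astimezone(timezone.utc))   # 2022-11-06 08:30:00+00:00
    print(second.astimezone(timezone.utc))  # 2022-11-06 09:30:00+00:00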


How does that solve comparing days with 23 to 25 hours of data when aggregated by hour? Or are you suggesting using days in UTC?


Making time less accurate just for the sake of testing in production doesn't seem great.


From 2035 it's going to be less accurate in exactly the same way. I'd argue bombcar's method is more robust.

I don't know of anything that actually requires UTC to be within 1 second of average rotation-based time; having it within 2 or 3 seconds is extremely unlikely to actually break anything. But we do generally want to have measured time roughly in line with Earth time in the medium to long term.


Today it is within, I think, ±3 hours of earth time (China being the worst offender), because of timezones and DST. If it ever gets worse than half an hour in a country, they can always change the timezone... or just, like, change businesses' opening hours and stuff.


We have summer and winter time because that turns out to be easier than changing business hours. But I was just talking about UTC really.


How about clocks on Mars? Should these have Earth leap seconds? Time synchronization is already hard enough; complicating it with adjustments based on uneven planetary body movements doesn’t make much sense in the long term.


In the longer term Mars should probably have its own time scale. When we get to further colonization maybe we can use TAI for interplanetary use (but time dilation due to relativity effects is going to be a problem when going to other stars) while still using UTC on Earth.


Better make it every other day rather than every six months.


Henceforth every minute will be "bell curve distribution around 60" seconds long, and each hour will be "bell curve distribution around 60" minutes long.

This variability will provide excellent fuzzing of all time libraries everywhere.


It would test all the software paths in production though. Bad software would still fail and need last minute patches every 6 months.


Nobody really cares about clocks being celestially "off" by a minute either.

So this isn't a once-a-century thing, it's an add-a-leap-15-minutes-once-a-millennium issue.


https://royalsocietypublishing.org/doi/10.1098/rspa.2016.040...

Odd. I was wondering how fast it actually was long term, and this rate from the historical record seems much lower. They cite 1.8 ms/century, if I'm reading it correctly, with some odd cyclical thing going on: "the change in the length of the mean solar day (lod) increases at an average rate of +1.8 ms per century."

I mean, we've added 22 seconds over 50 years. Although at the current rate it would still just be 7 minutes after a millennium :)

edit You know, nevermind, that's all covered on wikipedia. https://en.wikipedia.org/wiki/Leap_second#Slowing_rotation_o...


The rate accelerates long term, but yeah.
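A back-of-envelope comparison, assuming today's excess length-of-day is about 1.2 ms (that's the 22 s per 50 years above) and taking the paper's +1.8 ms/century tidal growth:

    DAYS_PER_CENTURY = 36525
    excess_lod = 1.2e-3   # s/day today (~22 s of drift per 50 years)
    growth = 1.8e-3       # s/day added per century (tidal, from the paper)

    centuries = 10  # one millennium

    # Linear extrapolation: today's rate held constant.
    linear = excess_lod * DAYS_PER_CENTURY * centuries
    # ~440 s - the "7 minutes" above.

    # Quadratic: integrate a linearly growing excess lod.
    quadratic = DAYS_PER_CENTURY * (excess_lod * centuries
                                    + growth * centuries**2 / 2)
    # ~3700 s, i.e. roughly an hour per millennium.

    print(linear / 60, quadratic / 60)  # minutes: ~7.3 vs ~62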


> Currently software has to be built to accommodate leap seconds

Plenty is not.

The Swift time type ignores them in its implementation. I filed an issue, and they said there were no plans to implement them.

Good choice it turns out.

Who would have thought that adding a second on random new year changeovers would be worse than letting clocks drift.

Me for one.

We can have a "leap hour" in a thousand years. Till then, do we care (I do not) if the clocks and the sun drift very slowly apart from each other?
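For what it's worth, Python's datetime takes the same stance as Swift: a second of 60 is simply unrepresentable, even though the lower-level struct_tm tolerates it when parsing:

    import time
    from datetime import datetime

    # The struct_tm level tolerates a leap second when parsing...
    t = time.strptime("2016-12-31 23:59:60", "%Y-%m-%d %H:%M:%S")
    print(t.tm_sec)  # 60

    # ...but datetime cannot represent it at all.
    try:
        datetime(2016, 12, 31, 23, 59, 60)
    except ValueError as e:
        print(e)  # second must be in 0..59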


In an enterprise environment, every leap second or TZ change triggers a ceremonious update of anything using Java or a JVM. I suspect a lot of people would rather perform a once-a-century update than the twice-yearly process they do now.


The thing is that there will be entire Java-like ecosystems that will have shorter longevity than the time between leaps. So practically all of them will have no facility for a leap at all.


Did we just create a Cobolesque Y2K situation for Java?

I can just imagine the graybeards of the future rushing ahead of the leap minute to update the JVM lest the world go up in flames yet again.


Indeed; the converse would be to add (or remove) micro-leapseconds in order to keep accurate time. That way it would happen so often that the code would get tested and we'd have working implementations (maybe we could fix the traditional Apple issues with leap years at the same time).


I’m pretty sure we would get bugs whenever the accumulated micro-leapseconds would reach a full second.


Mmm...true, some programmers even have problems with gettimeofday().

The point still stands though - common problems have tested code, uncommon/rare problems rarely have battle-hardened solutions.


It's going to be centuries until the difference between astronomical midnight and UTC midnight is more than an hour - the exact same amount of error we deliberately inflict on ourselves to have more daylight on summer evenings, and half the error that residents of Spain experience as a consequence of the political decision to join the same time zone as Berlin. If human civilization exists long enough for this to be an actual problem, we're probably going to have to figure out how to coordinate time with multiple space settlements, which is going to be a much harder problem thanks to time dilation.


Why bother? A calendar year is 365.242 days long... time is already off by about 3/4 of a day the year before a leap year, so what does adding a second here or there really do?


Time is not off by about 3/4 of a day; the calendar is. And hardly anyone cares about the planet's position relative to the Sun (along its orbit) — I mean, they can, if it is important for their occult rituals or something, but they have probably learned to work with it over the centuries already.

Earth's rotation relative to the Sun is a whole other deal.


Well actually people did, or we'd still be on the Julian calendar.


clocks get used for two distinct purposes, often at odds:

- the measurement of durations.

- the presentation of some timestamp in a way that the reader has some intuition for.

that first purpose won’t be hurt by not tracking leap seconds. actually, a lot of applications will probably measure durations more accurately by eliminating leap seconds.

if leap seconds (or minutes) really are of critical importance, we’ll reintroduce them to the presentation layer. the thing is, very few people can tell the difference between 12:01 and 12:02 without being told the “real” time. so if you’re presenting a time which is “off” by a minute because there’s no leap seconds… does it really matter?


There should really be three "layers" of time indirection.

1) Seconds since 00:00:00 UTC on 1970-01-01. This value increases by 1 each atomic second and never jumps forward/back. Call this Universal Monotonic Time.

2) The difference between when the sun is at its zenith at Greenwich and 12:00 UMT. Call this the Astronomic Drift.

3) The timezone - offset from Greenwich that makes the local clock sync up with astronomic time and also contains a DST offset if that location observes DST at that date.

By adding up 1) + 2) + 3) you end up with the "human time" at a given location at a given date.

A computer system should only ever store 1). Then, it can calculate human time when displaying it to humans.

I'm also a fan of having "local sun time" which would be the time according to the position of the sun in the sky, quantised to 15-minute slices (basically micro-timezones). It would be nice if office hours, school times, &c can be defined based on that, i.e. work starts at 9am local sun time, which will sync up better with people's biological clock and cut down on the yearly stress DST causes.
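A sketch of that layering (the names and the drift value are hypothetical, following the proposal above; in today's terms layer 1 is roughly an atomic epoch count and layer 2 is roughly UT1 minus UTC):

    from datetime import datetime, timedelta, timezone

    def human_time(umt_seconds: float,
                   astronomic_drift: timedelta,
                   tz: timezone) -> datetime:
        """Compose the three layers: a monotonic atomic count (1),
        the accumulated Earth-rotation drift (2), and the zone (3)."""
        umt_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
        return (umt_epoch
                + timedelta(seconds=umt_seconds)
                + astronomic_drift).astimezone(tz)

    # Hypothetical values: 29 leap seconds' worth of drift, a +01:00 zone.
    drift = timedelta(seconds=-29)
    cet = timezone(timedelta(hours=1))
    print(human_time(1_700_000_000, drift, cet))

Computers store only the first number; the other two layers are looked up at display time, which is exactly how tz offsets already work today.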


I agree with your separation between "duration" and "absolute point in time". But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time. You could get over this on your local machine with a local counter, but across network boundaries you need to rely on absolute differences.


> But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time.

other way around. e.g. unix time is determined by the number of seconds (as experienced by some point fixed to Earth, roughly) relative to some reference point. duration is the native format for most time systems, like UTC, because “absolute time” isn’t a thing that can be measured. faking the duration (e.g. adding leap seconds) within a time system is sort of nonsensical: we only do it to make translation across time systems (UTC, UT1, local/human timezones) simple. if that’s the justification for things like leap seconds, then better to keep the time system itself in its most native format (assuming that format is useful locally) and explicitly do those conversions only when translating, i.e. at the network boundary: when communicating with a system that doesn’t share/understand our time system(s).


I regret that the reality is such, but Unix timestamps are in sync with UTC, so they too have repeats and gaps (for positive and negative leap seconds, respectively).

The international standard for monotonic time is TAI, which has never had leap seconds, but which is also used by almost no one.
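Linux does expose a TAI clock, for what little it's used: it only differs from the realtime clock if something (typically ntpd or chrony) has handed the kernel the current TAI-UTC offset. A sketch, assuming Python 3.9+ on Linux:

    import time

    utc_now = time.clock_gettime(time.CLOCK_REALTIME)
    tai_now = time.clock_gettime(time.CLOCK_TAI)

    # Prints 37 (the TAI-UTC offset since 2017) on a machine whose NTP
    # daemon has set the kernel offset; prints 0 if nothing has.
    print(round(tai_now - utc_now))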


There has never been a negative leap second.

>If we kick the can down the road such that eventually we'll need to add a leap minute

The drift is not in a single direction. The total drift is not going to be significant for a very long time if ever.


The total drift is accelerating long term: the Moon's tidal deceleration of Earth's rotation makes the accumulated drift grow quadratically.


It will be an urgent crisis in a century, a millennium...


> There has never been a negative leap second.

That doesn't stop (e.g.) the FreeBSD folks from running tests to make sure things are fine:

* https://lists.freebsd.org/pipermail/freebsd-stable/2020-Nove...

* https://docs.freebsd.org/en/articles/leap-seconds/

Of course there's a whole lot of other userland code besides ntpd.


Nah, you just smear time faster or slower so you never need to "leap".
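That's what e.g. Google's public NTP servers do: a 24-hour linear smear, noon to noon, around a positive leap second. A sketch of the mapping for the real 2016-12-31 leap second (the code is mine; the technique is Google's published one):

    # Google-style 24h linear smear around the 2016-12-31 leap second.
    LEAP = 1483228800     # Unix time of 2017-01-01 00:00:00 UTC
    START = LEAP - 43200  # smear begins noon before (2016-12-31 12:00 UTC)
    WINDOW = 86401        # real seconds spanned, including the leap second

    def smeared_unix_time(atomic_elapsed: float) -> float:
        """Map atomic seconds since smear start to smeared Unix time.

        Inside the window the clock runs at 86400/86401 of real rate
        (~11.6 ppm slow), so 23:59:60 never appears and the two scales
        realign one second later at noon after the leap.
        """
        if atomic_elapsed <= 0:
            return START + atomic_elapsed
        if atomic_elapsed >= WINDOW:
            return START + atomic_elapsed - 1  # post-leap Unix time
        return START + atomic_elapsed * 86400 / WINDOW

    print(smeared_unix_time(0))       # 1483185600.0 (noon before)
    print(smeared_unix_time(WINDOW))  # 1483272000.0 (noon after, 1 s absorbed)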



