But it's correct. It's "a" count. Just not the count that you might always expect. And the "second" in this definition means what people usually understand as a second, as in the duration is always the same. That's all, and it's pretty useful imho.
> And the "second" in this definition means what people usually understand as a second, as in the duration is always the same.
Umm, what? It's the other way around: in UTC every second is a proper SI second. It's in Unix time that the value normally increments once per SI second but, across a positive leap second, a single value spans two of them. That's the crux of the problem.
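To make that concrete, here's the second-by-second picture around the leap second inserted at the end of 2016, written out as a little Python table (a sketch using the POSIX label arithmetic; real kernels step or smear the clock rather than ever showing a :60):

```python
# One row per SI second around the 2016-12-31 leap second, mapping each
# UTC label to its POSIX timestamp. The value 1483228800 covers two SI
# seconds: the leap second 23:59:60 and the following 00:00:00.
mapping = [
    ("2016-12-31T23:59:58Z", 1483228798),
    ("2016-12-31T23:59:59Z", 1483228799),
    ("2016-12-31T23:59:60Z", 1483228800),  # leap second
    ("2017-01-01T00:00:00Z", 1483228800),  # same value again
    ("2017-01-01T00:00:01Z", 1483228801),
]
for utc, unix in mapping:
    print(f"{utc}  ->  {unix}")
```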
Indeed, it's the opposite of what I thought. Confusing! Thanks for the clarification.
So in fact, a Unix second can be longer than intuitively expected. Which also means that two UTC timestamps with different seconds fields, e.g. 23:59:60 and the 00:00:00 right after it, can map to the same Unix timestamp.
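For concreteness, here's a minimal Python sketch of POSIX's "Seconds Since the Epoch" arithmetic, which ignores leap seconds by construction, so the :60 label collides with the midnight that follows:

```python
from datetime import date

def posix_time(y, mo, d, hh, mm, ss):
    # POSIX time: whole days since 1970-01-01 times 86400, plus the
    # second-of-day taken straight from the hh:mm:ss label. Leap
    # seconds are simply never counted.
    days = (date(y, mo, d) - date(1970, 1, 1)).days
    return days * 86400 + hh * 3600 + mm * 60 + ss

print(posix_time(2016, 12, 31, 23, 59, 59))  # 1483228799
print(posix_time(2016, 12, 31, 23, 59, 60))  # 1483228800, the leap second
print(posix_time(2017,  1,  1,  0,  0,  0))  # 1483228800, same value
```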