Those are very different scenarios; they aren't equivalent.
Leap years have to do directly with the sun: with what day and time the equinoxes fall each year. That timing noticeably drifts every year, even every day, computers or not.
Leap seconds and the like have to do with very sensitive instruments measuring time very precisely. Those time measures are entirely about machines.
That's simply not true, and that was my whole point. Leap seconds have nothing to do with precise measurement; they are nothing more than calendar adjustments, just like leap days, and they serve the same purpose: leap seconds keep sunrise and sunset in the right place over long periods of time, just as leap years keep the equinoxes and solstices in the right place. Note that leap years, an invention millennia old, already account for drift on millennial scales (the 400-year rule, for example). That disproves the theory that some modern obsession with exactitude sets leap seconds apart from leap years, because leap seconds likewise deal with significant drift on the same scales.
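To make the 400-year rule concrete, here is the full Gregorian leap-year test as a small sketch (the function name is mine, but the rule itself is the standard calendar definition): every fourth year is a leap year, except century years, except centuries divisible by 400.

```python
def is_leap_year(year: int) -> bool:
    # Gregorian rule: divisible by 4, unless it's a century year,
    # unless that century is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1900 was not a leap year, but 2000 was; the difference is
# exactly the 400-year correction, which only matters on
# centennial-to-millennial timescales.
```

The century and 400-year exceptions exist purely because the simple divide-by-4 rule accumulates visible drift over centuries, which is the same kind of long-horizon calendar bookkeeping that leap seconds perform.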
Only the madness of "smearing seconds" and other workarounds for broken software with incorrect calendar implementations makes them seem different.
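For reference, the "smearing" workaround spreads the inserted second gradually across a window (Google's public NTP servers, for instance, use a 24-hour linear smear) instead of letting the clock show the extra second. A rough sketch of the linear version, with the window length and function name being my own illustrative choices rather than any particular implementation:

```python
def smear_fraction(seconds_into_window: float, window: float = 86400.0) -> float:
    """Fraction of the leap second already absorbed into the clock,
    spread linearly over the smear window: 0.0 at the start of the
    window, 1.0 at the end, clamped outside that range."""
    return min(max(seconds_into_window / window, 0.0), 1.0)

# Halfway through a 24-hour window the clock has been slewed by
# half a second, so no minute ever contains 61 seconds.
```

The point stands: this keeps software that hard-codes 60-second minutes from breaking, at the cost of every timestamp in the window being slightly wrong.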