
> Allocate four digits for the year part of the date: a new millennium is coming -- David Martin, Norristown, Pennsylvania

2038 is 15 years away right now; this article was published 15 years before 2000.




Even in 1985 you could foresee wanting to reference a date after 2000 - the term of a thirty-year loan, for example.

Harder to see reasons past 2038 back in 1970.


> Harder to see reasons past 2038 back in 1970

I still find myself confused at this one. Did people in the 1970s not imagine that any particular program or database would "survive" for 30+ years? Was the expectation that programs written in the future could set the epoch to a later date?

Now that I think about it, I also find it interesting that the epoch date isn't configurable in any system that I'm aware of.


Unix timestamps can represent dates *before* 1970, by using negative numbers.

Dennis Ritchie has said that he picked signed 32-bit values with an epoch of 1/1/1970 because that would comfortably represent dates spanning his life - from before he was born until after his (expected) death.
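
For concreteness, here is a small C sketch (mine, not from the thread) of the window a signed 32-bit count of seconds from the 1970 epoch actually covers; it assumes a platform whose native time_t is wider than 32 bits, so both boundary values can be converted:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    /* Print the UTC date for a second count relative to the 1970 epoch. */
    static void print_utc(const char *label, int64_t seconds)
    {
        time_t t = (time_t)seconds;          /* widen to the native time_t */
        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
        printf("%s %s UTC\n", label, buf);
    }

    int main(void)
    {
        print_utc("INT32_MIN:", INT32_MIN);  /* 1901-12-13 20:45:52 */
        print_utc("INT32_MAX:", INT32_MAX);  /* 2038-01-19 03:14:07 */
        return 0;
    }

Both ends of that range sit roughly a human lifetime either side of 1970, which matches the reasoning attributed to Ritchie above.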


Nobody expected the Unix timestamp to survive so long when it was designed, and the norm was to use a different format in databases that cared about dates beyond 2038.


And memory was very precious. Why waste bits that would always be zero?


Not just bits - the Unix timestamp ossified as a 32-bit value on the PDP-11, and handling a longer timestamp would have required a lot more code to implement arithmetic on values wider than 32 bits, and would also have complicated the C language.
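
To see the cost being described, here is a rough modern C sketch of the idea (not period PDP-11 code, and the names are made up for illustration): a 64-bit quantity on a 16-bit machine with no wider native integer type has to be kept as several machine words, and every addition has to propagate carries by hand.

    #include <stdint.h>

    /* A 64-bit counter held as four 16-bit words, least-significant first,
       the way a 16-bit machine without a wider integer type would store it. */
    typedef struct {
        uint16_t w[4];
    } wide64;

    /* a += b: add word by word, carrying into the next word. */
    static void wide64_add(wide64 *a, const wide64 *b)
    {
        uint32_t carry = 0;
        for (int i = 0; i < 4; i++) {
            uint32_t sum = (uint32_t)a->w[i] + b->w[i] + carry;
            a->w[i] = (uint16_t)sum;
            carry = sum >> 16;
        }
    }

    int main(void)
    {
        wide64 t   = {{0, 0, 0, 0}};
        wide64 one = {{1, 0, 0, 0}};
        wide64_add(&t, &one);   /* tick the counter by one second */
        return (int)t.w[0];     /* 1 */
    }

Comparison, subtraction and the division needed to turn seconds into calendar dates all grow in the same way, which is the extra code and language support the comment is pointing at.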


All computer evolution up to that day happened in the 30 years prior, so how on earth could they possibly expect that their system would last such a long time?

I do recall, from the mid-eighties to the mid-nineties, that three-year-old computers were obsolete. Ten-year-old stuff was ancient history. Never would I have imagined being able to use Linux for 30 years.


What about IBM's promise, largely delivered to this day, that their System/360 hardware and its future generations would be compatible - as that line of computers was, except for the lowest-end special case? Or all the effort IBM put into emulating their older machines in the microcoded models (all but two high-end ones)?

Some time before then it had been recognized that we had a "software crisis", as outsiders put it, and IBM realized they were spending too much money supporting many different designs, some of them evolutionary with only minor changes.

Many ideas behind systems like Linux go back to the 1960s as well; see Multics, the primary inspiration for UNIX but ultimately a closed-source dead end.

For the original question we have to realize how very small many old computers were. UNIX started on small DEC minicomputers; it would have made almost no sense at all back then to allocate more than 32 bits when your maximum data address space was 56 KiB and your CPU was 16-bit (the PDP-11 family, where UNIX became big after starting on an incompatible DEC predecessor).


It's a worthy exercise to code for a while on a vintage machine with a hundred or so kilowords of memory. Puts things into perspective.


Y10K bug! Only 4 digit dates.



