I still find myself confused by this one. Did people in the 1970s not imagine that any particular program or database would "survive" for 30+ years? Or was the expectation that programs written in the future could set the epoch to a later date?
Now that I think about it, I also find it interesting that the epoch date isn't configurable in any system that I'm aware of.
Unix timestamps can represent dates *before* 1970, by using negative numbers.
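A minimal sketch of that point, assuming a system where `time_t` is signed and the C library accepts negative values (implementation-defined, but true of glibc): a negative timestamp decodes to a pre-1970 date.

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t t = -86400;            /* 86400 seconds = one day before the epoch */
    struct tm *tm = gmtime(&t);   /* may return NULL where negative time_t is unsupported */
    if (tm) {
        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", tm);
        printf("%s UTC\n", buf);  /* 1969-12-31 00:00:00 UTC on glibc */
    }
    return 0;
}
```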
Dennis Ritchie has said that he picked signed 32-bit values with an epoch of 1/1/1970 because that range would comfortably span his lifetime, from before he was born until after his (expected) death.
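The arithmetic behind that choice: a signed 32-bit counter of seconds covers about 68 years on either side of the epoch, i.e. roughly December 1901 through January 2038. A quick check, in plain standard C:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Seconds per average Gregorian year: 365.2425 days */
    const double secs_per_year = 365.2425 * 24 * 60 * 60;
    printf("range: +/- %.2f years around 1970\n", INT32_MAX / secs_per_year);
    /* prints about 68.05, i.e. late 1901 through early 2038 */
    return 0;
}
```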
Nobody expected the Unix timestamp to survive this long when it was designed, and the norm was to use a different format in databases that cared about dates beyond 2038.
Not just bits: the Unix timestamp ossified as a 32-bit value on the PDP-11, and handling a longer timestamp would have required a lot more code to implement arithmetic on values wider than 32 bits (see the sketch below), and would have complicated the C language as well.
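To make that cost concrete, here is a hypothetical sketch (written in modern C for readability, not actual PDP-11 code) of what 64-bit addition looks like when the machine can only add 16 bits at a time: four adds with manual carry propagation, which the compiler would have to emit for every timestamp operation.

```c
#include <stdint.h>
#include <stdio.h>

/* A 64-bit value stored as four 16-bit words, least-significant first. */
typedef struct { uint16_t w[4]; } u64_16;

static u64_16 add64(u64_16 a, u64_16 b) {
    u64_16 r;
    uint32_t carry = 0;
    for (int i = 0; i < 4; i++) {
        /* Add one 16-bit word at a time, carrying into the next. */
        uint32_t sum = (uint32_t)a.w[i] + b.w[i] + carry;
        r.w[i] = (uint16_t)sum;
        carry = sum >> 16;
    }
    return r;
}

int main(void) {
    u64_16 t   = {{0xFFFF, 0xFFFF, 0x0000, 0x0000}}; /* 2^32 - 1 */
    u64_16 one = {{1, 0, 0, 0}};
    u64_16 r   = add64(t, one);
    printf("%04X %04X %04X %04X\n", r.w[3], r.w[2], r.w[1], r.w[0]);
    /* prints 0000 0001 0000 0000, i.e. 2^32 */
    return 0;
}
```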
All of computer evolution up to that day had happened in the prior 30 years, so how on earth could they possibly have expected their system to last that long?
I do recall from the time frame mid-eighties to mid-nineties that three-year-old computers were obsolete. Ten-year-old stuff was ancient history. Never would I have imagined being able to use Linux for 30 years.
What about IBM's promise, largely delivered on to this day, that their System/360 hardware and future generations would be compatible? And that line of computers was, except for the lowest-end special case. What about all the effort IBM put into emulating their older machines in the microcoded models (all but two high-end ones)?
Some time before then, outsiders had begun calling it a "software crisis," and IBM had realized they were spending too much money supporting many different designs, some of them merely evolutionary with minor changes.
Many ideas behind systems like Linux go back to the 1960s as well; see Multics, the primary inspiration for UNIX but ultimately a closed-source dead end.
For the original question, we have to realize how very small many old computers were. UNIX started on small DEC minicomputers; it would have made almost no sense back then to allocate more than 32 bits when your maximum data address space was 56 KiB and your CPU was 16-bit (the PDP-11 family, where UNIX became big after starting on an incompatible DEC predecessor).
2038 is 15 years away right now; this article was published 15 years before 2000.