
Most servers are x86-64 these days, so using 32-bit signed ints barely improves performance at all. They could have gone with a 64-bit unsigned int, since the value is positive all of the time.



An x64 CPU will take a single instruction to operate on both a 64-bit and a 32-bit integer, and the CPU registers are all 64-bit, so in that context you are right that the choice between the two datatypes doesn't matter much.

However, the physical size of the integer as stored in the CPU cache, RAM, and on disk is still twice as big for a 64-bit integer. In a hypothetical worst case, you are cutting your effective CPU cache capacity and memory bandwidth in half, which is tragic.

Additionally, while most physical servers are x64, the OS, server software, and virtualization layer are still often 32-bit -- maybe for legacy reasons, maybe for performance, maybe for a lot of reasons. Upgrading that whole stack to 64 bits just for the luxury of having default 64-bit integers seems misguided.



