Using integer representations for currencies becomes very messy when dealing with more than one currency at a time.

The United States dollar is famously subdivided into cents, but is also subdivided into 'mills' (one thousand to the dollar).[0]

The Mauritanian ouguiya is divided into five khoums.

The Madagascan ariary is divided into five iraimbilanja.

The Maltese scudo is divided into twelve tarì, which are divided into twenty grani, which are divided into six piccioli.

Historically, such currencies were ubiquitous. For example, prior to 15 February 1971, the pound sterling was divided into twenty shillings, each of which was divided into twelve pence, a system that originated with Roman currency and was used throughout the British Empire.

Exchange rates are typically quoted in terms of the largest unit, whereas an integer representation of currency must be expressed in terms of the smallest unit, so correctly applying an exchange rate requires extensive information about each currency's structure. Floating-point or binary-coded decimal representations are consequently much better.
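
To make that concrete, here is a minimal Python sketch of the problem. The minor-unit table, the convert function, and the 39.65 MRU/USD rate are all hypothetical, chosen only to illustrate why an integer minor-unit representation forces every conversion through per-currency structure metadata:

  from fractions import Fraction

  # Minor units per major unit -- the 'extensive information about
  # currency structures' mentioned above. (Illustrative subset; a
  # real table needs an entry for every supported currency.)
  MINOR_UNITS = {
      "USD": 100,  # cents per dollar
      "MRU": 5,    # khoums per ouguiya
      "MGA": 5,    # iraimbilanja per ariary
  }

  def convert(amount_minor, src, dst, rate):
      # `rate` is quoted in major units (dst per src), as exchange
      # rates conventionally are, so we must scale through both
      # currencies' minor-unit ratios.
      amount_major = Fraction(amount_minor, MINOR_UNITS[src])
      # Round to the nearest minor unit of the destination currency.
      return round(amount_major * rate * MINOR_UNITS[dst])

  # 1234 US cents at a hypothetical 39.65 ouguiya per dollar:
  print(convert(1234, "USD", "MRU", Fraction("39.65")))  # -> 2446 khoums

Note that the same integer 1234 means very different amounts depending on the currency's minor-unit ratio, which is exactly why the structure table cannot be avoided.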

[0]: https://en.wikipedia.org/wiki/Mill_(currency)



