
I once argued for using BigDecimal instead of doubles in invoicing software, but had a hard time coming up with a practically relevant example.

Is there an example where it makes a noticeable difference (at least one cent in the final result) that does not involve unrealistic amounts or numbers of items?

I'm not arguing for doubles, just collecting convincing arguments.
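One sketch of an answer (assumed Java, since BigDecimal was mentioned; the price is a hypothetical invoice amount): a single realistic value can be off by a whole cent before any accumulation, if the code ever converts to cents by truncation.

```java
import java.math.BigDecimal;

public class CentDrift {
    public static void main(String[] args) {
        // 4.35 has no exact binary representation; the nearest double
        // is slightly *below* 4.35, so truncating to cents loses one.
        double price = 4.35;
        long cents = (long) (price * 100); // 434, not 435
        System.out.println(cents);         // prints 434

        // BigDecimal built from the decimal string is exact.
        BigDecimal exact = new BigDecimal("4.35");
        System.out.println(exact.movePointRight(2).longValueExact()); // prints 435
    }
}
```

Of course, `Math.round` instead of a cast would paper over this particular case, but the point is that a double holding "4.35" never held 4.35 in the first place.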




I'm curious about this too. If you were performing millions of arithmetic operations, I can see how inaccuracies might accumulate in theory, but in practice floating-point operations are intentionally designed to minimize that.

And for everyday individual transactions it's hard to see a problem. Maybe the problem arises more when you're summing up every single financial transaction for the year? But even in that case, if the smallest financial resolution is a cent, all the floating-point noise seems like it would occur many decimal places beyond that, even if you're dealing with billions of dollars.
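A quick sketch supporting that intuition (a hypothetical year of a million $0.10 charges): the double total does drift away from the exact answer, but at this scale the drift stays far below a cent.

```java
public class DriftSum {
    public static void main(String[] args) {
        // Hypothetical workload: one million charges of $0.10.
        double sum = 0.0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += 0.10;
        }
        // The exact total is $100,000.00; the double total is close but not equal.
        System.out.println(sum == 100000.0);            // prints false
        System.out.println(Math.abs(sum - 100000.0));   // tiny, well below a cent
    }
}
```

So the sum is genuinely wrong, just wrong in the low bits, which is exactly why naive equality checks against expected totals are where this bites first.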

To be clear, I wouldn't do it myself -- I'm too risk-averse, too afraid of unknown unknowns. But it is hard to see what actual real-life negative consequences there would be for 99.9% of businesses, unless I'm missing something? Like the parent commenter, I'm looking for where I'm wrong here.


What concerns me about using float for this is not epsilon. It's all the other weird edge cases and states. I don't like that two flavors of infinity and something that isn't a number are explicitly representable.
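Those states are easy to demonstrate, and they flow silently through arithmetic rather than throwing (a minimal sketch; by contrast, BigDecimal division by zero throws an ArithmeticException):

```java
public class WeirdStates {
    public static void main(String[] args) {
        double posInf = 1.0 / 0.0;   // Infinity, no exception
        double negInf = -1.0 / 0.0;  // -Infinity
        double nan = 0.0 / 0.0;      // NaN

        System.out.println(posInf + negInf); // prints NaN
        System.out.println(nan == nan);      // prints false: NaN is not equal to itself

        // NaN is neither negative, zero, nor positive:
        System.out.println(nan < 0 || nan == 0 || nan > 0); // prints false
    }
}
```

Once a NaN enters a running balance, every downstream total is NaN, and every comparison against it quietly answers false.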


Technically you can have noticeable differences before you even make it to 1 cent. For example, suppose you're trying to determine whether the result of a calculation is negative, zero, or positive. With floats/doubles, you would probably need to treat "0" as "a number sufficiently close to 0, if not exactly 0", and then remember to handle that everywhere.
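A minimal sketch of that sign problem (hypothetical balance: charge $1.10, then payments of $1.00 and $0.10):

```java
public class SignTest {
    public static void main(String[] args) {
        double balance = 1.10 - 1.00 - 0.10;
        System.out.println(balance);              // tiny nonzero residue, not 0.0
        System.out.println(Math.signum(balance)); // prints 1.0, not 0.0

        // So every zero test needs an epsilon, chosen and applied consistently:
        final double EPS = 1e-9; // assumed tolerance for illustration
        System.out.println(Math.abs(balance) < EPS); // prints true
    }
}
```

With `BigDecimal`, `new BigDecimal("1.10").subtract(new BigDecimal("1.00")).subtract(new BigDecimal("0.10")).signum()` is simply 0.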

It can also be noticeable if you're just trying to answer something like "is the invoice paid off?" Maybe your view layer shows a $0.00 balance to the end user, but the backend hasn't rounded off those extra bits from a floating-point calculation, so your backend logic says the invoice is not actually fully paid off, even though the end user has no idea what they could possibly still owe.
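A sketch of that display/backend mismatch (same hypothetical invoice as above: $1.10 charged, $1.10 paid in two installments):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PaidOff {
    public static void main(String[] args) {
        // Backend keeps a raw double balance, which carries a tiny residue.
        double balance = 1.10 - 1.00 - 0.10;

        // View layer rounds to cents for display: the user sees $0.00.
        BigDecimal shown = new BigDecimal(balance).setScale(2, RoundingMode.HALF_UP);
        System.out.println("Displayed: $" + shown);          // prints Displayed: $0.00

        // Backend logic compares the raw value: invoice is "not paid".
        System.out.println("Paid off: " + (balance <= 0.0)); // prints Paid off: false
    }
}
```

The user sees a zero balance while the system keeps nagging them, and nothing they can pay will clear a debt of 0.00000000000000008 dollars.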



