
With positive remainders you get weird quotient behavior. Why should 10/3 and -10/-3 yield different results? Besides that, the choice is not universal; different languages use different conventions.
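As a concrete sketch (mine, not from the thread) in Python: a "positive remainder" (Euclidean) definition gives 10/3 and -10/-3 different quotients, and the built-in operators of different languages already disagree for negative operands.

    def div_euclid(a, b):
        """Integer division with a remainder that is always non-negative."""
        q = a // b              # Python's floor division
        r = a - q * b
        if r < 0:               # only possible when b < 0
            q += 1
            r -= b
        return q, r

    print(div_euclid(10, 3))    # (3, 1)
    print(div_euclid(-10, -3))  # (4, 2)  -- same ratio, different quotient

    # Built-in conventions also differ: C, Java and Rust truncate toward zero
    # (-10 / 3 == -3, remainder -1), while Python floors (-10 // 3 == -4, remainder 2).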


Why should 10/3 and -10/-3 yield the same result?

I do not see where this would be of any use.

On the other hand, if you want a quotient that has some meaningful relationship with the ratio between the dividend and the divisor, there are other, more sensible definitions of integer division than the one used in modern programming languages.

You can either have a result that is a floating-point number even for an integer dividend and divisor, like in Algol, or you can define the division to yield the quotient rounded to the nearest integer, ties to even (i.e. with a remainder whose magnitude does not exceed half of the divisor).

In both cases 10/3 and -10/-3 would yield the same result, and I can imagine cases where that would be useful.
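A sketch of such a rounded definition (my reading of the parent comment, in Python): round the quotient to the nearest integer, ties to even, so the remainder's magnitude never exceeds half of the divisor and only the operands' magnitudes matter.

    def div_round_even(a, b):
        """Quotient rounded to the nearest integer, ties to even."""
        q, r = a // b, a % b            # floor division; r has the sign of b
        if 2 * abs(r) > abs(b) or (2 * abs(r) == abs(b) and q % 2 != 0):
            q += 1                      # step up to the nearest (or even) quotient
            r -= b
        return q, r

    print(div_round_even(10, 3))    # (3, 1)
    print(div_round_even(-10, -3))  # (3, -1)  -- same quotient as 10/3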

For the current definition of integer division, I do not care whether 10/3 and -10/-3 yield the same result. It does not simplify any algorithm that I am aware of, whereas having a remainder of known sign simplifies some problems by eliminating tests for the sign.
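A sketch of the kind of simplification meant here (my example, not the commenter's): with a remainder that is always non-negative for a positive modulus, wrapping an index into a ring buffer needs no sign test, whereas a C-style truncating remainder does.

    def wrap(i, n):
        """Map any integer index i into the range 0 .. n-1."""
        return i % n            # Python's % is non-negative for n > 0

    print(wrap(7, 5))   # 2
    print(wrap(-3, 5))  # 2  -- no special case for negative i

    def wrap_truncating(i, n):
        r = abs(i) % abs(n)         # emulate a C-style remainder, which
        r = -r if i < 0 else r      # takes the sign of the dividend
        return r + n if r < 0 else r    # extra sign test needed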


I was not really thinking about applications but about the mathematics. It seems a reasonable decision to me that |a / b| = |a| / |b|, i.e. that the result's magnitude should not change when only the signs of the operands change.
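A quick check (mine) of which conventions actually satisfy |a / b| = |a| / |b|: truncation toward zero preserves it, while floor division (and any positive-remainder definition) does not.

    def div_trunc(a, b):
        # C-style integer division: truncate toward zero
        q = abs(a) // abs(b)
        return q if (a < 0) == (b < 0) else -q

    a, b = -10, 3
    print(abs(div_trunc(a, b)), abs(a) // abs(b))   # 3 3  -- magnitudes match
    print(abs(a // b), abs(a) // abs(b))            # 4 3  -- floor division differs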



