
My point is that it is not an unrealistic case. In a security context, it is perfectly realistic.



I agree that the case is perfectly realistic.

> How does rust protect against such integer overflow caused by multiplication?

To me, the question was whether Rust would have been able to natively protect against this, given that it was a runtime issue.


To be fair, I would probably also run my security test suite against a debug binary if I knew this kind of thing could be caught.


In Rust, overflows generally cause a panic. So "2 + 2" will return 4, or panic if you're on a hypothetical 2-bit system. (The standard library provides .saturating_add() and .wrapping_add() for code that must never panic; see the sketch below.)

Thus, you could have caused a denial of service by crashing a Rust-based curl, but the crash would have been a well-defined panic that simply went uncaught.
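
A minimal sketch of the behaviors described above (assuming a default debug build, where overflow checks are enabled):

    fn main() {
        let max: u8 = u8::MAX; // 255

        // With plain `+`, `max + 1` would panic here in a debug
        // build: "attempt to add with overflow".

        // The explicit methods never panic:
        assert_eq!(max.saturating_add(1), 255); // clamps at u8::MAX
        assert_eq!(max.wrapping_add(1), 0);     // wraps around (two's complement)
    }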


To be clear, by default Rust only panics on integer overflow in debug builds. In release builds it will silently wrap, just as happily as C.
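
For what it's worth, a minimal sketch of that difference (note that Cargo's overflow-checks profile setting can re-enable the checks in release builds):

    fn main() {
        // Read a value the compiler can't see through, so the
        // overflow isn't rejected at compile time.
        let extra = std::env::args().count() as u8; // typically 1
        let sum = u8::MAX + extra;
        // Debug build: panics with "attempt to add with overflow".
        // Default release build: silently wraps; prints 0.
        println!("{}", sum);
    }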


To be extra extra extra clear:

1. Rust specifies that if overflow happens, it is a "program error", but the result is well-defined as two's complement wrapping.

2. Rust specifies that in debug builds, overflow must be checked and must cause a panic if it happens.

In the future, if overflow checking ever has acceptable overhead, this allows us to say that it must always be checked. But for now, you will get a well-formed result.
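
To tie this back to the multiplication question upthread, here is a minimal sketch of the opt-in style using the standard checked_mul; the alloc_size function and its inputs are hypothetical:

    // Hypothetical size computation over untrusted inputs:
    // checked_mul returns None on overflow instead of wrapping.
    fn alloc_size(count: usize, elem_size: usize) -> Option<usize> {
        count.checked_mul(elem_size)
    }

    fn main() {
        assert_eq!(alloc_size(4, 8), Some(32));
        assert_eq!(alloc_size(usize::MAX, 2), None); // overflow caught
    }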


I'm not a Rust user (yet), but I'm a little surprised that, given its emphasis on safety, Rust doesn't keep all the debug safety checks on in production by default. You could then profile and turn off only those few (or one, or none) whose removal actually turned out to provide enough real-world, measurable benefit to be worth it in that specific piece of code.

Since you would turn them off one at a time explicitly, rather than having a whole set of them disappear implicitly, you would probably also tend to have a policy of requiring a special test suite to really push the limits of any specific safety issue before you would allow yourself to turn that one off.

Obviously, if this occurred to me at first glance, it occurred to the designers, who decided to do it the other way after careful consideration, so I'm just asking why.


Basically, overflow doesn't lead to memory unsafety in isolation. That's the key of it. The worst that can happen is a logic error, and we are not trying to stop all of those with the compiler :) A 20%-100% speed hit (the figure cited in the thread) on every integer operation, to catch something that can't introduce memory unsafety on its own, is a cost we can't afford to pay.

If you want the full details, https://github.com/rust-lang/rfcs/blob/1f5d3a9512ba08390a222... is the RFC, and https://github.com/rust-lang/rfcs/pull/560 is the associated discussion; there is a lot of it.

EDIT: oh, one more thing you may or may not have picked up: one reason under/overflow in C and C++ is dangerous is that certain kinds of it are undefined behavior. It's well-defined in Rust. Just doing that alone helps, since the optimizer isn't gonna run off and do something unexpected.
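
A small illustration of that difference: the wrapping result below is guaranteed by the language to be two's complement, whereas the equivalent signed overflow in C is undefined behavior the optimizer is free to assume never happens:

    fn main() {
        // Guaranteed: wrapping arithmetic is two's complement.
        assert_eq!(i32::MAX.wrapping_add(1), i32::MIN);
    }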


Modern C compilers also let you control the integer overflow behavior. For example, with gcc's -ftrapv flag you can make the program abort on signed overflow.



