
I am surprised that dynamically linking to Rust's libstd and other common libraries is not mentioned. Rust produces fully static binaries by default (as far as Rust code is concerned; I'm not sure about libc, if used), right?



1. Operating systems don't ship Rust's stdlib (unlike C's), so such a binary would only work on the developer's own machine.

2. Rust doesn't want to commit to a stable ABI yet, so even if an OS wanted to ship libstd, it'd be logistically difficult, since the libstd version has to exactly match the compiler version.


It would still make a lot of sense to dynamically link the particular libstd used for the compilation—and ship it alongside the package—if one wanted to ship a package containing many executables, the way that e.g. ImageMagick works.
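For what it's worth, rustc can already do this within a single toolchain: the `-C prefer-dynamic` codegen flag links libstd as a shared object instead of statically, so a package could bundle the matching libstd .so (whose hashed name varies per compiler build) next to its executables. A minimal sketch:

    $ rustc -C prefer-dynamic main.rs
    $ ldd ./main | grep libstd
    libstd-<hash>.so => ...

The caveat from point 2 above still applies: the bundled libstd has to come from the exact compiler that built the binaries.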


I thought that in recent times ImageMagick acted like busybox, with all the executables symlinked to one master one, and dispatching functionality based on name. Certainly that's the way the fork GraphicsMagick works.
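A minimal Rust sketch of that busybox-style dispatch (the tool names here are made up for illustration):

    use std::env;
    use std::path::Path;

    fn main() {
        // argv[0] is whatever path the program was invoked through; with
        // symlinks like `convert -> magick`, its file name picks the tool.
        let argv0 = env::args().next().unwrap_or_default();
        let tool = Path::new(&argv0)
            .file_stem()
            .and_then(|s| s.to_str())
            .unwrap_or("")
            .to_owned();

        match tool.as_str() {
            "convert" => run_convert(),
            "identify" => run_identify(),
            _ => print_usage(),
        }
    }

    fn run_convert() { /* ... */ }
    fn run_identify() { /* ... */ }
    fn print_usage() { eprintln!("usage: magick <tool> ..."); }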


1. seems like a chicken-and-egg problem. I'm sure the Debian developers wouldn't mind shipping it if it made sense (see 2).

2. is a fair point. At this point in Rust's life I see how avoiding binary compatibility questions makes sense. Is anybody even working on it, though? Otherwise Rust could end up making system-provided, dynamically linked libraries almost impossible because of early design decisions made without consideration for backward compatibility.


> Is anybody even working on it, though?

There are no MVPs or RFCs yet, but a stable ABI is currently being discussed: https://internals.rust-lang.org/t/a-stable-modular-abi-for-r...


> Otherwise Rust could be making system-provided, dynamically linked libraries almost impossible

They're very much possible; they're just limited to using the C ABI when interacting with dynamically-linked code. More complex features can be implemented by providing thin wrappers as part of language-specific bindings.
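A minimal sketch of such a wrapper, built as a cdylib; the `mylib_sum` entry point is hypothetical:

    // Built with crate-type = ["cdylib"] in Cargo.toml; only C-compatible
    // types cross the boundary, the Rust-y types stay on the inside.
    #[no_mangle]
    pub extern "C" fn mylib_sum(data: *const i32, len: usize) -> i64 {
        if data.is_null() {
            return 0;
        }
        // SAFETY: the caller promises `data` points to `len` valid i32s.
        let slice = unsafe { std::slice::from_raw_parts(data, len) };
        slice.iter().map(|&x| i64::from(x)).sum()
    }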


As Rust becomes more prevalent in utilities, routine apps, and workflows using process-based parallelism, there will be a memory advantage to using shared libraries, so that the stdlib only needs to be resident in memory once instead of once per process. So long as it is packaged so that multiple versions can coexist, this would make sense for packagers.


Rust's dynamic linking story isn't really good because the ABI is highly unstable and Rust makes heavy use of non-erased generics.


Reified Generics?


Yeah, that's the formal term. Thank you. Languages like Java have erased generics, where you can't even put a value of type T on the stack because the generated code has to stay generic over the size of T. Super annoying to work with.
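For contrast, the equivalent is fine in Rust, because monomorphization stamps out a copy per concrete type with a known stack size. A small sketch:

    // The compiler emits separate machine code for swap_pair::<i32> and
    // swap_pair::<String>, each knowing the exact stack size of T; under
    // erased generics this layout would not be known at codegen time.
    fn swap_pair<T>(pair: (T, T)) -> (T, T) {
        let (a, b) = pair; // a and b live on the stack, by value
        (b, a)
    }

    fn main() {
        assert_eq!(swap_pair((1, 2)), (2, 1));
        let (x, y) = swap_pair(("hi".to_string(), "ho".to_string()));
        assert_eq!((x.as_str(), y.as_str()), ("ho", "hi"));
    }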

On the other hand, reified generics increase binary size. The advantage is that the result is more digestible for the optimizer, enabling inlining and type-specific optimization, which gives you more predictable performance than relying on devirtualization; but the size increase also has downsides, such as slower compile times. One good example is this recent PR that improved compile time by making parts of the Vec implementation in the standard library non-generic: https://github.com/rust-lang/rust/pull/72013
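The underlying pattern is to keep the T-dependent part of a generic function as a thin shim and push the bulk into a non-generic helper that's compiled once, instead of once per instantiation. A simplified sketch of that pattern (not the actual libstd code):

    use std::alloc::Layout;

    // Instantiated once per T, but nearly empty: it only captures T's layout.
    fn grow_capacity<T>(len: usize) -> Layout {
        grow_capacity_impl(Layout::new::<T>(), len)
    }

    // Not generic: compiled exactly once, however many Ts call into it.
    fn grow_capacity_impl(elem: Layout, len: usize) -> Layout {
        let new_cap = (len * 2).max(4);
        Layout::from_size_align(elem.size() * new_cap, elem.align()).unwrap()
    }

    fn main() {
        assert_eq!(grow_capacity::<u64>(10).size(), 160);
    }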


I was very much under the impression that Rust used monomorphization, and that there was no concept of generics at runtime (unlike, say, the CLR). Am I missing something here?


Rust supports both monomorphized and type-erased implementations, the latter via the `dyn` keyword when using a trait as a type (similar to an interface in Java).
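Both side by side, in a sketch:

    trait Greet {
        fn greet(&self) -> String;
    }

    struct World;

    impl Greet for World {
        fn greet(&self) -> String {
            "hello, world".into()
        }
    }

    // Monomorphized: a specialized copy is generated per concrete T.
    fn greet_static<T: Greet>(g: &T) -> String {
        g.greet()
    }

    // Type-erased: one copy, dispatched at runtime through a vtable.
    fn greet_dyn(g: &dyn Greet) -> String {
        g.greet()
    }

    fn main() {
        let w = World;
        assert_eq!(greet_static(&w), greet_dyn(&w));
    }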


They support both, but the general rule of thumb I've seen from the community is to use the former by default unless you have to use the latter.


Or monomorphized generics.


Not fully static by default; it links dynamically against glibc. If you want a fully static binary, you can compile with musl.
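e.g., assuming a binary named `myapp` (the target needs to be installed once):

    $ rustup target add x86_64-unknown-linux-musl
    $ cargo build --release --target x86_64-unknown-linux-musl
    $ ldd target/x86_64-unknown-linux-musl/release/myapp
        not a dynamic executable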



