This is just a modern problem in all software development, regardless of language. We are doing more complex things, we have a much bigger library of existing code to draw from, and there are many reasons to use it. Ultimately a dependency is untrusted code, and there's a long road ahead in hardening entire systems to make running arbitrary dependencies safe (if it's even possible).
In the absence of a technical solution, all the other approaches basically involve someone else having to audit and constantly maintain all that code, plus social/legal systems of trust. If it was pulled into Rust stdlib, that team would be stuck handling it, and making changes to any of that code becomes more difficult.
I'd argue that the severity varies between languages, despite the core problem being universal. Languages with comprehensive standard libraries have an advantage over those with minimal built-in functionality, where people rely on external dependencies even for the most basic things (e.g. see Java/.NET vs JS/Node). Lightweight is not always better.
> Languages with comprehensive standard libraries have an advantage
I don't see the advantage. Just a different axis of disadvantage. Take Python, for example. It has a crazy big standard library full of stuff I will never use. Some people want C++ to go in that direction too -- even though developers are fully capable of rolling their own. Similar problem with kitchen-sink libraries like Qt. "Batteries included" languages lead to a higher maintenance burden for the core team, and hence various costs that all users pay: dollars, slow evolution, design overhead, use of lowest-common-denominator non-specialised implementations, loss of core mission focus, etc.
It's a tradeoff. Those languages also have a very difficult time evolving anything in that standard library, because the entire ecosystem relies on it and expects non-breaking changes. I think Rust gets sort of the best of both worlds: dependencies are so easy to install that it's almost as good as native, but there's a diversity of options and design choices, evolution is easy, and winners naturally emerge. Those winners become as high quality as a stdlib component, because they attract people/money to work on them, but with more flexibility to change or be replaced.
> If it was pulled into Rust stdlib, that team would be stuck handling it, and making changes to any of that code becomes more difficult.
I think Rust really needs to do more of this. I work with both Go and Rust daily at work, and Go has its library game down -- the standard library is fantastic. With Rust it's really painful to find the right library and keep up with it for a lot of simple things (web, TLS, x509, base64 encoding, heck, even generating random numbers).
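To make the contrast concrete: in Go, base64 and random bytes are encoding/base64 and crypto/rand in the standard library, while in Rust the first step is picking crates. A minimal sketch of the Rust side, assuming the widely used base64 (0.22) and rand (0.8) crates; note that both APIs have changed across major versions:

    // Assumes base64 = "0.22" and rand = "0.8" in Cargo.toml -- neither lives in std.
    use base64::{engine::general_purpose::STANDARD, Engine as _};
    use rand::RngCore;

    fn main() {
        // base64 encoding: needs the `base64` crate
        let encoded = STANDARD.encode(b"hello world");
        println!("{encoded}");

        // random bytes: needs the `rand` crate (or `getrandom`)
        let mut buf = [0u8; 16];
        rand::thread_rng().fill_bytes(&mut buf);
        println!("{buf:02x?}");
    }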
I disagree. As I see it, Rust's core-lib should be for interacting with abstract features (intrinsics, registers, memory, the borrow-checker, etc.), and the std-lib should be for interacting with OS features (net, io, threads). Anything else is what Rust excels at implementing, and putting it into the stdlib would restrict the adoption of different implementations.
For example, there are currently three QUIC (HTTP/3) implementations for Rust: Quiche (Cloudflare), Quinn, and S2N-QUIC (AWS). They are all spec compliant, but they may use different SSL & I/O backends and support different options. Two of them support C/C++ bindings; two are async, one is sync.
Having QUIC integrated into the stdlib would mean that all these choices had been made beforehand and were stuck in place permanently, and likely no bindings for other languages would be possible.
Gilad Bracha has a really interesting approach to sandboxing third party libraries: Remove imports, and do everything with dependency injection. That way if you never inject say the IO subsystem, the third party code won't be able to break out. And there's no overhead, since it's all based on capabilities.
Even cooler, if you want to only expose read operations, you can wrap the IO library in another library that only exposes certain commands (or custom filtering, etc).
EDIT: I should say this doesn't work with systems programming, since there's always unsafe or UB code.
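To make the wrapping idea concrete, here's a rough Rust sketch of the capability style. This is my illustration, not Bracha's design: the Io trait and types are hypothetical, and (per the edit above) Rust can't actually enforce the no-imports part, since any crate can still reach std::fs or unsafe directly.

    use std::io;

    // Hypothetical capability trait: the only filesystem access the library gets.
    trait Io {
        fn read(&self, path: &str) -> io::Result<String>;
        fn write(&self, path: &str, data: &str) -> io::Result<()>;
    }

    // Full-powered implementation, held only by the application.
    struct RealIo;
    impl Io for RealIo {
        fn read(&self, path: &str) -> io::Result<String> {
            std::fs::read_to_string(path)
        }
        fn write(&self, path: &str, data: &str) -> io::Result<()> {
            std::fs::write(path, data)
        }
    }

    // Wrapper capability that only exposes reads; writes are refused.
    struct ReadOnly<T: Io>(T);
    impl<T: Io> Io for ReadOnly<T> {
        fn read(&self, path: &str) -> io::Result<String> {
            self.0.read(path)
        }
        fn write(&self, _path: &str, _data: &str) -> io::Result<()> {
            Err(io::Error::new(io::ErrorKind::PermissionDenied, "read-only capability"))
        }
    }

    // Third-party code never imports IO itself; it only sees what is injected.
    fn third_party_report(io: &dyn Io) -> io::Result<String> {
        io.read("config.toml")
    }

    fn main() -> io::Result<()> {
        let report = third_party_report(&ReadOnly(RealIo))?;
        println!("{report}");
        Ok(())
    }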
Yes, but a lot of the complexity is unnecessary bloat. Almost every project I've ever seen or worked on was full of unnecessary complexity. People naturally tend to over-complicate things, and all the programming books, software design books included, focus on unimportant aspects and miss all the important ones. It's incredibly frustrating.
Yet, if someone were to write a book which explained things properly (probably a 3000 word article would suffice to turn anyone into a 10x dev), nobody would buy it. This industry is cooked.
Maybe we should have a way to run every single library we use in an isolated environment, with a structure like QubesOS. Your main code is dom0, you create a bunch of TemplateVMs which are your libraries, and then you create AppVMs for using those libraries. Use network namespaces for communicating between these processes. For sensitive workloads (finance, healthcare, etc.), it makes sense to deploy something like that.
Regardless of language, really? I highly doubt that: you don't generally see such problems with C or even C++, because dependencies are more cumbersome to add, especially in a way that's cross-platform.
With C++ it's hilarious because the C++ community is so allergic to proper dependency management and also so desperate for stuff from third party libraries that the committee spends large amounts of its time basically doing dependency management for the community by baking in large features you'd ordinarily take as a dependency into the mandatory standard library.
I'm sure I'll miss some, but IIRC C++ 26 is getting the entire BLAS, two distinct delayed reclamation systems and all of the accompanying infrastructure, new container types, and a very complicated universal system of units.
All of these things are cool, but it's doubtful whether any of them make sense in a standard library; for C++ programmers, however, that's the easiest way to use them...
It's bedlam in there and of course the same C++ programmers who claim to be "worried" that maybe somebody hid something awful in Rust's crates.io are magically unconcerned that copy-pasting tens of millions of lines of untested code from a third party into absolutely every C++ program to be written in the future could be a bad idea.
> copy-pasting tens of millions of lines of untested code from a third party into absolutely every C++ program to be written in the future could be a bad idea.
Is it really that bad? (By my count, as a point of reference, the Python 3.13 standard library is just under 900k lines for the .py files.)
If something is in the standard library, then it’s written and vetted by the standard library provider, not by a random third party like you make it sound.
Maintainers of all open source standard libraries are effectively "random third parties". With heavily used ecosystem dependencies (such as Tokio, but also swaths of small libraries, such as `futures` or `regex`), the number of people who have looked at the code and battle-tested it is also huge.
On crates.io, a good heuristic is to look at two numbers: the number of dependents and the number of downloads. If both are high, it's _probably_ fine. Otherwise, I'll manually audit the code.
That's not a complete solution, especially not if you're worried about this from a security perspective, but it's a good approximation if you're worried about the general quality of your dependencies.
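If you want to script that first check instead of clicking around crates.io, something like the sketch below works. Everything in it is an assumption to verify before relying on it: the ureq 2.x and serde_json APIs, the /api/v1/crates/{name} endpoint with a crate.downloads field, and crates.io wanting a descriptive User-Agent header. (The dependents count is visible on the crate's page, or via its reverse_dependencies endpoint.)

    // Rough sketch of scripting the "is this crate widely used?" check.
    // Assumes ureq = "2" and serde_json = "1" in Cargo.toml.
    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let name = std::env::args().nth(1).unwrap_or_else(|| "regex".to_string());
        let url = format!("https://crates.io/api/v1/crates/{name}");
        let body = ureq::get(&url)
            .set("User-Agent", "dep-check-example (contact: you@example.com)")
            .call()?
            .into_string()?;
        let json: serde_json::Value = serde_json::from_str(&body)?;
        // Total downloads across all versions, per the assumed response shape.
        println!("{name}: {} downloads", json["crate"]["downloads"]);
        Ok(())
    }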
> it’s written and vetted by the standard library provider, not by a random third party
All three modern C++ standard libraries are of course Free Software. They are respectively the GNU libstdc++, Clang's libc++ and the Microsoft STL. Because it's a huge sprawling library, you quickly leave the expertise of the paid maintainers and you're into code that some volunteer wrote for them and says it's good. Sounds like random third parties to me.
Now, I'm sure that Stephan T. Lavavej (the Microsoft employee who looks after the STL, yes, nominative determinism) is a smart and attentive maintainer, and so if you provide a contribution with a function named "_Upload_admin_creds_to_drop_box" he's not going to apply that but equally Stephen isn't inhumanly good, so subtle tricks might well get past him. Similar thoughts apply to the GNU and Clang maintainers who don't have funny names.
One Stephan T. Lavavej is worth more than 1000 random GitHub Rustaceans, some of whom will be bots, AIs, rank amateurs, bought, or North Korean spies. Each of those standard libraries has one or more Stephans.
Having paid maintainers, code review, test suites, strict contribution guidelines, etc. is the state of the art for open source software, something a transitive crate dependency can only dream of achieving.
Because most dependencies are either manually installed by the user, or are dynamic libraries that are provided and audited by the distro maintainers. The dependencies are there, they're just harder to see - https://wiki.alopex.li/LetsBeRealAboutDependencies
Sure, there are various dependencies, but it's nothing like "cargo install crate-name". Cargo makes it so effortless to joink the dumbest dependency for the simplest thing.
On the other hand, C/C++ makes it attractive to reinvent the wheel, or vendor the dependency instead. Rather than a single well-tested implementation in the ecosystem for something like sha256, you end up with every application having its own slightly-different, mostly untested, and essentially unmaintained version.
Applications still need the functionality. The need doesn't magically disappear when installing dependencies is a pain. If a crate has a bug, the entire ecosystem can trivially get the fixed version. If the Stack Overflow snippet a C app is vendoring has a bug, that fix is never getting into the app.
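For the sha256 case specifically, the "one well-tested implementation" route is a few lines against the widely used sha2 crate, and a fix there reaches every dependent on the next cargo update instead of having to be re-applied to each vendored copy. A sketch, assuming sha2 0.10 and hex 0.4 in Cargo.toml:

    // Shared, maintained implementation instead of a vendored copy.
    // Assumes sha2 = "0.10" and hex = "0.4" in Cargo.toml.
    use sha2::{Digest, Sha256};

    fn main() {
        let mut hasher = Sha256::new();
        hasher.update(b"hello world");
        let digest = hasher.finalize();
        println!("{}", hex::encode(digest));
    }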
That does not help you if the bug is in one of many unmaintained crates and never gets noticed. Linux distributions aim to make sure that C applications dynamically link to the right libraries instead of vendoring the code. Then the library can be updated once. IMHO this is the only reasonable approach.
> Sure, there are various dependencies, but it's nothing like "cargo install crate-name".
You don't install a Rust crate to use it; you add it as a dependency in Cargo.toml (cargo install is for binaries). We have enough people in this thread trying to authoritatively talk about Rust without having any experience with it; please don't bother leaving a comment if you're just going to argue from ignorance.
Sure, despite all the hate it gets, it is the best experience C and C++ build tools have ever offered, except for IDE project files, and it includes IDE integration just like those project files.
I thought the whole UNIX mentality was "worse is better."
No build tool is without issues. My pain points with cargo are: always compiling from source; build caching requiring additional work to set up; and, as soon as it is more than pure Rust, a build.rs file that can get quite creative.
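For anyone who hasn't hit that last point: build.rs is an arbitrary Rust program that Cargo runs before compiling your crate, and it's where the non-Rust steps live. A minimal sketch, assuming the cc crate as a build-dependency and a hypothetical src/native.c in the repository:

    // build.rs -- Cargo runs this before compiling the crate itself.
    // Assumes cc = "1" under [build-dependencies] and a (hypothetical) src/native.c.
    fn main() {
        // Re-run this script only when the C source changes.
        println!("cargo:rerun-if-changed=src/native.c");
        // Compile and link the C file; this is where build scripts start getting creative.
        cc::Build::new().file("src/native.c").compile("native");
    }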
I also don't understand the CMake hate. With modern CMake (3.14+), it's just around 10 lines to build basic sources/libraries/executables. And you can use either CMake's FetchContent or CPM (https://github.com/cpm-cmake/CPM.cmake) to fetch dependencies; no third-party tool like vcpkg or Conan is needed.
I think CMake is the perfect balance. You need to write a few lines and think about a few things before adding a dependency, but usually nothing too crazy. It might not work on the first try, but that's okay.