C++ has editions too btw. C++11, C++14, C++17, etc. These are opt-in and allowed to break compatibility, although that is very rarely done in practice.
That's one difference. And the other important differences are:
- Rust apps can depend on library "headers" written in other editions. That's the whole deal with editions! Breaking changes are local to your own code and don't fracture the ecosystem.
- Rust has a built-in tool that automatically migrates your code to the next edition while preserving its behavior. In C++, upgrading to the next standard is left as an exercise for the reader (just like everything else). And that's why it's done so rarely and so slowly.
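To make this concrete: the edition is a per-crate setting in Cargo.toml, and the built-in migration tool is `cargo fix --edition`, which applies the mechanical source changes before you bump the field (package name below is illustrative):

```toml
# Cargo.toml: each crate opts into an edition independently;
# its dependencies keep whatever edition they were written in.
[package]
name = "my-app"        # illustrative name
version = "0.1.0"
edition = "2021"       # bump after running `cargo fix --edition`
```

The key point is that this field is local: bumping it here changes nothing for any dependency.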
Not really, because the crates they depend on cannot expose APIs with semantic changes across versions.
Also, it requires everything to be compiled from source with the same compiler.
Some C++ compilers, like clang, also provide migration tools. Note the difference, though, between ISO languages with multiple implementations and a language driven by its reference compiler.
> Not really, because the crates they depend on cannot expose APIs with semantic changes across versions.
Not sure what you're talking about. Any specific examples?
> Also it requires everything to be compiled with the same compiler, from source code.
It's not related to editions at all. It's related to not having an implicit stable ABI.
It's possible to have a dynamic Rust library that exposes a repr(C) interface, compile it into an .so using one version of the compiler, and then compile the dependent "pure Rust" crates using another compiler that just reads the metadata ("headers") of that library and links the final binary together. Same as in C and C++. You just can't compile arbitrary Rust code into a stable dynamic library by default. (You can still always compile it into a dylib that requires a specific compiler version.)
As the other commenter responded there, your example isn't about editions at all. It's about mixing ABIs and mixing multiple versions of the language runtime. Those are entirely separate issues.
You're correct that the possible changes in editions are very limited. But editions don't hinder interoperability in any way. They are designed not to. Today, there are no interoperability problems caused by editions specifically.
> compromises will be required, specially regarding possible incompatible semantic differences across editions.
That's just an assumption in your head. 4 editions later, it still hasn't manifested in any way.
Four editions later, Rust has yet to be used at the scale of C and C++ across the industry. My point in that comment is what editions will look like after 50 years of Rust history.
ABIs and multiple versions of the language runtime are part of what defines a language ecosystem, which is why editions don't really cover as much as people think they do.
Editions will look like band-aids that don't fully solve the cruft accumulated over the 50 years. That's not hard to predict. It's still a very useful mechanism that slows down the accumulation of said cruft. I've yet to see a language with a better stability/evolution story.
In any case, while as a language geek I enjoy these kinds of discussions, given the current progress in AI systems, my view is that we will get the next evolution in programming systems, so it won't matter much whether it is C, C++, Rust, C#, Java, Go, or whatever.
We (as in the industry) will eventually get reliable ways to generate applications directly as machine code, just as optimizing compilers took a couple of decades to beat hand-written assembly and generate reliably optimized code.
So editions, regardless of what they offer, might not be as relevant on that kind of timeframe, a couple of decades from now.
The machine code will always be generated from "something". A one-line informal prompt isn't enough. There will always be people who write specs. Even the current languages are already far from "machine code" and could be considered "specs", albeit low-level ones.
As the response to that comment points out, you are confusing editions for ABI changes. Different editions are purely a source-level change in the language. All editions are ABI-compatible.
Not really, because one thing that apparently I haven't gotten across is that they don't cover semantic changes, only grammar ones for the most part.
What is the compiler supposed to generate if code from edition X calls code in edition X + 10, with a lambda written in X + 5 that relies on a specific language construct whose semantics have changed across editions, maybe even more than once?
> one thing that apparently I haven't gotten across is that they don't cover semantic changes, only grammar ones for the most part.
You have gotten it across just fine.
We're trying to get across that no one is ever going to do "global" incompatible semantic changes in editions. That's been understood from the start. Exactly because of the problems that you describe.
It would work as intended? I'm not sure what problem you are trying to point out. The lambda would compile in the edition in which it is written, resulting in a code unit passed to the other crate(s). As mentioned, the ABI is stable across editions. To put it in your words: there are no semantic changes to the ABI across editions.
C++ shipping new and slightly incompatible versions of the entire language every three years isn't editions. There was a proposal to bring editions (under the name "Epochs") to C++, but it faced significant headwinds and was abandoned.