Languages like Rust have solved this through the concept of editions (or whatever you want to call it): essentially a package- or module-level “compilation mode” which allows evolving the language without breaking old code.
The issue with the Python transition is that it was way too massive and the repercussions were way too widespread (e.g. they necessarily leaked into the APIs) for this to be possible, despite Python having an entire mechanism to handle this with `from __future__` (though admittedly that being a file-level attribute makes it less than ideal for moving the language forward).
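For reference, Rust’s edition is declared per package in `Cargo.toml`, and crates on different editions are compiled by the same compiler and linked into one binary. A minimal sketch (the package name here is made up):

```toml
[package]
name = "example-crate"   # hypothetical package name
version = "0.1.0"
# Opt this crate into the 2021 edition; dependencies that still use
# the 2015 or 2018 editions compile and link into the same binary.
edition = "2021"
```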
If you can make the compiler work with both new and old code, then yes, I totally agree. But Python expected you to transpile your code to the new version and recompile, while also updating many of your dependencies. And that’s not something you can easily do in large codebases.
I’m not sure how editions in Rust work, but if the compiler can compile all editions and link them into a single binary, then it’s something I’m totally for. But if someone expects people to go through millions of lines of old code and make sure it all works fine after migration to a new version of the language, then I’m not sure your language can succeed.
> If you can make the compiler work with both new and old code, then yes, I totally agree.
Which you can; technically even Python has that mechanism (`from __future__ import <thing>` can change the language syntax).
However, the ability to use that for the 2/3 transition was basically nil, as an enormous number of APIs were impacted; the syntax changes were really the easy bits, the semantics were much more difficult to deal with (talking from experience).
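As a concrete illustration of a future import changing file-level semantics (using `annotations`, which is meaningful in current Python 3; during the 2/3 transition the equivalents were things like `print_function` and `unicode_literals`; the function below is made up):

```python
# With this future import (Python 3.7+), annotations are no longer
# evaluated at function-definition time -- they are stored as strings.
from __future__ import annotations

def parse(data: SomeTypeDefinedLater) -> dict:
    # Hypothetical function. Without the future import above, the
    # undefined name in the annotation would raise NameError as soon
    # as the `def` statement executed.
    return {"raw": data}

# The annotation is kept as the literal source string.
print(parse.__annotations__["data"])  # -> 'SomeTypeDefinedLater'
```

Note that the directive applies only to the file it appears in, which is exactly the file-level granularity criticized above.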
> I’m not sure how editions in Rust work, but if the compiler can compile all editions and link them into a single binary, then it’s something I’m totally for.