> the "winning" strategy is to have high-level scripting languages where you can ignore most of the details, which call into hyper-optimized, high performance libraries. For instance, when you're using Scipy, you call into Fortran and C almost interchangeably.
Well, no. This is Python's strategy. That doesn't make it the winning strategy. Python implicitly forces multiple languages upon you: a scripting one and a performance one. Meanwhile, languages such as Julia, Rust, etc. allow you to do the work in a single (fast/compiled) language. Much lower cognitive load, especially if you have multiple cores/machines to run on.
Another point I've been making for 30+ years in HPC is that data motion is hard. Not simply between machines, but between process spaces. Take large, slightly complex data structures in a fast compiled language and move them back and forth to a scripting front end. This is hard, as each language has its own specific memory layout for the structures, and impedance matching between them means you have to make trade-offs. These trade-offs often result in surprising issues as you scale up data structure size. Which is one of the reasons that only the simplest of structures (vectors/arrays) get implemented in cross-language scenarios.
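A minimal sketch of that impedance-matching burden, using Python's stdlib ctypes (the struct name and fields here are hypothetical, for illustration only): to share anything richer than a flat array with compiled code, you must hand-mirror the compiled side's layout field by field, and the mirror must match the C definition exactly, including order and padding, or data is silently corrupted.

```python
# Sketch: mirroring a hypothetical C struct in Python via ctypes.
# Assumed C definition on the compiled side:
#   struct particle { double pos[3]; double mass; int32_t id; };
import ctypes

class Particle(ctypes.Structure):
    # Fields must be declared in exactly the C definition's order and
    # types; ctypes then applies the native compiler's padding rules.
    _fields_ = [
        ("pos", ctypes.c_double * 3),
        ("mass", ctypes.c_double),
        ("id", ctypes.c_int32),
    ]

p = Particle((1.0, 2.0, 3.0), 5.5, 42)

# The Python object is a view over a C-layout buffer, so it can be
# passed to compiled code by pointer -- but note the struct is bigger
# than the sum of its fields (trailing padding for alignment):
print(ctypes.sizeof(Particle))   # >= 36 bytes; exact size is platform-dependent
print(list(p.pos), p.mass, p.id)
```

Every nested pointer, variable-length member, or language-specific container (a Python dict, a Fortran allocatable) makes this mirroring harder, which is why flat numeric arrays are the practical ceiling for most cross-language interfaces.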
Moreover, these cross-language boundaries implicitly prevent deeper optimization. Which leads to rather different paths for code development, including orthogonal, not-quite-Python things (Triton, Numba, etc.).
Fortran is a great language, and as one of the comments pointed out, it's really not that hard to learn/use. The rumors of its demise are greatly exaggerated. And I note with some amusement that they've been circulating since I was in graduate school some 30-35 years ago. Yet people keep using it.