As I mentioned, it is mostly just packing a runtime with the binary.
But for most use cases, does it matter? It won't work well for microcontrollers, but for most other things a 60MB binary is not a big deal. It takes under 3 seconds to download on an average internet connection.
The main reason I wouldn't use it for system programming is lack of good Linux libraries.
It's an insignificant cost. Nobody cares about it at any scale except embedded development (or occasionally due to artificial restrictions created by app stores).
The money saved by shaving a few MB off your binary is always going to be something like 0.00001% of your costs, and even if you're at such a massive scale that it could justify developer investment, there's going to be lower-hanging fruit to worry about.
If you're at such a level of optimisation that it's correct to worry about deploying a 60MB binary, you must be doing exceptionally well. Most places (including Amazon and Google) are using 200MB+ docker images and serving poorly optimised image files.
And if it _really_ bothers you, you can always install the runtime separately for a sub-1MB binary, or use trimming for a sub-10MB binary.
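For reference, this is roughly what those options look like with the dotnet CLI. The flags below are real, but exact output sizes depend on your app, target, and SDK version, so treat the numbers above as ballpark:

    # framework-dependent publish: the app binary stays tiny, the .NET runtime is installed on the host
    dotnet publish -c Release --self-contained false

    # self-contained, single-file, trimmed: runtime bundled, but unused code is cut out
    dotnet publish -c Release -r linux-x64 --self-contained true -p:PublishSingleFile=true -p:PublishTrimmed=true

    # Native AOT: compiled ahead of time to a single native binary, no JIT
    dotnet publish -c Release -r linux-x64 -p:PublishAot=true

Trimming and AOT do assume your code and its dependencies are friendly to static analysis; heavy reflection can break under them.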
and it says a lot about the mentality of the developers/company
60mb means CDN servers now need at least 60mb of storage to cache your file
compare this to a developer who cares and produces the same thing with a native language in 6mb, that's 10x less space needed
it'll cost the CDN 10x less to store and serve, since they'll need 10x less storage and less bandwidth too
as a single dev you apparently don't care, but it ends up hurting everyone, global warming included
and in the end it hurts me: now my 128gb SSD isn't enough to store all the electron shit, and i need an upgrade just to install a new update
the cpu and memory also take longer to load and cache your whole program
it's the reason software sucks even though hardware keeps getting better: nobody cares anymore
and it all starts at the developer's computer; if you don't care, nothing will improve, and every bit matters whether you like it or not, that's how computers work
and idk why there is this culture in "IT" of not wanting to acknowledge how computers work, and how bloat affects performance, efficiency, cost and global warming
i came to the conclusion that clueless people managed to get into high positions, then secured those positions by hiring the same kind of people: people who don't give a damn about performance problems
they'll just waste (again, waste) more of the company's funds
after all, if nobody knows how computers work, they'll just agree and sign the big invoice without asking themselves: wait, aren't we paying too much to deliver this web page? does it really need to be 260mb?
Indeed they will. Gaming compiler benchmarks has been going on since the early 1980s.
Datalight C was the first C compiler for DOS to do data flow optimizations. Since the benchmarks of the time simply did some operations and ignored the result, the data flow optimizations would delete the benchmark code.
I was accused of gaming and cheating on the benchmarks. After they published, of course. Sigh. It wasn't long before other compilers did DFA (data flow analysis), and the benchmarks (of course) were changed.
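For anyone who hasn't run into this failure mode, here's a minimal illustrative C sketch (my own toy example, not the actual benchmark code of the era): the loop's result is never used, so a compiler doing data flow analysis is allowed to delete it, and the timed region measures essentially nothing.

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        clock_t start = clock();

        /* the "benchmark": a million additions whose result is never read */
        int sum = 0;
        for (int i = 0; i < 1000000; i++)
            sum += i;
        /* sum is dead here, so DFA can delete the loop (or fold it to a constant) */

        clock_t end = clock();
        printf("elapsed: %f s\n", (double)(end - start) / CLOCKS_PER_SEC);
        return 0;
    }

Later benchmarks avoided this by printing or otherwise consuming the result so the work couldn't be optimised away, which is the "benchmarks were changed" part.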
60mb+ for an HTTP server is not what i call a single binary without a runtime
at best it's a hack that packs a giant runtime together with the user executable
and their AOT story is far from perfect; it has to deal with the fact that C# is a managed language first
so forget systems programming needs, it's just not playing on the same field
c# is aimed at webdevs and game scripters with unity