I have seen colleagues have to do very similar things on embedded platforms. When there is heavy pressure on price, an MCU with a smaller amount of flash will be cheaper.
> Is this ever a real issue, even on any embedded system in the last 20 years?
Ask Cisco when they cut the Linksys routers' RAM in half a few years ago. Every byte counts. Component cost savings add up when you make a few million of them.
This is true, but the stdlib provided by the compilers aimed at these chips is usually very bare bones and size-optimized to begin with. So it's likely hard to save much space by reimplementing subsets of it.
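To make that concrete, here's a minimal sketch of what a hand-rolled libc subset can look like in a bare-metal build. The toolchain invocation is just an example, and the byte-at-a-time loops deliberately trade speed for size; a vendor libc like newlib-nano or picolibc is already making similar trade-offs, which is why there's often little left to save:

```c
/* tiny_libc.c -- illustrative sketch of a hand-rolled libc subset for a
 * freestanding build. Example invocation (illustrative, not exhaustive):
 *   arm-none-eabi-gcc -Os -ffreestanding -nostdlib -c tiny_libc.c
 * Note the compiler may emit calls to memset/memcpy even in freestanding
 * mode, which is exactly why you still have to provide them.
 */
#include <stddef.h>  /* size_t is available even freestanding */

void *memset(void *dst, int c, size_t n) {
    unsigned char *p = dst;
    while (n--) *p++ = (unsigned char)c;  /* small, not fast */
    return dst;
}

void *memcpy(void *dst, const void *src, size_t n) {
    unsigned char *d = dst;
    const unsigned char *s = src;
    while (n--) *d++ = *s++;
    return dst;
}

size_t strlen(const char *s) {
    const char *p = s;
    while (*p) p++;
    return (size_t)(p - s);
}
```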
Honestly, if I can, I try to cut down the size of compiled binaries, and even of minified and combined web app code. Sure, a lot of the time it's unnecessary, but if I can save on bandwidth and memory, it typically means I'm including less crap that can cause problems, too. I see it as having far more benefits than just getting a file size smaller.
But sometimes it can take so much extra work that it's not worth it. So you've got to do the cost-benefit analysis, or hell, if it's something you're interested in doing, just do it anyway.
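For what it's worth, a lot of the cheap wins come from build flags before you change any code. A sketch with the usual GCC/Clang flags (actual savings vary a lot by project):

```c
/* size_demo.c -- sketch of size wins from flags alone (no code changes).
 * Illustrative build:
 *   cc -Os -ffunction-sections -fdata-sections size_demo.c \
 *      -Wl,--gc-sections -o demo
 *   strip demo
 * -Os optimizes for size instead of speed; -ffunction-sections plus
 * --gc-sections lets the linker drop code nothing references; strip
 * removes the symbol table from the final binary.
 */
#include <stdio.h>

/* Not static, so it survives compilation; with the flags above the linker
 * discards it because nothing reachable from main() calls it. */
int unused_helper(int x) { return x * 42; }

int main(void) {
    puts("hello");
    return 0;
}
```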
Yes, executable size is still a modern-day problem. Imagine you're shipping an OS: do you want the hundreds of thousands of executables in your system to all be a few percent larger when you're shipping to a customer who may only have 16 or 32 GB of storage?
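Back-of-the-envelope, with made-up but plausible numbers: 200,000 binaries averaging 100 KB is about 20 GB; make everything just 3% bigger and you've burned roughly 600 MB for nothing, on a device where every gigabyte of storage is a line item.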
Of course, removing libc won't be your first (or second or third or ...) step for removing bloat from a mature codebase.
If you're trying to reduce bloat in this OS, you would be dynamically linking anyway, wouldn't you? I've installed those 300 MB printer drivers, so I'm all for reducing bloat. This just seems to be at the extreme end of things.
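To illustrate that trade-off (the commands and sizes below are illustrative, not measured): dynamic linking keeps one copy of libc on disk that every binary shares, while static linking duplicates it into each one.

```c
/* link_demo.c -- sketch of the static vs. dynamic libc trade-off.
 * Illustrative builds; sizes are ballpark, not measured:
 *   cc link_demo.c -o dyn            # dynamic: tens of KB, libc shared
 *   cc -static link_demo.c -o stat   # static: hundreds of KB, libc copied in
 * One static copy is fine; hundreds of thousands of them are not.
 */
#include <stdio.h>

int main(void) {
    printf("the size of this binary depends on how libc is linked\n");
    return 0;
}
```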
Is this ever a real issue, even on any embedded system in the last 20 years?