I've seen this trip people up before. In one case, a CI/CD system running Linux was used to build a project deployed to a mostly Windows environment. This didn't cause any issues until the day a developer added a binary module. Honestly, I'm surprised it worked as long as it did; it took a little over a year before anyone hit that issue.
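Something like this in CI would have caught it on day one. A rough sketch, not what that team actually ran, and `DEPLOY_TARGET` is just a made-up env var for illustration:

```js
// Minimal CI guard (sketch): refuse to package a node_modules tree when
// the build host's platform/arch doesn't match the deployment target.
// DEPLOY_TARGET is a hypothetical env var, e.g. "win32-x64".
const target = process.env.DEPLOY_TARGET || 'win32-x64';
const host = `${process.platform}-${process.arch}`;

if (host !== target) {
  console.error(
    `Building on ${host} but deploying to ${target}; ` +
    `any native (.node) addons in node_modules won't load there.`
  );
  process.exit(1);
}
```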
No, and that detail is dangerously missing from the advice. lol
Anyone who does this and has teams on Mac, Windows, and Linux will find out how crappy this is very quickly.
It’s even worse than that. If you support all LTS/current Node versions—as is, and should be, very common for libraries—it'll break even on a single platform. I’m sure it’s solvable, but the solution would be so complex it’s tantamount to building a new package manager.
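The single-platform breakage comes from the native-addon ABI: a `.node` file compiled under one Node major won't load under another, because each ships a different `NODE_MODULE_VERSION`. A quick sketch of a fail-fast guard (the `.abi-stamp` marker file is my own invention, not a real convention):

```js
// Hypothetical guard (sketch): fail fast when a committed node_modules
// was installed under a different Node ABI. Native addons are tied to
// NODE_MODULE_VERSION, so Node 18 / 20 / 22 each need their own builds.
const fs = require('fs');

const stampFile = 'node_modules/.abi-stamp'; // hypothetical marker file
const current = process.versions.modules;    // ABI version, e.g. "115"

if (fs.existsSync(stampFile)) {
  const installed = fs.readFileSync(stampFile, 'utf8').trim();
  if (installed !== current) {
    throw new Error(
      `node_modules built for NODE_MODULE_VERSION ${installed}, ` +
      `but this Node is ${current}; rebuild native addons`
    );
  }
} else {
  fs.writeFileSync(stampFile, current);
}
```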
> Would this still be an issue if the whole team uses docker to run the code?
It could be if it's a CPU architecture difference. For example, an M1 Mac (ARM64) vs. just about every other system (x86-64).
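You can see it from inside the container. Quick probe (sketch):

```js
// What platform/arch does the process actually see? On an M1 Mac host
// this prints "darwin-arm64"; inside a Linux container on that same
// machine it prints "linux-arm64", while most CI runners print
// "linux-x64". Same Dockerfile, different binaries.
console.log(`${process.platform}-${process.arch}`);
```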
I know we had to swap out MySQL for MariaDB locally because the official MySQL Docker image doesn't support ARM64 devices but MariaDB does. That's just another example of how, even if you're using Docker, there can be differences.
We've also had issues where developers aren't used to case sensitivity at the file system level: things work on their Mac but fail on Linux in CI. Docker's bind mounts (often used in dev) take on file system properties from the host OS, which means that even if your app runs in Linux within a container, it may behave differently on a macOS host vs. a Linux host.
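You can detect that situation at runtime. A rough probe (my own sketch, nothing standard) that checks whether the working directory is case-insensitive, the way a macOS bind mount usually is even inside a Linux container:

```js
// Write a file with one casing, then look it up with another. On a
// case-insensitive FS (default APFS/HFS+ on macOS) the lowercase name
// resolves to the same file; on ext4 in CI it does not exist.
const fs = require('fs');

fs.writeFileSync('CaseProbe.tmp', '');
const insensitive = fs.existsSync('caseprobe.tmp');
fs.unlinkSync('CaseProbe.tmp');

console.log(insensitive
  ? 'case-INsensitive filesystem (macOS-style)'
  : 'case-sensitive filesystem (Linux-style)');
```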
The moral of the story here is that Docker is good, but it isn't a 100% foolproof abstraction spanning Linux, Windows, and macOS on every combination of hardware.
Some modules straight up download binaries, so I don't see this being so straightforward.
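If you want to see how much of your tree is platform-specific, something like this works as a first pass (a sketch; it only finds compiled `.node` addons, so modules that fetch a standalone executable instead, the way esbuild does, will slip past it):

```js
// Rough scan (sketch): list compiled .node binaries under node_modules,
// i.e. the files that make a committed node_modules platform-specific.
const fs = require('fs');
const path = require('path');

function findNativeBinaries(dir, found = []) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) findNativeBinaries(full, found);
    else if (entry.name.endsWith('.node')) found.push(full);
  }
  return found;
}

console.log(findNativeBinaries('node_modules').join('\n'));
```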