I think it's not bashing of npm specifically so much as the node ecosystem it serves and depends on; at least in my mind it's difficult to separate node from npm. That said, for what it's trying to do (read a list of deps, resolve them against the registry, download and unpack), it seems to do a fine job.
My major complaint about npm is the choice to allow version range operators in dependency declarations. The node.js ecosystem places a high value on composability, so using lots of tiny modules which themselves depend on lots of tiny modules is the norm. This is a problem because range operators get used liberally at every level of that tree, so getting a reproducible build is like winning the lottery.
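To make that concrete (hypothetical package and version numbers): npm's default save prefix is a caret, so a plain install records a range rather than a pin, and two installs run at different times can resolve it to different trees.

    # --save records a caret range in package.json by default:
    npm install --save lodash
    #   "dependencies": { "lodash": "^4.17.0" }
    #
    # ^4.17.0 matches any 4.x.x >= 4.17.0, so hypothetically:
    npm install   # run today      -> lodash@4.17.0
    npm install   # run next month -> lodash@4.17.2, a different tree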
There are other things I don't like about using npm: node_modules/ is big and has a lot of duplication (even with npm v3), it's pretty slow, historically it has been unstable, it's still crap on Windows, etc. - but for someone who has 'ensures reproducible builds' in their job description, the way its modules get versioned is its worst feature.
For reproducible builds (or at least 'to get the same versions again') you should be using 'npm shrinkwrap'. (Of course there's probably more you should do to get truly reproducible builds, but that goes for any package manager.)
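To be clear about what that does (a sketch; details vary a bit by npm version):

    npm install      # resolve the ranges against the registry as usual
    npm shrinkwrap   # write the exact resolved tree to npm-shrinkwrap.json
    # Commit npm-shrinkwrap.json; later 'npm install' runs reuse the pinned
    # versions in it instead of re-resolving the ranges.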
The range operators are important; without them you'd never be able to resolve two packages that want similar versions of the same sub-dependency, e.g. jquery 1.12. Without range operators those two packages would have declared slightly different patch versions (1.12.1 and 1.12.3) depending on when they were published, so you'd always end up with duplicated dependencies.
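A sketch of the situation, with hypothetical manifests:

    # Exact pins diverge by publish date and can't be shared:
    #   package-a: "jquery": "1.12.1"
    #   package-b: "jquery": "1.12.3"    -> two copies of jquery installed
    #
    # Ranges make both declarations satisfiable by one version:
    #   package-a: "jquery": "^1.12.1"
    #   package-b: "jquery": "^1.12.3"   -> a single copy (>=1.12.3) serves both
    npm dedupe   # explicitly collapses such overlaps where the ranges allow it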
I'd argue 'node_modules is big' is not a fault of npm. If the package or app you're trying to install generates a large node_modules dir, that is something you should take up with the package maintainer. See buble vs babel - buble has a way smaller dep tree.
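If you want to check that claim yourself, something like this gives a rough count (numbers will vary with npm version and publish date; I'm assuming a POSIX shell):

    mkdir cmp && cd cmp && npm init -y
    npm install babel-core
    npm ls --parseable | wc -l   # roughly one line per installed package
    rm -rf node_modules
    npm install buble
    npm ls --parseable | wc -l   # compare against buble's tree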
npm is only slow in the ways all other package managers are: when installing large dependency trees or native dependencies (like libsass), and it is way faster than, say, pip and rubygems in this regard. When I 'pip install -r requirements.txt' at work, I literally go and make a coffee.
I've also never experienced any instability, though I may have been lucky. Certainly it has been very stable for the last year or so that I've been working with it a lot. Could you elaborate on why it is crap on Windows? I thought all the major issues (e.g. the deep nesting problem) were fixed by now ...
The main problem we ran into with shrinkwrap is that it shrinkwraps everything in your current node_modules directory. That includes platform-specific dependencies that may not work on other platforms, which will now cause npm install to fail instead of just printing a message about it.
So our current workflow has to be (sketched as a script after the list):
1. Update package.json
2. rm -rf node_modules/
3. npm install --production # This doesn't include any of those pesky platform specific packages
4. npm shrinkwrap
5. npm install # Get the dev dependencies
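Or, the same thing as a throwaway script (a sketch, assuming a POSIX shell):

    #!/bin/sh
    # Regenerate npm-shrinkwrap.json without the platform-specific dev packages.
    set -e
    # (edit package.json first)
    rm -rf node_modules/
    npm install --production   # prod deps only, keeping the pesky platform
                               # specific packages out of the shrinkwrap
    npm shrinkwrap             # pin exactly what is in node_modules right now
    npm install                # bring the dev dependencies back for local work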
As for the other comments about npm: I just generally have more problems with it than with rubygems/bundler and the general OS package managers.
Shrinkwrap is ridiculous. I'm expected to go look at every resolved dependency and decide individually whether to update it? No thanks; one app at my workplace defines ~50 top-level dependencies, but that balloons to almost 1300 after npm install - and this is with npm v3. Ain't nobody got time for that.
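(For anyone who wants to count their own tree - a rough way, assuming a POSIX shell:)

    npm ls --parseable | wc -l   # one line per installed package dir, nested or not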
Deep nesting is not 'solved'; it just doesn't happen every time anymore. If you have version conflicts, you still get deep trees. I suppose range operators help with this a little, but looking at what actually gets installed, it doesn't seem to help that much; I still end up with duplicated dependencies.
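A hypothetical conflict that no amount of flattening can remove:

    # app depends on a (wants lodash@^3.0.0) and b (wants lodash@^4.0.0);
    # only one lodash can be hoisted to the top level, so the other nests:
    #
    # node_modules/
    # |-- lodash/            <- 3.x, hoisted for a
    # |-- a/
    # `-- b/
    #     `-- node_modules/
    #         `-- lodash/    <- 4.x, nested under b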
I was mentally comparing npm to tools like maven, ivy and nuget, all of which are faster but also not interpreted. Not a fair comparison I guess.
> Shrinkwrap is ridiculous. I'm expected to go look at every resolved dependency and decide individually whether to update it?
Not sure you're aware of the suggested flow (see here [1]), but it isn't ridiculous. Use 'npm outdated' to see which packages are out of date and 'npm update --save' to update a dep (and update the shrinkwrap file).
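Roughly (with a hypothetical package name):

    npm outdated                # shows current vs. wanted vs. latest for each dep
    npm update some-dep --save  # bump one dep; --save rewrites package.json and,
                                # if npm-shrinkwrap.json exists, refreshes it too
    # or re-run 'npm shrinkwrap' afterwards to regenerate the pinned tree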
Keeping track of stale sub-dependencies is a problem in and of itself, but again, that exists with any package manager (you will always need to pin dependencies before you go to prod, right?), so that 'lockfile' will get out of date pretty fast. Node at least has solutions for this that other communities don't [2] (though I haven't tried that service).
(Aside from the lack of a package integrity check, which, I'll grant, sucks.)