Benchmarks of JavaScript Package Managers (pnpm.io)
27 points by gslin on Feb 28, 2023 | hide | past | favorite | 20 comments


For me the biggest advantage of using pnpm is that it doesn't fill up my hard disk.

From their main page -

"Files inside node_modules are cloned or hard linked from a single content-addressable storage'


Yarn also does this very well & pioneered reconsidering how we store stuff, I believe.

But they broke the universe when they switched it on by default in v2, in ways I'm not intimately familiar with. v3 may make things somewhat better[1]; they started looking at ways to be flexible and allow different installation strategies.

I do care about speed a good bit, as I've seen more big monorepos happen. At first I was a bit sad that Yarn v2+/PnP was beating pnpm "with cache, with lockfile" by so much, but pnpm's "with node_modules, with lockfile" is basically the same speed. The first is more the "get started with cache" case (and what Yarn v2+/PnP does every time from there on out), but the second is what most day-to-day dev really is: the speed converges totally in daily use. A zoom in on the graph might reveal some differences; it looks like ~2s, which is a small bar on this graph. Also, pnpm is about 2x as fast if you don't have a cache, so the actual first-install win may well go to pnpm again.

[1] https://dev.to/arcanis/yarn-3-0-performances-esbuild-better-...


npm used to be super slow but it's not the case anymore. I admire the efforts around optimising package managers, but when I see these benchmarks I don't feel like using another non standard package manager, to save a few seconds once in a while.


Some years back I got burned repeatedly by people chasing the "yarn is fast and is the future! NPM is dead!" trend. At one point it was a running joke that one of our projects hadn't really started until we hit a bug or missing feature in Yarn the easiest fix for which was "run 'npm install' and stop using Yarn". This is well after the entire Internet had decided Yarn was a perfect drop-in replacement and declared NPM obsolete.

I'd be more interested in how many packages break on each one—at install, or when you try to actually use the package—than speed. Maybe that's zero for all contenders now, but it wasn't for Yarn back then. Doesn't take many hours lost to fighting your package manager before any time-savings from execution speed have been erased.


Yarn definitely shot themselves in the foot badly. PnP identified real problems & came up with a solution, but pnpm does a similar set of tricks in a Node-ecosystem-compatible way, with next to no compatibility issues (versus package maintainers each having to individually support Yarn v2 PnP). Yarn v2 seemingly thought they could get the entire npm package world to switch to Yarn: they saw their growth, saw the thought-leaders, & decided their winning was a fait accompli.

And they didn't really execute very well... v2 landed, there was controversy, and there's been so little visible or exciting good news about it since. It over-played Yarn's hand so badly that they renamed Yarn v2 as Berry, just to re-gather the troops & make a staging point forward. But it's still an incredibly hard pill to swallow, the "yarn (berry) is great, the ecosystem needs to change" attitude seemingly isn't gaining any traction, and it's hard to tell where Yarn could go.

In Yarn v3[1], they've introduced a modular "linker" system for how to install packages, which might let them experiment and play around a little more & be less constrained by the hard path they'd crusaded for.

One thing I will say for Yarn, that makes me unbelievably happy versus npm (from the v2 announcement[2]):

> Yarn is first and foremost a Node API that can be used programmatically (via @yarnpkg/core)

Npm is the premier tool for open-source javascript, but it itself is one of the least open-source efforts on the planet. I finally started digging around the npm package and its underlying cacache cache-structure, and it's just infinitely unpleasant to get started with. There are maybe like 3 articles on the whole planet that have any guidance for what npm is inside, how it works, what you can do with it, how you can hack it. Yarn identifying that the package manager itself is something that developers need access to is a huge win & I want to thank them forever for putting that on their bullet list of great Yarn things.

Pnpm, by comparison, has a couple super simple things one can do with it as a library[3]: install, link, look at dependencies. But, like, looking at the cache & seeing what's there? Nope. Good luck. Like npm, it's a wonderful tool for open source but it's quite a poor showing as open source; it doesn't make itself programmatically accessible.

[1] https://dev.to/arcanis/yarn-3-0-performances-esbuild-better-...

[2] "Yarn's Future - v2 and beyond" https://github.com/yarnpkg/yarn/issues/6953

[3] https://github.com/pnpm/pnpm/blob/main/pkg-manager/core/src/...


Yeah agreed, I'd much rather use something that's standard and installed alongside every other Node.js install, rather than put my trust in another package manager for 10s of improvement on a completely clean install (which isn't something that I have to do very often).


Fwiw, you can alias npm=pnpm and only ever get its benefits. You can also "import" yarn/npm lockfiles into pnpm lockfiles.
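A sketch of that drop-in workflow (assumes pnpm is already on your PATH; the alias just redirects muscle memory):

```shell
# Point the familiar command at pnpm:
alias npm=pnpm
alias npm        # list the definition to confirm it took effect
# To migrate an existing repo, pnpm can convert the old lockfile:
#   pnpm import  # reads package-lock.json or yarn.lock, writes pnpm-lock.yaml
```

The alias only lives in the current shell; to make it stick, it would go in your shell rc file.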


I see that alias as a source of potential issues not worth the few seconds.


Maybe, but npm's historical design decisions around (lack of) repeatability over time, given that we don't only need to build the latest hotness on the trunk, make me avoid npm.

Yarn has always been more sane in this regard.


Work is finally switching our mono-repos to pnpm. In progress, code-freeze end of the week to pull it off. We all expect vastly improved quality of life.

Had been using lerna, which we have to re-bootstrap way too often & which takes way too long to do so.


I switched all my work monorepo projects to pnpm a few months ago, and it has been a dramatic improvement. Many thanks to the author(s) of pnpm for their hard work!


Isn't lerna basically just leaning on npm workspaces nowadays?


"Lerna is dead, longive Lerna"[1] mainly talks about Nx, the DAG-based parallel task runner that the new maintainers already made/love, and how it will maybe/probably help Lerna. The post explicitly mentions Nx is neutral & can work with a variety of monorepo tools (also mentioning pnpm, yarn, in additiom to npm).

Lerna is old as heck & far predates npm workspaces. I'm not super aware what, if any, npm workspaces support it has or had. To be honest, we were so far behind upgrading/maintaining stuff when I joined that we only recently got an npm version with workspaces (released in npm v7, October 2020), so I'm not super well versed.

The new lerna versions definitely configure package.json#workspaces, which seemingly all js-world monorepo tools use.
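For reference, the root manifest wiring is tiny. A minimal sketch of such a workspace root (names illustrative; note that pnpm itself reads a separate pnpm-workspace.yaml rather than this field):

```json
{
  "name": "monorepo-root",
  "private": true,
  "workspaces": [
    "packages/*"
  ]
}
```

With that in place, `npm install` (v7+) or `yarn install` links the packages under packages/* into the root node_modules automatically.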

I've heard npm is getting much better, but just going through our many packages, installing them, & doing the necessary mono-repo npm link has been absurdly slow with our old-ish pre-workspace npm; lerna has been so saddening. Nx can help here, and maybe npm indeed is much more competitive now. I still find it frustrating that enormously simple stuff like "how do I update a dep in a package" has no answer[2] (and requires re-running that painfully slow reinstall-everything step in every route I've tried), but maybe the monorepo tools can do this now & it's no longer as much a lerna question?

[1] https://blog.nrwl.io/lerna-is-dead-long-live-lerna-61259f97d...

[2] https://github.com/lerna/lerna/issues/2142


Just double checked. Unless you specifically turn it off (reverting to Lerna's old behavior), it does lean on npm/pnpm/yarn's workspaces. Npm is indeed getting better at this stuff too. I did some tests on our main monorepo (~8 services linked together); for us, npm was marginally slower than pnpm and pretty much equivalent to yarn.

It also does optionally lean on Nx for task execution. So at that point, lerna itself isn't doing much outside linking internal dependencies.


This is nice, and congrats to the team. However, I've extensively tested npm, pnpm, yarn and even bun for projects that involved building new images regularly. The alternatives all worked faster than npm, but I found minor quirks on specific libraries, or due to the constrained environment, that didn't make the switch worth it.

I wish the open source management/governance/incentives could allow for either better standardization or more upstream contributions. Meanwhile, I continue to recommend that my students go with the most standard tool.


I'd love to see bun in this benchmark

[edit] I ran this experiment just for fun:

  bun install  1.09s user 3.68s system 80% cpu 5.898 total
  pnpm install  8.75s user 21.18s system 195% cpu 15.346 total
  yarn install  11.90s user 14.19s system 51% cpu 50.929 total
  npm install --force  15.58s user 8.27s system 50% cpu 47.140 total

  bun -v
  0.5.7

  pnpm -v
  7.27.1

  yarn -v
  1.22.19

  npm -v
  8.15.0

I had to run npm with the --force flag, as it would fail out of the box.


Using pnpm in our current project. It's been a huge challenge to use this tool and distribute the application in Docker containers. The intended "saving" of disk space on a developer machine is a bad tradeoff compared to the extra effort of figuring out how to pack the many symbolic links pnpm creates.


What about ignoring pnpm and using npm on the docker deployment?


You really need a lockfile for reproducible builds. pnpm's lockfiles are not compatible with npm's.


pnpm works just fine in docker, though? All files are installed to the .pnpm folder inside of node_modules, and then all the packages are symlinked to that folder. In other words, as long as you copy node_modules there is no problem, which is what you would normally do anyway...
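If copying the symlink farm ever does become a problem (e.g. when you only want one workspace package in the image), one option is `pnpm deploy`, which copies a package plus its production deps into a self-contained directory of real files. A hedged sketch, assuming pnpm >= 7 via corepack and a workspace package named "app" (both names illustrative):

```dockerfile
# Build stage: install with the full symlinked layout.
FROM node:18-alpine AS build
RUN corepack enable
WORKDIR /repo
COPY . .
RUN pnpm install --frozen-lockfile
# "pnpm deploy" materializes the package + its prod deps as real files,
# so the runtime image never sees pnpm's symlinks.
RUN pnpm --filter app deploy --prod /out

# Runtime stage: plain files only.
FROM node:18-alpine
COPY --from=build /out /app
WORKDIR /app
CMD ["node", "index.js"]
```

`--frozen-lockfile` makes the build fail if pnpm-lock.yaml is out of date, which is what you want for reproducible images.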



