Hacker News

You're drastically overestimating the storage requirements. Time is far, far more valuable than storage savings. The point about CI builds running faster is enough to sell the idea all by itself.

Oddly, the article doesn't mention that it also lets you build when npmjs.org is down or unreachable, which happens often enough to be frustrating, and if it happens while you're dealing with an emergency, it's downright infuriating.



We cache the node_modules directory on CI and get the same benefits without needing to make our repo massive.
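The comment doesn't say which CI system is in use; as one hypothetical illustration, on GitHub Actions this kind of caching is typically done with the `actions/cache` step keyed on the lockfile (here caching npm's download cache rather than node_modules itself, so `npm ci` can still do a clean install):

```yaml
# Hypothetical GitHub Actions fragment: restore npm's download cache,
# keyed on the lockfile so it invalidates when dependencies change.
- uses: actions/cache@v4
  with:
    path: ~/.npm
    key: npm-${{ runner.os }}-${{ hashFiles('package-lock.json') }}
- run: npm ci --prefer-offline
```

On a cache hit, `npm ci` reads packages from the local cache instead of re-downloading them from the registry.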


Git is good at storing text diffs, but any binary file in your node_modules (images, native modules, etc.) is stored in your repo permanently, including old versions and deleted files. I've seen this have a meaningful impact on both disk use and the speed of Git itself.
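This is easy to demonstrate in a throwaway repo: commit a binary, delete it, and the blob still sits in the object store (the filenames here are made up for the demo):

```shell
# A binary committed and then deleted still occupies the repo's history.
git init -q demo && cd demo
git config user.email dev@example.com && git config user.name dev
head -c 1048576 /dev/urandom > native.node    # fake 1 MiB native addon
git add native.node && git commit -q -m "vendor binary"
git rm -q native.node && git commit -q -m "remove binary"
# The ~1 MiB blob survives in .git/objects despite the deletion.
git count-objects -vH
```

Because random data doesn't compress, the repository stays roughly 1 MiB even though the working tree is empty; every clone pays that cost forever unless history is rewritten.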


We run an npm registry server that proxies requests to npmjs.org and caches the responses.
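The comment doesn't name the server; one off-the-shelf option that works this way is Verdaccio, which proxies configured "uplinks" and caches the tarballs it fetches. A minimal config sketch, assuming Verdaccio's defaults:

```yaml
# Sketch of a Verdaccio config: proxy everything to npmjs.org and
# serve cached copies when the upstream is unavailable.
uplinks:
  npmjs:
    url: https://registry.npmjs.org/
packages:
  '**':
    access: $all
    proxy: npmjs
```

Clients then point at the proxy with `npm config set registry <proxy-url>`; once a package version has been fetched, installs keep working even if npmjs.org is down.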





