Speaking as part of the HN crowd, my general attitude here is that the maintainers of the consuming software are at fault.
If you have so many dependencies (especially in javascript, hah) that you can't keep track of what's happening with them, then either your scope is too big or you're just carelessly gluing things together because you can't be bothered to build sane things on your own.
I understand using things like jasmine and one or two other big things, but at that point those big things are critical components of what you've built and you need to be paying close attention to them for the duration that you're responsible for the maintenance of the code.
TL;DR: the fault is with the consumers for being careless, not the people who actually put work in.
But you can't hope to watch every dependency of every dependency of every dependency all the way down, at some point you need to trust that the code will do what it says.
Dependencies don't just end at a package.json. Their dependencies matter, as do theirs. The C++ code used in the javascript engine is a dependency, as is the Python code used during the build process, which itself relies on the Python interpreter, which is again more C code, which means you need a compiler, which also has a large suite of dependencies and relies on an OS to provide many things, which themselves rely on more compilers and eventually machine code that has to be executed correctly, and that depends on the microcode running in the CPU being created correctly and without any vulnerabilities. Of course that microcode is generated using more higher-level code, and the cycle continues.
You literally can't check everything, and I'd argue that you can't even check more than 2 steps out from your own code realistically.
That's not to say you shouldn't try, but you can't just pretend that it's the fault of the consuming dev that they got exploited just like you can't pretend that it's the fault of the original creator of the code who doesn't want to maintain it any more.
These things are going to happen, and I think we would all be better off if we just accepted that, stopped trying to place blame on a specific person or group, and instead continue working on making it harder to accomplish those bad things. Otherwise we get into a situation where "the only winning move is not to play".
No you absolutely do need to trust the dependency code unless you're working on a toy application. If you're building something real where reliability and security actually matter then you have to either verify the dependencies yourself, or actually pay vendors to do it for you and get written contracts for support. Unfortunately many developers have fooled themselves into believing they can get something for nothing, and then act surprised when the inevitable consequences hit.
So how many layers of verification do you or the companies you work at pay for? How many layers of dependencies do you pay someone to verify? Do they pay others themselves to verify the layers below them?
What do you do when someone has a dependency on something like OpenSSL, GCC, V8, Chromium, or even Linux? It's easy to say "oh well just pay someone", but who would I pay, and what would I pay them for?
We use npm to install 2 dependencies on a project I'm currently on. I can vet those 2 dependencies, and luckily in this case they're simple and each only depends on another 4 or 5 themselves, so in total I have 20 deps. I can review that, and it all looks good (for the sake of discussion let's assume I can actually carefully review and understand every line, when we both know that's not possible in the real world).
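To make the "2 direct deps become 20" arithmetic concrete, here's a minimal sketch of walking a dependency graph to count the full transitive closure. The package names and the graph itself are made up for illustration; in practice you'd build the graph from `npm ls --all --json` or a package-lock.json rather than hard-coding it.

```python
# Hypothetical dependency graph: package -> its direct dependencies.
# All names below are invented; real data would come from the lockfile.
DEPS = {
    "app": ["left-util", "tiny-http"],            # our 2 direct dependencies
    "left-util": ["a", "b", "c", "d"],
    "tiny-http": ["e", "f", "g", "h", "i"],
    "a": ["j", "k", "q"], "b": [], "c": ["l"], "d": [],
    "e": ["m", "n", "r"], "f": [], "g": ["o"], "h": [], "i": ["p"],
    "j": [], "k": [], "l": [], "m": [], "n": [],
    "o": [], "p": [], "q": [], "r": [],
}

def transitive_deps(root: str, graph: dict[str, list[str]]) -> set[str]:
    """Walk the dependency graph breadth-first, excluding the root itself."""
    seen: set[str] = set()
    queue = list(graph.get(root, []))
    while queue:
        pkg = queue.pop()
        if pkg not in seen:
            seen.add(pkg)
            queue.extend(graph.get(pkg, []))
    return seen

print(len(transitive_deps("app", DEPS)))  # → 20
```

Even in this toy graph, 2 direct dependencies fan out to 20 packages to review, and each added level multiplies the count again.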
Now what? Well, I'd also need to validate that the tool `npm` itself isn't compromised, and that's already a massive amount of work; then throw Node on top, and then V8 on top of that. So let's throw away all of javascript, because I don't have a few hundred million dollars lying around to spend on auditing the entire javascript ecosystem.
So what platforms do you use? C/C++ are out of the question: GCC is massive and I don't think I have the money to pay for audits of that either, let alone libc and OS kernels. So does that rule out everything except small embedded systems and custom-designed chips? Maybe there are some assemblers out there that can compile simple code and are small enough to be audited by a smaller company. You'd still have supply chain attacks to protect against and verify, but hey, those are easy to fix, right?
Or do those dependencies not count for some reason?
And that's even assuming an audit of a codebase will catch much of anything, which we both know it won't in all cases. So then what do you do when one slips through the cracks? Find someone to blame, like that'll help anything? Tell the company that got owned that if they'd just added a few more zeros to the end of the check they wrote, someone would have caught this?
This "just verify it" stuff gets so tiring, it's literally not possible.
Maybe I'm really living in a bubble of the wild west of computer programming, but somehow I just can't see this magical world just out of reach where everyone else verifies and vets every single dependency and that chain continues unbroken all the way down to the firmware and even the hardware and nobody ever makes mistakes and nobody ever pushes changes out at any level without hundreds of people reviewing and improving and fixing and re-verifying every single change.
The idea that anything even close to resembling this happens in the real world is so laughable to me, it's just the same old "only REAL programmers do x" gatekeeping that is so prevalent in this industry. And this is exactly the kind of thing that makes me not want to contribute back any changes. Because god forbid I don't spend wheelbarrows of cash on vetting every single part of my stack, and that my code isn't absolutely perfect in every way, or I'll be dragged across the coals and made into some kind of demon for having done something so stupid that a "real programmer" would obviously never do.
This is how you end up with stuff like QNX. It's really not nice to program for, but you have a whole company whose job it is to produce a set of more or less verified dependencies.
Also: there are compilers other than gcc that are much smaller.
It's not complicated. You can pay Red Hat or one of their competitors for Linux verification and support (including security patches for included libraries), and in turn they'll pay whichever other vendors are necessary to deliver on that contract. Get it in writing. For accountability you need a "single throat to choke".
If you're using npm for something important then just realize that you're taking a huge risk and don't be surprised when it blows up in your face. Most of the discussions on HN are about toy applications which don't really count for anything. I would be surprised if the Visa payment processing system uses npm.