
The JS community has no sense of quality. The community doesn't value things that are well abstracted or work well. I dread every moment I have to work in JS because everything is so badly done.


A lot of people blame the JavaScript language itself, but the longer I'm around the world of web development, the more I think that the quality of JavaScript applications is dictated by the economics of those applications.

Off the top of my head, the best software I use seems to fall into two categories:

- Closed source software that requires buying a license to use

- Open source software that is specifically made for developers and promises to do one job well

Whatever falls in the middle of those two categories tends to suffer, in my experience.

If you think about it, web-based software tends not to fit neatly into either category. Most of it is one of the following:

- Closed source, but either too cheap or free

- Open source, but promises to do way too many things, and is also too cheap or free (this describes a lot of frameworks and design tools)

Web technology and JavaScript became the dumpster slut of software ecosystems. End users are not given a big enough reason to pay for them adequately, or at all; product owners care little about quality and reliability because it's way too easy to get a zillion low-quality users to look at ads; and the barrier to entry for new JavaScript programmers is so low that the field is full of people who never think philosophically about how code should be written.


> Web technology and JavaScript became the dumpster slut of software ecosystems.

I think an additional problem with the JavaScript ecosystem specifically is that external resources are extremely easy to access, and their cost is usually borne by the end user's resources. They're therefore too tempting for many developers to resist. Unfortunately, the runtime environment of the end user rarely matches that of the developers, and seemingly "cheap" resource access at development/test time isn't cheap for the end user.

JavaScript is happy to pull in a library hosted on some third-party service at runtime. For the developer/tester, this ends up cached by the browser and/or at the edge of the CDN; a developer may also have topologically close CDN endpoints. This inspires them to pull in third-party libraries willy-nilly, because to them the libraries are cheap to access and save time writing some utility function.

The same goes for CSS, APIs, and media resources. With JavaScript, delivery is a client issue, and costs can be pushed entirely onto the client. If pulling in external resources cost a developer non-trivial money to store and serve, they'd put more effort into tree shaking and other delivery optimizations, or omit the resources entirely.

I think this massively contributes to the LEGO-piece construction of a lot of web apps. It also contributes to performance-robbing things like tag managers that inject a bunch of incongruent plug-ins at runtime from an unbounded number of sources.


I agree with the first statement, and I want to point out that PHP had the exact same issues with its ecosystem. It's getting better, but not by much.


Wtf is a dumpster slut? Are the product owners in this metaphor pimps?


Ehh... I'm not sure I want to go into graphic detail here, but my understanding is that the term denotes cheap, low-grade jetsam. I think it was a more common term on the internet back when I was in high school, around 2005.


I agree that many frontend libraries are pretty intimidating to step into if you don't have a background in them.

I don't agree that the JS community is bad: it is the largest community of any language by far, and it has the most money invested in it by a huge margin. There is a lot of trash, but there is also some seriously good stuff, and you can find 10+ packages attempting pretty much anything you can think of.


When it's "the largest community of any language by far" - which is true enough - having "some good stuff" is a very low bar. The dev culture around JS and Node is notorious for cranking out poorly written libraries.

And yes, you can find 10+ packages to do pretty much anything... of which 8 are abandoned and no longer work on up-to-date Node versions or depend on other packages with known vulnerabilities.


Do all the high quality folks just leave the JS community beyond a certain threshold?


JavaScript has the unfortunate situation of having years upon years of terrible standard library design, leading people to build lots of small libraries on top of the language just to get things that were basic functionality in other languages.

Then people started stacking more and more things on top of those libraries, creating a giant morass of microdependencies that larger frameworks are now built on top of. And because all these frameworks do things just differently enough from each other, every larger library that a dev would want to integrate with those frameworks now needs a specialized version per framework.

In most languages, if you want to know how something works, you can usually dig into your dependency tree and by the time you hit the stdlib's optimization spaghetti/language internals, you'll probably have figured out what's causing your snag/what the library is expecting out of you. In JavaScript, you hit the dependency morass and have to give up. Most competent devs then decide to pick another language.

You can write very legible JavaScript these days, even without a framework, but it looks nothing like JavaScript used in a framework.

The other language I know of with this issue is, ironically, Rust.


Please explain the rust angle?


Similar "lots of microdependencies" issue, born in Rust's case from the desire to keep a conservatively sized standard library. It's a smaller problem in the sense that Rust has stronger API contracts, as opposed to the absolute disaster that Node.js is, but in terms of code comprehension you hit a similar dependency morass.

The one thing salvaging Rust for now is a lack of similar frameworks, but who knows how long that will last.


The implication is that Rust's conservatism regarding what is blessed to go into the standard library sets up a similar dynamic.


Or start making their own stuff and ignoring most of the community.


After a couple of years of doing this, you've built up a backlog of your own, bespoke library code that makes you into a wizard. People are amazed at what you can do and perplexed with how little time it takes you to do it.

Nobody else can understand how it's built, but for some reason that's not their problem? It's not like they're taking the time to understand how React is built, either. But as soon as you do something on your own, whoooa buddy. Cowboy programmer alert. It's not good engineering if it's a single, coherent, vertically integrated system. It's only good engineering if it's a mishmash of half-solutions teetering on top of each other.


You are about four years behind the curve; everyone now uses JS frameworks that bundle together most of the libraries you will need for general dev.

I don't understand why people get so up in arms over npm modules, as if you could stand up code that does the same things in another language without having to manage dependencies.


Because most of the stuff in NPM sucks. I'm not going to keep going back to a store that has sold me nothing but shit so far just on your promise that somewhere, buried deep in the back, is a not-turd.


Rubygems does an excellent job of this.


In another language, much of what you go to npm for would be in the standard library.


So you never work in teams with other people again?


This feels like a knee-jerk false dichotomy. But in a sense, it's kind of right. I don't work in teams anymore. I manage them.

I still do a lot of programming. And I expect my developers to be competent enough to read other people's code and figure out how it works, what it does, how to use it, based on the tests and plenty of extant examples.

I don't want developers who can only be productive with libraries that everyone else is using, posting YouTube tutorials about, and feeding into LLM training corpora.

The problem with adopting other people's software is that you have to make it work for your purposes, all while accepting it was only ever originally designed for their purposes. And if that's open source and you contribute to it, then you have to make sure all your changes don't break other people's work.

But with my own libraries, I can break anything I want. I have, like, 5 projects using them. It's not a big deal to discover an architectural problem and completely refactor it away to a newer, better design, propagating the change to all the others that use it in fairly short order.

And I don't have to argue with anyone about it. I can just do it and get the work done, and prove it was the right thing to do, long before any GitHub PR would ever get out of review.


Exactly.


It's more that the low-quality people are way more numerous than otherwise.

For example, in the 2000s and 2010s, javascript had the lowest barrier to entry by far thanks to browser dev tools, so a lot of what you saw was people with no prior experience teaching people with no prior experience. These people are still around, and while plenty have improved, they still form the basis of modern javascript practice.


This is exactly it. There is probably some bad embedded C floating around, but the barrier to entry is higher, and thus that world seems to be a lot more rigorous than the JS flavor of the day.


Embedded C usually isn't much better quality than JS, it's just less public. There's very little overlap with the relatively high quality OSS C available.


Or the person you're replying to is overly generalizing a very large diverse community after extrapolating a few bad experiences.


No, but the variance is very high.



