The system’s been hijacked. The craft of real engineering—building sharp, efficient, elegant systems—is buried under layers of corporate sludge. It’s not about progress anymore; it’s about control. Turning users into cattle, draining every last byte, every last cent. Yeah, it sounds dramatic, but look around—we’ve already lost so much.
I’m running 24 threads at 5GHz, sipping data through a GB pipe, and somehow, sites still crawl. Apps hesitate like they need divine intervention just to start. Five billion instructions just to boot up? My 5GB/s NVMe too sluggish for a few megabytes of binary? What the hell happened?
The internet isn’t just bloated—it’s an executioner. It’s strangling new hardware, and the old hardware? That’s been dead for years. Sure, you can run a current through a corpse and watch it twitch, but that doesn’t mean it’s alive.
> The craft of real engineering—building sharp, efficient, elegant systems—is buried under layers of corporate sludge
No. It is buried under the laziness of build-fast, optimize-later. Except the optimizing never comes. Building fast requires lightweight development on heavyweight architectures. And that means reaching for bloated stacks like JS frameworks.
If it takes a programmer an hour to optimize something that saves a second each time it is run, management thinks that is a complete waste since you have to run it 3600 times to 'break even'.
You might think their calculus would change when you point out that the code is run millions of times each day on computers all over the world, so those saved seconds really add up.
But all those millions of saved seconds do not affect the company's bottom line. Other people reap the benefits of the saved time (and power usage, and heat generated, and ...) but not the company that wrote the code. So it is still a complete waste in their minds.
Multiply this thinking across millions of programs and you get today's situation of slow, bloated code everywhere.
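The break-even framing above, and why it misleads, can be sketched with the comment's own hypothetical numbers (one hour of programmer time, one second saved per run, a million runs per day; all assumed for illustration, not measured):

```javascript
// Hypothetical numbers from the comment above.
const optimizationCostSeconds = 3600; // one hour of programmer time
const savedPerRunSeconds = 1;         // one second saved per run

// Break-even as management sees it: runs needed to "repay" the hour.
const breakEvenRuns = optimizationCostSeconds / savedPerRunSeconds; // 3600

// But the code runs on a million machines every day, forever.
const runsPerDay = 1_000_000;
const savedPerDaySeconds = runsPerDay * savedPerRunSeconds;
const savedPerDayHours = savedPerDaySeconds / 3600;

console.log(breakEvenRuns);                    // 3600 runs to break even
console.log(Math.round(savedPerDayHours));     // ~278 hours of compute saved, per day
```

The catch, as the comment notes, is that those ~278 hours a day accrue to users and their power bills, not to the company's bottom line.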
As improvements to manufacturing tech and CPU designs become unable to deliver the improvements that they used to, the cost of computer time will approach the cost of programmer time. As they converge and possibly flip, optimizations will become more useful (and required) to produce the gains we've become accustomed to. I'm not sure how many years away that is.
I agree. Hardware improvement will hit a wall at some point. From then on, all performance improvement will have to come from software optimization. I do wonder if AI will allow that to happen much quicker.
Could an AI just crawl through an organization's entire code and make optimization recommendations, which are then reviewed by a human? It seems like there could be a lot of low hanging fruit.
No, it is the result of using users' computers to run your ad program. No one gives a shit about JavaScript, as long as it runs on someone else's computer.
Except back when we didn't program like this, it didn't take that much longer. It's the result of shitty technology stacks, like the archetypical Electron. We used to make things right and small at the same time.
Even Electron probably could have been fine if the browser were just a document layout engine and not a miniature operating system. There was an article going around a few years ago about Chrome - and by extension, Electron - including a driver for some silly device I don't remember, like an Xbox controller or something. Googling tells me it wasn't an Xbox controller, though. Every Electron app includes an entire operating system, including the parts not needed by that app, and including the parts already present in the operating system you already have.
Language runtimes don't have to be this way, but we choose to use the ones that are!
> We used to make things right and small at the same time.
Memory is infinite, CPU is infinite, disk space - we don't give a shit because they are all on the sucker's computer.
Just as "your privacy" (aka your data) is very important to us, your computing power is also very important to us.
I wish I were being sarcastic.
100%. I had my first obviously AI-written email the other day, and that was one of the clear tells.
I was trying to figure out what made it so obvious. The dashes were one thing; the other tells I noticed were:
- Bits of the text were in bold.
- The tone was horrible, very cringe. Full of superlatives, adjectives and cliches.
- I live somewhere where English is the third language, most people don't write emails in English without a few spelling or grammar mistakes.
- Nor do they write in paragraphs.
- It's also pretty unusual to get a quick reply.
Lots of these things are positive, I guess. I'm glad folks are finding it easier to communicate quickly and relatively effectively across languages, but it did kinda make me never want to do business with them again.
On Linux (maybe only certain distros, not sure) the keys are different, but you can enable a Compose key and enable special character keybinds as well.
For example on Mint en–dash is "<compose> <minus> <minus> <period>" and em—dash is "<compose> <minus> <minus> <minus>"
I do a similar thing on Linux with a feature called the "Compose Key". I press the compose key (caps-lock on my keyboard), and then the next couple of keypresses translate into the proper character. "a -> ä, ~n -> ñ, etc.
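For the curious, those Compose sequences are also user-definable. A minimal sketch of a `~/.XCompose` file, using the standard X11 Compose file syntax (the caps-lock remap itself is done separately, e.g. with `setxkbmap -option compose:caps`; the exact sequences shown are examples, not system defaults):

```
# ~/.XCompose — pull in the system defaults, then add overrides.
include "%L"

<Multi_key> <minus> <minus> <period> : "–"  U2013  # en dash
<Multi_key> <minus> <minus> <minus>  : "—"  U2014  # em dash
<Multi_key> <quotedbl> <a>           : "ä"  adiaeresis
```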
Are you actually measuring the load-time bottlenecks in devTools?
I don't know the exact details but it appears a lot of sites are sitting around waiting on ad-tech to get delivered before they finish loading content.
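One low-effort way to check is the Resource Timing API, which any modern browser exposes in the DevTools console. A sketch (the helper name and the URLs are mine; in a real console you would pass `performance.getEntriesByType("resource")`):

```javascript
// Rank resource-timing entries ({ name, duration }) by load time.
// In a browser console: slowestResources(performance.getEntriesByType("resource"))
function slowestResources(entries, n = 10) {
  return entries
    .map(e => ({ name: e.name, ms: Math.round(e.duration) }))
    .sort((a, b) => b.ms - a.ms)
    .slice(0, n);
}

// Synthetic example: an ad script dwarfing the actual content.
const demo = slowestResources([
  { name: "https://ads.example.com/tracker.js", duration: 2400.2 },
  { name: "https://example.com/article.html", duration: 180.7 },
  { name: "https://example.com/style.css", duration: 45.1 },
]);
console.log(demo[0].name); // the ad script comes out on top
```

On many real pages the top entries are exactly the ad-tech and analytics scripts the comment describes.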
great, now I have to un-learn using 'proper' punctuation.
It's AltGr+[-] (–) or AltGr+Shift+[-] (—) on my keyboard layout (linux, de-nodeadkeys) btw.
AltGr+[,] is ·, AltGr+Shift+[,] is ×, AltGr+[.] is …, AltGr+[1] is ¹
[Compose][~][~] is ≈, [Compose][_][2] is ₂ (great for writing H₂O), [Compose][<][_] is ≤, etc.
I use all of these, and more, guess I'm an AI now :(
To be fair, I intentionally use — incorrectly by putting spaces around it, just because I hate how it looks without the spaces ("correct" English grammar says there should be no spaces).
It provides an interesting test case for the usefulness (or lack thereof) of AI detectors:
ZeroGPT: Your Text is Human written (0% AI GPT)
QuillBot: 0% of text is likely AI
GPTZero: We are highly confident this text was ai generated. 100% Probability AI generated
Grammarly: 0% of this text appears to be AI-generated
None of them gave the actual answer (human written and AI edited), even though QuillBot has a separate category for this (Human-written & AI-refined: 0%).
this puts you on a desktop with a full-size keyboard or a laptop with a numpad then, which is a very small minority these days with a definite dev-centric skew.
Indeed I am, but the point here is that some users are actually typing em-dashes outside word processors or publishing/typesetting tools (e.g., on HN)—so it's not necessarily a sign of a message written by an AI (m-dash pun intended). The poster could as well be a developer with a full-size keyboard.
On many Android keyboards you can press and hold various keys to get access to many of the "extra" punctuation characters and fancy "foreign" letters. I imagine the same is true on iPhones as well.
quite surprised this comment got so much debate after i immediately agreed i used chatgpt -or did i?
(u see i dont know how to punctuate , i am not so punctual!)
> I’m running 24 threads at 5GHz, sipping data through a GB pipe, and somehow, sites still crawl.
Aside from websites (we will talk about those in a minute), how is performance? I am running Windows 11 on new hardware and it is running great. I built a personal box with 64 GB of the fastest DDR5 and an AMD 9900xtx. The most expensive component (and the least bang for the buck)... the video card. This is my first time having an NVMe disk and it's absolutely amazing.
I am running Debian 12 on a mini computer with much less hardware and it's doing amazing there too. I can run anything on that box except AAA games and 4K video.
Now, for the web talk. I was a JavaScript developer for 15 years, and yes, it's garbage. Most of the people doing that work have absolutely no idea what they are doing, and there is no incentive to learn. The only practical goal is to put text on a screen, and most of the developers doing that work struggle to do even that. It's why I won't do that work any more: it's a complete race to the bottom. If I see a job post that mentions React, Vue, or Angular I stop reading and move on.
Where did you move to from JS? I am trying my best to learn low-level stuff to NOT end up in these situations. But the frameworks are already bloated, and any optimisation feels useless.
I have just one final remark: it really is not an engineering problem but rather a business decision.
After all, having paid gazillions to engineers and project managers to build the sludgefest, all that cash needs to be harvested back into the pockets.
Not untrue, though many good projects existed because of a passion for good engineering. There is much less good open source now; people want money for their time...
People need to realize that leisure time - time off work, commute, chores - is paid for by their employer (as far as they're concerned). Which is to say that those of us with only a couple of hours to ourselves a day are being stiffed, no matter how much money we make. Stop letting the workaholics dictate how the rest of us live.
People tell me all the time that I should just open source my project that I have spent thousands of hours developing. As if doing so would make money magically appear in my bank account.
It's accepted and known, but in an economy where most megacorps make their money via enshittification, the well-paid engineers who get paid to shovel the aforementioned shit down our throats don't like being reminded of their essential role.
Going from 1 GHz to 5 GHz should make single threads go a little faster?
IO might be a bottleneck on spinning rust, but we've come far from those days too...