Instead of asking software vendors to make better apps, we ask hardware vendors to put in more RAM.
Instead of asking an industry that can distribute a change in months, we ask an industry where changing the entire installed base takes years.
Instead of asking for higher efficiency and better use of the resources we already have, we accept bloat as a natural fact of life and accommodate it.
Instead of maximizing the use of all the resources we have already mined and the energy we have already spent (the devices that are functional but merely slow under bloated software), we encourage active obsolescence and further pollution of our environment, exploitation of resources in dubious, more-than-grey working conditions, all in the name of "it is possible to do more, so we should do more".
Software development can focus either on new features made possible by complex, inefficient abstractions, or on efficiency on limited hardware.
Focusing on efficiency will always come at the cost of reduced features.
As long as hardware is not the limiting factor, developers are going to focus on the new features that give them an edge; when hardware growth slows, performance will become a factor for competition.
Customers today are paying for features; when they pay for performance, the industry will shift too.
Electron has enabled a whole host of apps to become possible despite its obvious performance limitations. I would rather have those poorly performing apps than just a few performant native apps.
The issue with this approach is whether those "features" are real, actual features improving people's lives, or just BS.
Slack is a usual target of derision around these parts, for example. I don't use Slack, but my client has me use Teams which seems just as bad, maybe worse.
I really wonder what those great features of Teams are that make a chat application lag on an 8th gen desktop i5. And I mean lag for basic UI things, like moving the mouse around the list of contacts, when nothing else is happening on the computer.
I remember when I was younger, at the beginning of the 2000s, I used to chat with friends on whatever IM clients we had at the time. On computers ridiculously slow by today's standards. I used to have a 1.4 GHz Athlon until the late 2000s. Single core. 256 MB RAM. The chat client would never ever lag. I guess we didn't have emojis at the time, but smileys were good enough. I'm not really sure that wasting all these resources on that is such a good investment.
There is a noticeable delay between my key press and the character showing up in Teams. Sometimes the characters don't show up in the right order (how's that even possible?! and yes, it only happens in Teams). That's just text. No 4K screen sharing of an AAA game. No live UHD camera with 32 people multiplexed in ultra quality on an 8K screen or whatever. Text. We used to have this down. What happened? What features do we have now that justify this terrible regression in UX?
You're looking at Teams as a chat app, but Teams is more like a platform that happens to offer a chat feature. Yes, it does chat, but it also does audio/video conferencing, and document sharing, and contact management, and meeting scheduling, and a bunch of other features. And those chats aren't your grandpappy's text-based IRC; they are rich text, with complete archival and full-text search, and a Graph API to access all of that. On top of that you have the first-party add-ons, like Microsoft Viva, which turns it into a blend of a wiki, a social media platform, and an e-learning platform. And then you have third-party add-ons, literally hundreds upon hundreds of them, which integrate all kinds of SaaS products right into the Teams UI. Frankly it's a miracle Teams and Slack work as well as they do, given how much they're asked to do.
The reason Teams and Slack are so much slower than the chat products of old is that they are not the same product category at all. It's like comparing Microsoft Word to Notepad++. Yes, Word is clearly an inferior way to quickly edit a plain text file, but obviously anyone complaining about that is missing the point of Word.
A fair question to ask is whether we should have platforms like Slack and Teams at all. Why are we rebuilding a tiling OS inside of a webpage, then wrapping that webpage in an Electron container, and then complaining that it gobbles up RAM like crazy?
> A fair question to ask is whether we should have platforms like Slack and Teams at all. Why are we rebuilding a tiling OS inside of a webpage, then wrapping that webpage in an Electron container, and then complaining that it gobbles up RAM like crazy?
I agree with you. I think this is exactly the issue.
I get that Teams and the chat apps of old are not the same category, but I think it's a fair question to ask whether this new category is relevant. MS Outlook already had a calendar which, somehow, loaded much quicker than the one in Teams. Even the calendar in Outlook web loads so much quicker than Teams it's not even funny.
I also understand that there are many more functions in Teams (which people may or may not use), but those are all supported by the backend servers at Microsoft; they don't run locally. Teams is basically a web browser. So I'm not sure that this functionality is an excuse for what are local UI issues.
When I type in the message box, I input text. Maybe some rich text or something, but still, nothing that Word 97 couldn't have handled while being blazing fast on a 486 (aside maybe from emojis, but they can't possibly be the reason why everything is so slow). The same can be said about the lagging animations in the contact list. These all happen locally, with nothing to ask of a complex web service and no complex function to implement.
My normal desktop has a 240Hz monitor and going between that and 60Hz is an immediately noticeable downgrade. Why did we accept 60Hz as good enough for so long? Even CRTs commonly ran at faster refresh rates. IIRC 1600x1200@90 was what I ran back in the Pentium 3 days on a 21” Sony CRT. We still don't have usable 4:4:4 4K@120Hz available on something like a 32-42” screen.
Anyway, I agree that web tech is easy to develop with (not that I thought Visual Basic was terribly difficult comparatively), but there are too many layers of abstraction these days. Can we not make sufficiently advanced compilers to make these really efficient? If not, do we need to go back to the drawing board with libuv and event loops? Maybe interrupts and manual memory management really were better. Java for GUI apps certainly hasn’t done much to show that it’s a better solution.
It’s funny but sad how these days, apt upgrade runs so fast, and compiling kernel modules happens extremely quickly, but it takes 10x as long as it did 20 years ago to schedule a calendar appointment with someone because GCal in a browser is so slow. Slack compared to curses-based IRC clients…woof.
Well, yes, there's the display latency issue, but I think that given the state of some programs, the display lag is really way, way down the list of problems.
I have a 60 Hz display, and an IPS one at that, which doesn't even try to pretend it's fast. Still, my terminal window doesn't lag behind when I type. Teams does, every time.
> Java for GUI apps certainly hasn’t done much to show that it’s a better solution.
Meh, I use a JetBrains IDE almost every day, sometimes even with Rust and pretty much everything turned on. Even on my almost eight-year-old MBP, input never, ever lags like Teams does on a 6-core i7 with a "gaming" GPU. It's not as smooth as a Linux terminal, but it's never irritating.
> It’s funny but sad how these days, apt upgrade runs so fast, and compiling kernel modules happens extremely quickly, but it takes 10x as long as it did 20 years ago to schedule a calendar appointment with someone because GCal in a browser is so slow. Slack compared to curses-based IRC clients…woof.
Exactly. And what drives me up a wall is seeing people being resigned. Some act like they don't even notice this. I guess we may moan until we're blue in the face on HN &c, but if "regular people" don't care, no one's going to do anything about this.
IDE customers care about that latency a lot more than Slack/Teams users do.
A lot of non-technical folks look at the keyboard while typing; they won't even notice the lag if they're not looking at the screen.
If there is enough customer demand/push, latency is not that difficult to solve even on Electron; the browsers it's based on don't have this latency problem, after all.
Market economics dictate features; well-built apps are not necessarily what sells.
Just because it's a feature you don't use does not mean it's a feature others don't use. I see this all the time in programming IDEs, where I don't know about or don't like the function of a feature, but the person next to me swears by it and uses it daily.
> Focusing on efficiency will always come at cost of reduced features.
This is probably sort of true, but it's way worse than it has to be. I feel like with just a little more effort, a little bit of a higher standard, we could get massive improvements.
Electron is great for the problem that it solved. But what if they'd started from day 1 thinking about performance? Maybe they would have taken a year longer, or whatever, but I'd imagine it could be 10x-100x faster/lighter. Perhaps not, but there's plenty of software where that is the case.
The problem then is that someone would release a garbage slow version sooner, and we'd all be using that - and imo this is the more serious issue. When something like electron comes out we need to push back wayyyy harder, the industry needs actual standards.
Things like a maximum viable latency for UX - it is insane to me that software deployed on extremely powerful systems can spend tens of milliseconds, or even longer, on something like typing a character! But we have no quality standards in this industry, so I guess it's hard for anyone to say "this is below our standard".
I like your comment, but I think Electron is only part of the problem. IMHO, the bigger part is that all these apps are built using JavaScript and all the SPA-style frontend frameworks. Heavy JavaScript-based UIs aren't slow because of Chromium or V8... I wouldn't blame it on them. It's the frontend frameworks and libraries, their design and abstractions, and then, as a consequence, how they are used by developers.
Yeah, I agree fwiw. I think that most engineers have a very weak idea of "fast" vs "slow". I've met people who think JS is really fast, or that Go is like C++ level speed. And we wonder why software is so slow.
Rhetorical question: how much e-waste do you think Wirth's Law is responsible for?
I personally don't think that people should ever have to buy a new computer until their use-case changes. At most, they might need a battery replacement after a few years. The idea that people need to replace a machine every six years to keep doing the same tasks is backwards.
I have yet to see an Electron app do something that wasn't already done by some older app before it with less lag. All I've seen so far are various flavors of the same basic chat/conference features in slightly different pricing models. Or apps that exist just for the sake of claiming something has a native app (while being just a wrapper around the website).
This is true too. More and more hipster electron apps these days wallowing in RAM on our constrained systems. And we don't have a choice. I can't choose not to use MS Teams at work. And the web version can't do audio/video meetings on Mac.
One thing I don't understand: Microsoft does have the knowledge to make a lean Electron app. Look at VS Code... But somehow they don't bother applying their lessons learned to Teams :?
Electron can be lean, but it must be difficult somehow otherwise every app would do it.
They're essentially the same app; the desktop version uses Electron. There are a few desktop-only features but the desktop application is mostly powered by inefficient web technologies.
And what actions do you propose? Less efficient code is a result of faster shipping, less experienced developers, suboptimal tooling/languages, and many other factors. And the current situation is the real one, the one we actually ended up in after all the actors did what they did. To change it, a lot of money or open-source developer goodwill has to happen over a long period of time. I don't like posts like yours because they picture the situation as if someone specifically wanted it, instead of no one wanting to spend even more of their own resources to make just a small improvement in the ecosystem.
I mixed up everything a little bit, but I'm mostly complaining about the post's authors. They don't necessarily decide where we're going, but they definitely have an influence on the general discussion. But I don't think they're siding with ever more growth out of nowhere; they do it because that's what most developers actually believe.
That's where we, as developers, have something to do. If we disagree, we have to voice it to make it more visible in the public discourse. The high-end configurations are built for us developers first, so we as customers have this leverage.
There are some excellent app developers out there that I don’t mind forking $20 over for a “simple” piece of software such as a duplicate file finder on Mac OS X, but with so much software offered for free there’s a tragedy-of-the-commons problem (Electron apps are a prime example).
I teach a programming course and way too often I see students badly in need of more memory. Like, saving a text file taking minutes (I'm not making this up), and the student considering that normal.
To these students, I recommend simple text editors such as notepad++. But nowadays with Zoom and Teams consuming all available resources even on more powerful laptops, I feel all hope is lost.
Although often it helps to use the web versions of, say, Goto Meeting or Outlook, instead of the app version. Tragic as that is.
> we accept bloat as a natural fact of life and accommodate to that
I don’t disagree with your post, but what is the solution here?
Your average consumer likely doesn’t understand the specific role of computer RAM, nor do they understand that software they’re using excessively wastes it. How can they accept or reject something they don’t know?
And it’s a similar problem with many developers. They don’t think about memory, or they view it as some infinite resource pool.
I often see folks using complex “frameworks” to solve trivial problems. They merely wanted to compute 2 + 2, but that framework has a dependency graph for solving systems of equations and partial derivatives.
Democratization of tech and programming has resulted in a lot of inefficiencies. I mean - that’s the cost, right?
> we encourage active obscolecence and further pollution of our environment
I don’t think your typical consumer encourages this at all. Between their jobs, families, and all the stress of day-to-day life, I’m not convinced they even have time to think about these things.
> I don’t disagree with your post, but what is the solution here?
If you provide hardware for developers who build software with a direct user interface, don't get them the best. Getting them the best build machines, so they don't have to wait around for builds, makes sense, but make sure the machines they run interactive tests and personal usage on are not top of the line.
When your developers have 16gb or more ram with 8 cores and a fast SSD and your users have 4 gb, a dual core 15w cpu, and a spinning disk, it can be hard to understand the sometimes huge difference in experience. And sometimes, it's actually not that much work to make things better for users with low end devices. The difference between high end phones and low end phones can be even more stark.
On my last team, we would do QA on devices that ranged from “decade old + notoriously buggy implementation” to “new hotness, not even available to consumers yet”. The latest greatest stuff was more useful so the mfgs could understand what wasn’t done right there, but the old stuff is where our SDK team found the most value — if it ran fine there, it was pretty much guaranteed to be awesome everywhere else. Faster devices would just mask things like frame drop / memory leaks, but those good old low-end old devices were great for detecting problems and perf regressions.
An anecdote that has long stuck with me from a bicycle mechanic back in the rust belt: ~“Anyone can make new stuff work well, but not many folks still have the patience and expertise to really make the older stuff shine, but it can and it does with a bit of care.”
For work, I upgraded from an 8th Gen Intel 6-Core i9 laptop with 32GB@2400mhz and a 1050ti to a 5950x / RTX 3090 / 32GB@3600mhz with tight timings and it’s very noticeably faster for general desktop usage and development.
Facebook implemented "2G Tuesdays" (https://www.nbcnews.com/tech/social-media/facebooks-2g-tuesd...) where developers would get 2G speeds when doing anything related to the website. That gives them the experience of people with very poor connectivity and probably improves those users' experience. Just like you said, if it runs OK on 2G, it can only be faster on 4G.
In that they restrict you from screwing around as root and otherwise aren’t “ridiculously restrictive” compared to Android in any way? You're right, they are.
Like always with consumer protection, the only recourse is legal action. Ask your government representatives to make slow software illegal. Carbon tax on inefficient code. IIRC Germany has started looking into it?
I think this could be a can of worms that’s more damaging.
What if the software is completely usable when it’s slow? What if it’s slow by design? Is that outlawed too?
And what’s the metric for being slow? Latency? Latency across what dimension? Who measures and polices this exactly? Or do we slap fines based on resource consumption? What if my software just runs slower on the JVM because I’m on Java 11 instead of Java 8? What if that problem isn’t my fault but because someone in my dependency graph has a regression? Oh, they open-sourced their code. Am I liable now because I have a dependency on it and upgraded to the new version?
What if it’s network conditions that are causing the slow performance. Am I liable? Is your internet provider liable? What’s the minimum bar for performance, across what dimensions, and who defines it? And where do we find people with the education, background, training to define such a vast standard? How do we ensure it’s fair? Maybe I can’t write as performant code as the well funded people at Google or Amazon. Am I just shut out of the market entirely? Or is it applied selectively based on market cap of a company?
I've read about manufacturers limiting RAM to e.g. 8GB to stay within some power-consumption envelope deemed acceptable for a notebook device. More RAM draws more power.
Given that RAM size has those ramifications energy-wise, it sounds almost like it would be a win to go against the grain of the article and push for an upper limit on memory, say 4-8GB, if the environmental factor were the lone metric to judge by.
Secondary effects follow from the constraint: software makers who don't want to see users churn from their applications start scrambling to optimize codebases so the executives won't lose out on their bonuses for that fiscal year.
It is not a technological issue but a social one - the incentives to create consumer-friendly software are not good enough, so companies settle on the lowest common denominator.
I mean in a lot of cases the pro, as in professional, version of something baselines with 8gb of ram but I can’t remember the last time I’ve worked with a dataset or problem that neatly fit within that. Maybe it’s not the way to go for consumer products but I find it unacceptable in a professional’s machine for a lot of modern computing problems.
I think it's kind of the path of least resistance to improve hardware instead of software. A small group of individuals making incremental hardware improvements is much easier to scale than having the entire software industry make optimizations to use less resources. It's also generally a lot cheaper to throw money towards new hardware instead of towards software optimizations at smaller organizations. For larger organizations, that calculus changes and the ROI of performance optimizations becomes more attractive. Some of those optimizations make it upstream to open source projects, but a lot of it is one off work that only benefits that company.
Episode 162 of the Lex Fridman podcast with Jim Keller (a hardware god) dives into this well.
Right, making software leaner doesn't give small organizations a good return on the effort. But for larger organizations with a massive compute bill, like say Netflix, software optimizations across that massive compute footprint save millions of dollars. So it makes sense to have performance engineers like Brendan Gregg on the payroll. His book on systems performance is ridiculously good and is useful for smaller orgs as well, when systems performance degrades to the point where the customer is impacted.
We do tend to maximize device resources in mobile development. The default application type is implemented in native code, battery life is a HUGE consideration (which is really a proxy for CPU, IO, GPU & network efficiency), and binary size is correlated with how long people keep your app installed. Usually memory usage is very good.
It's just that desktop development might as well be legacy development, and the vast majority of people only use their web browser and a few office apps at most. Aside from games & command-line apps, which are also developed in an efficient way, how many apps do you have on your desktop device? Probably barely any compared to your mobile device!
What is being optimized is developer time, productivity, and time to market, because if you haven't noticed, in the last 15 years developers have gotten very, very expensive. Inefficient software is purely a business decision. Make something faster than web dev that gets delivered over browsers (Flutter is a good candidate) and watch as all new apps suddenly become that.
It's not about the monetary cost, it's about the model of society: this model is about always consuming more, making "bigger" machines with "bigger" components, because it is possible. But it doesn't question the environmental cost; it doesn't ask why we actually need bigger machines to do mostly the same things we did some time ago.
A burger with beef will cost roughly the same as a vegetarian burger. Even if nothing can fully replace beef, one should question whether there is a need for beef and whether it could be replaced, because the alternative emits less CO2. Same goes for RAM.
Yeah right, mostly the same, except your phone now records video at 400MB/s and then finds and tags everyone's faces so you can find them at a single tap...
I was taking a peek at what Dell, as an example, currently has on the market. The most ridiculous configuration is maybe this "Precision 3560" in its baseline configuration, which pairs a latest-generation Core i5 CPU with 4 GB of RAM, at an almost $1,200 price tag. I feel sorry for anyone who buys that.
I would agree that this skimping on RAM is pretty ridiculous. I can understand if some <$500 laptops still ship with 8 GB, nobody should expect any greatness from those, but for devices costing three times more, the expectations are higher.
Heck, my main PC is i5 (so mid-range) from 2015 and even it has 16 GB of RAM.
> The most ridiculous configuration is maybe this "Precision 3560" in its baseline configuration, which pairs a latest-generation Core i5 CPU with 4 GB of RAM, at an almost $1,200 price tag. I feel sorry for anyone who buys that.
Or maybe they're being smart by adding RAM themselves rather than paying Dell's overpriced markup?
The Dell Precision mobile workstations have a good price when compared with other mobile workstations.
All mobile workstations, regardless of vendor, are overpriced in comparison with gaming laptops, because those who buy mobile workstations are forced to buy them for a few features that are missing in gaming laptops.
As another poster has already written, a common strategy when buying a Dell Precision mobile workstation is to buy them with the minimum RAM of 4 GB and with the minimum SSD of 128 GB, with the intention to dump the original RAM and SSD and buy something like 64 GB of ECC RAM and a fast 2 TB SSD, at much lower prices than from Dell.
The only reason for the 4 GB RAM configuration is because Dell will not sell a laptop without RAM or without a SSD or HDD.
The savings from buying the RAM yourself can easily be a thousand USD or even more, even after paying for the mandatory 4 GB.
Never understood why they don’t just offer RAM at closer-to-market prices instead of giving you a 4GB stick that will get tossed or sit in a drawer unused. Surely they’d rather make $20 on RAM instead of $0? It’s not like they refuse warranty service for unrelated parts because you’ve installed aftermarket RAM. I’d bet they could even get preferential pricing on qty purchases, so in theory it should be cheaper to get RAM / SSD upgrades through the mfg vs aftermarket…
Not everyone's tech savvy enough to order and install their own RAM. For every person buying 4GB of RAM to upgrade later there's probably 10 buying 16GB RAM at 100+% markup.
I'd guess most people buying mobile workstations would be savvy enough. But if your company requires that the laptop be bought from and serviced by Dell, why wouldn't it just pay the premium price?
Having worked in a start-up where everyone 'got their way' with respect to hardware (company culture thing), it was impossible to centrally organise things like repairs - nobody but the dev knew where stuff was ordered, so technical defects took up lots of dev time, which costs more than just paying a few hundred dollars extra for peace of mind.
They'd like to make some margin selling computers. The cheaper base model machines are very competitively priced between manufacturers. Much easier to make your profit on options.
Oh, sorry, did not know this was their mobile workstation line. Makes sense, then.
To be honest, considering we're talking about Dell, I must say I'm almost surprised they did not put DRM in the BIOS/UEFI to forbid using more RAM than the laptop shipped with unless you pay for a license. I assume the reason is that all the customers who do buy the cheap lines and upgrade the RAM themselves would go to other shops?
Dell is actually one of the best large PC OEMs in my opinion, because they don't play those games (at least not on their business-focused and higher-end products). Their UEFI firmware is pretty good and updated regularly even on old machines (they also support fwupd on Linux, which is a must for Windowsless systems IMO).
Like many “irrational” pricing schemes, it can be explained with price discrimination.
If you’re cost sensitive, you upgrade yourself.
If you’re paying an overworked IT department to buy laptops, you pay the premium to Dell instead of hiring an employee to deal with upgrades like this, and then hire another person to scream at multiple people (instead of one) when all the laptops start BSODing because of some Microsoft update combined with Murphy’s Law.
1. Better to pay a $200 markup than a $400 markup, right?
2. The markup is often warranted. There's a vast difference between "business" laptops and "consumer" laptops. I can't speak for Dell's Precision line, but Lenovo's ThinkPads are leagues better (build quality, support, etc.) than the consumer crap you get at Best Buy.
The thing that bothers me so badly is that if they just took all the resources they put into RGB-everything development and “gamer” branding and put that into tactile improvements, better trackpads, more solid hinges, less chassis flex…I bet there’s a big market for gamers who want the function over the form. It doesn’t have to be a beige box either, I just want something like a Dell XPS with TotL GPU + CPU. For some reason, that also necessarily comes with ugly chintzy obnoxious hardware design. I feel like PC HW mfgs are so out of touch with what people actually want. Look at the NVidia RTX 3080/90 FE cards and how they’re actually classy and solid, without a giant fan shroud and 6 additional cheap loud LED fans. It’s not like they had any problem selling every single one they made, just sad that it lives inside the case.
Razer is one kinda-exception that I can think of, where they have fairly solid laptops that can be more minimal, but the build quality is still not quite up to mobile workstation standards.
TL;DR I wish I could get a MBP with TOTL hardware instead of having even the highest-end config still shoot for thin-and-light.
Lenovo kind of does this with their Legion laptop brand.
Those are sold for the "gamer"/"enthusiast" markets, so they don't have the premium support you get with the pro ones (though you can get an upgrade where they dispatch someone to your house for repairs; I think it costs something like 80€ (~$100) in France?), nor the premium features (not sure you could track the laptop if stolen, no ECC support...).
But on the other hand, they are clearly thought of more as "hybrids" than other gamer laptops:
- The UEFI lets you decide whether to enable/disable the Intel chipset (useful when you want to be sure everything goes to the GPU, or want to use things that don't work well with NVIDIA's Optimus)
- Lightweight and easy to put in a bag. Not one of those 5-inch-thick laptops that weigh 5 kilograms
- The look is designed not to scream "gamer laptop!"
- Wide configuration range for a single chassis.
Also not that expensive, even by consumer-grade standards. Got one with an i7-9750H/RTX 2060/16GB/512GB NVMe for 1100€/$1350 a year ago.
> I can understand if some <$500 laptops still ship with 8 GB, nobody should expect any greatness from those,
My Acer Aspire from 2015 or so cost me $450. It came with 8GB of RAM, and the RAM is also upgradeable. I could swap it out for 16GB. Unfortunately the video card in it is trash.
Equipping the machines with a relatively small and non-upgradable amount of RAM is probably the easiest way to achieve a sufficient probability that these devices are thrown away by their users after some relatively short period of time that best suits the manufacturer's business model.
All enabled by the ever more RAM hungry bloatware that gets pushed to users' computers unless they're painfully selective about the software they install.
It's probably also just marketing... having some ridiculous 4 GB laptop in the store that's only a few hundred bucks cheaper than the 8 GB model makes it way easier for many people to buy the latter. Even if both are stupidly overpriced.
In reality, I believe we're being let down by software developers who treat hardware resources like RAM and CPU as free and unlimited. This includes web development.
Electron apps like Discord/Slack/MatterMost/VSCode/etc. are a good example of this. Or Windows 10 versus Linux (or even macOS).
This comes partly from the cloud-centric approach of "just scale up" instead of trying to optimise code. Lots of developers just want to throw more CPU/RAM at their inefficient code because "it's cheaper than the developer time to fix it" which is sort of true up to a point, until you have to hire a contractor to come in and reduce your AWS/GCP bill so you don't go out of business.
Honestly, web development has been exponentially worse than any other field of software development I have exposure to, including games. 90% of the time when I open Task Manager to figure out why things are so slow, the answer is some Chrome tab. Gmail is actually the culprit a shocking amount of the time. According to Chrome, my Gmail tab is using more memory than Slack and Outlook combined, and in my experience it is an order of magnitude less responsive and reliable than either of those apps.
Absolutely, the bloat has grown far faster among web apps. A modern VR game with many GB of assets handling vastly more difficult computations at super low latency can often still fit in 8GB of memory.
It's mind blowing that trivial tasks like editing a google sheet document have come to consume even remotely comparable resources. Google sheets as it existed 8 years ago was snappier and more responsive while using far less memory.
There seems to be more competition on performance among games than among "productivity" software. It raises the question of how bad things will get before it starts affecting the bottom line.
VS Code gets a pass; it is a quasi-IDE, Electron or not, it is a heavyweight. Though I still use Sublime Text for performance reasons.
Discord and Slack: no excuses. These are chat apps running in the background. There could be a client taking just kilobytes if there was a will for it.
Almost like those developer-productivity reasons for using Electron are really exaggerated. I wish they would invest in a few people to help optimize performance and improve UX, as it affects every one of their users.
That ridiculous purchase is fully on you, my friend. You would have been absolutely more than fine with 16GB, or even 32GB if you're scared.
But given that quadro, I feel like this wasn't at all a protest, but rather you do some sort of resource heavy work such as 3d simulations or rendering. Yes, these pursuits have always been resource intensive. No, it's not the fault of electron.
My 32GB, 2018-vintage work laptop is reporting 27 GB used, 4.13 GB cached, and 3.9 GB swapped out.
I have two web browsers, VS code (remote mode only), and slack open. About 4GB is being used by corporate crapware. I use Firefox as a json viewer, so it’s using 0.5GB. Emacs is using 50MB and my imap client is using 260MB
Various UIs frequently hang waiting for swap. 32 GB is not enough, even though I’m using it like a dumb terminal.
I guess you're using Linux? Memory management for desktop applications is really sub-par compared to macOS or even Windows.
My 16GB Mac has zero problems with a comparable workload.
Nope, I don't do anything specialized, other than running some virtual machines while debugging and CI/CD pipelines. I've never exceeded 24 GB of RAM in usage. At some point, I would like to run a small cloud infrastructure on this thing.
> Ultimately, though, we all need the laptop makers to be better about configuring their machines. It’s unacceptable to charge $1,000 or more for a laptop with only 8GB of RAM now. The alternative is to stop soldering it to the motherboard so we can upgrade it ourselves, but that’s not going to happen, is it?
Machines should be upgradable/repairable; I think the right to repair also means that the end user should be able to decide how much memory a machine has (within bounds, of course).
How will this work? I don't know. There are at least two instances where I didn't buy a new laptop because the memory system was either hobbled, topped out too soon, or not fully upgradable.
I think it is a larger driver of ewaste than people realize and it prevents older machines from being repurposed as easily.
Even with soldered RAM, it isn't out of the question to get the machine sent off and have the memory upgraded; it just requires the right tooling.
Soldered down RAM has real technical advantages. LPDDRx variants of RAM aren't available on DIMMs for this reason. All that modularity means you have ~100 very high speed digital signals running across a board and several connectors. That takes power, due to the physics involved. It doesn't matter for computers plugged into a wall or for gamer laptops, but it matters in the mid and ultraportable segment if you want decent battery life.
Soldering RAM down right next to the CPU or on top of it (as is done in SoC style laptops) lets you use narrower, faster buses at lower voltage swing and lower power (LPDDR4X operates at half the I/O voltage of DDR4).
Now soldered down SSDs, yeah, unless you're building a thin phone with no space for a connector, there's no excuse for that. PCIe works perfectly fine across a connector.
What is the contribution of the bus to overall power consumption vs the CPU and DRAM chips themselves, let alone the screen? I find it hard to believe that that is really an important design consideration leading to soldered-in RAM.
I had a Lenovo Stinkpad Carbon X1. The soldered-in RAM started to fail, causing memory corruption. That machine is now a brick, and I am still pissed off about it.
Yes to this. Me: I want an easily upgradable, swappable computer. Also me: I want a fast 2-pound laptop with great battery life.
That said, I had to put a new battery in a 17" gaming laptop a while back. It's a huge machine and there was zero excuse for the battery not to be easily swappable.
> it matters in the mid and ultraportable segment if you want decent battery life.
Memory buses consume very little power compared to the rest of the components. Even memory itself doesn’t consume much, except VRAM in high-end GPUs but that’s an order of magnitude faster than system RAM in most computers.
Battery impact of soldered versus socketed RAM is negligible. For instance, HP ProBook 445 G7 and G8 have two DDR4 SO-DIMM slots, yet the battery life is up to 14 hours.
Computers with soldered RAM are slightly cheaper to manufacture (fewer parts). Also, since it’s not user-upgradeable, users are likely to replace the complete computer after a few years. Memory and disk requirements of software grow quickly over time: https://techtalk.pcmatic.com/research-charts-memory/ These two factors create incentives for computer manufacturers to solder the RAM, even though that’s not in the interest of users, or the environment.
To you and titzer: as much as I don't like non-modular RAM, what I am advocating for goes even beyond this.
Allow the soldered-on RAM to be upgradable, or put vacant spots for more RAM to be added later.
Allow HBMs to be stacked, or sell a machine with the HBM not attached and I can source one myself.
There absolutely could be a socketed standard for memory.
But in general I am arguing for a more upgradable philosophy and manufacturers should be obligated to adhere to that philosophy. The technicals will always be changing, it isn't really a technical problem at all.
We already have standard pinouts, etc for DRAM. However, desoldering and resoldering in new chips is error prone, requires special equipment, is labor intensive and can damage nearby capacitors, etc.
Anyway, if you want, nothing (except economics) is stopping you from upgrading your soldered DRAM.
The fact that new computers are already unfixably obsolete can't be good for business. Whatever they are saving on components and assembly can't be worth it if nobody will buy the machine in the first place.
My work is cheaping out on laptops, and the vendor they buy MacBooks from only wants to do off-the-shelf SKUs and not CTO (Configure To Order). So, due to Apple's crazy low base RAM, I have a 2019 MacBook Pro with 8GB of RAM.
Literally every day I get this panicky popup saying I MUST close something or the system will reboot. Often I get random crashes. The "Memory Pressure" in Activity Monitor is almost always red.
I don't know how Apple can advocate selling a "Pro" system with only 8GB RAM while making the effectively mandatory upgrade to 16 really expensive and difficult (much longer lead time on CTO models).
I’ve literally not once gotten one of those pop-ups you described on my 8GB 2015 MBP. Not saying your point is invalid, just saying there are a lot of professionals that don’t require more than 8GB of RAM.
Strangely enough, it's mainly Outlook that seems crazy memory hungry. Usually it's using about 3GB all by itself.
I've taken to using the web version a lot but it's not ideal. Another big memory hog is Firefox. I don't use Safari because macOS isn't the only platform I use.
True, I have very little free disk space too, because my work also cheaped out on storage with the 128GB model. Xcode alone is 30GB, and Outlook takes up about 20GB as it downloads my entire mail library by default (and there seems to be no way to tell it to just sync the last 2 months and keep the rest on the server).
> What’s actually using all that? Background processes from first and third-party software and from Windows 10 itself. When you click the cross to close Discord, for example, it doesn’t actually close. It operates in the background and doing so requires using system resources.
I think Discord, Skype, Telegram and co. are made with Electron, hence the memory issues. Great for iterating, horrible for cheap laptops.
The article itself isn't really interesting, though, because it's not really technical about what the issue actually is.
I’m a huge fan of Telegram’s model of releasing a cross-platform core for the community to build their own clients on top of. I wish all chat apps followed its lead.
*hauls out our old grindstone of GChat refusing to do XMPP federation*
Pretty sure this was the start of the downfall. If we had a time machine and could go back and convince the Google and Facebook PMs / mgmt who made the decision not to try to extend / support XMPP, maybe the world would be different today. I wonder if one could quantify how much global warming was caused by those decisions.
Interestingly, the latest Macbook Air has only 8GB of RAM. I've used it about 3 months now and haven't felt any issue with perf. I don't think my experience would have been any different with extra RAM.
Bought the same machine for my daughter in a sort of emergency situation. I thought I’d be returning it and ordering a higher spec machine, but never did. She plays games, runs Discord, has browser tabs open, watches YouTube, edits her own videos, etc all without any issues. And the battery is crazy good. I’m actually a little dumbfounded. It seems Apple has found a good formula for entry level machines in the new MacBook Air.
I bought a 16GB M1 Air to try it out and I’ve barely touched my 32GB MacBook Pro since. It’s usually at 11-12GB, even with plenty of multitasking (including 3 browsers and Adobe PS/ID/AI). You get spoiled fast by the responsiveness/performance/silence. I can absolutely see the 8GB model being fine for the bulk of users, especially if they’re running mostly Apple Silicon-native software like MS Office.
I have been using the base model 8GB M1 Mac mini as my main work machine since it came out, and the only time I have noticed a RAM limitation was when I ran out of storage space and it couldn’t swap anymore. Pretty amazing since this computer cost $2000 less than my 2019 MacBook Pro and is noticeably faster (Clojure development and some hobby music production).
Maybe I’ll regret it in a few years if the SSD dies due to the extra load though… I do really wish the storage was easier to change.
Both iOS and macOS support compressed memory pages which helps a lot with memory pressure. On modern hardware decompressing a memory page is orders of magnitude faster than swapping.
Even relatively modest compression ratios add up in aggregate. Swapping compressed pages is also more efficient (when it happens) because less data needs to be persisted to the disk.
That's not to say 8GB is enough for everyone but it punches a bit above its weight in a lot of use cases.
I think it depends on the use case; my main laptop at home is an 8GB MacBook Pro from 2013, and it’s completely fine for my use.
If non-technical friends and family ask for a computer recommendation, I just tell them to get the cheapest Mac. If they want a PC, I’ve noticed that they’re normally not too happy about the price of the laptops I recommend. You’ll still find Windows laptops with 4GB of RAM (and a terrible screen). Their main selling point is the price, $1000 or less (note that’s Danish prices including 25% VAT).
Can confirm, as another M1 MacBook Air user, that this thing seems pretty snappy with just 8 GB of RAM, but my work-provided ThinkPad seems much worse with 16 GB.
Yeah, I’m using the same machine. It depends what you do. I haven’t hit a memory pressure warning yet. 99% of what I’m doing is either on remote systems or in vim and Safari. My MacBook is a futuristic portable VT terminal and iPod, and 8GB is fine for that!
I also didn't feel any issues on the last 8GB MBA (pre-M1) until I started feeling them a year down the line. Everything was running slowly after a couple of updates, including Firefox.
In my anecdotal experience Windows/PCs require decently more ram than OSX/Macs for the same perceived performance. My Macbooks were always fine with 16gb but I had to upgrade my Windows machine to 32gb despite using the Macbooks for much more involved workloads.
I'm not a journalist. Compiling with npm, cargo, go - all of it works well. Also run Chrome in the background. But on this particular laptop, others have run video processing/other compute/memory intensive tasks and it's held up well.
It seems like the author doesn’t understand how the OS manages memory. It is good to have a lot of memory used. You don’t get your money back for the RAM you save.
The only worthwhile metric is how fast stuff actually runs: noticeable wait times on swapping, etc.
The OS uses the ram as this makes running tasks faster. No point in keeping it free.
>The OS uses the ram as this makes running tasks faster. No point in keeping it free.
this meme[1] needs to die. windows, mac, and linux (htop) all correctly differentiate between "used but used for cache" RAM and "used for application" RAM. When people are talking about not having enough free ram, they're talking about chrome, spotify, and slack taking up 2GB each, not the output of "free" showing only 8MB free.
In most OS's, all available RAM is automatically used as a file cache, and is not considered "used" in system load stats. Available RAM is absolutely not "wasted", and tasks which use more RAM actually make the system slower since memory bandwidth is the likeliest bottleneck in modern systems.
These were my thoughts, too. Most RAM these days goes into caching apps and files that might be needed. Inverse swapping of sorts.
Is it possible that the problem is worse on Windows because of the DLL zoo, with each app shipping its own, whereas the Mac makes heavier use of shared Frameworks?
This is all based on Windows. I use Kubuntu 20.04, have 12 GB of RAM, and right now, with a total of 222 processes, Firefox, and Clementine open, I use less than 2 GB of RAM. So no, not if you want to run some decent Linux; but hey, if you wanna run Windows.
I won’t argue that many won’t need 16GB of RAM, because I don’t know what people use their laptop for. I will argue that if you need 16GB because a chat application and a browser already pushes the system above 8GB, then something in the software is broken.
Someone wrote on their blog (sorry, I don’t remember who): if Electron apps (which I assume the Telegram client is) use too many resources, then don’t argue that they should be native applications; instead, ask if we can make Electron more efficient. That’s a smart idea.
Same here: KDE and Arch, using 1.7GB (I don't have much "real" stuff open yet). Fresh login is just under 1GB.
I also use Windows (desktop gaming machine), and the main problem with it is the amount of crapware that Microsoft is starting to bundle. It was once the claim that Windows had a bad name because of all the junk that OEMs pre-installed. Well, Microsoft themselves are doing it now.
Kubuntu with Plasma was the most RAM I have ever seen a Linux distro take at idle. Xubuntu was a breath of fresh air in comparison and had far fewer weird UI bugs. I don't recommend Kubuntu for resource-constrained systems.
RAM/resource usage has been steadily decreasing for Plasma for a while, and has basically reached XFCE levels, so I suspect your information may be out of date. On my system it's currently using 250MB. Xubuntu is a great DE recommendation, though.
Just the GPU surfaces for hardware acceleration will use a ton of memory. If your GPU has dedicated VRAM you might be OK, but often there needs to be a CPU-side buffer for the data as well.
16 is not enough here: I'm closer to 20-24, mostly because of lots of browser tabs and switching between projects, rarely because of heavy stuff. 32 would be comfortable today, and 64 would be future-proof (my 16GB machine is 6 years old now and I plan to keep the next one as long!).
Alright, I have ~120 browser tabs open in FF but most are hibernated. Sometimes I open a lot of tabs in another browser for... ehm... other reasons... Never a problem. And no loading of huge projects in vim. MBP 16"
How do you do it? Do you have a SQL Server, MySQL, or PostgreSQL instance running on your machine? Do you aggressively limit how much memory these servers can use?
My work x86 Mac laptop that they gave me last year - from the well known tech company that believes in “frugality” - has 8GB of RAM. As a developer[1], I was at first shocked seeing that I came from a 60 person startup that at least gave me a 16GB Dell.
Then I realized that since I have the ability to spin up any imaginable resources I need at the click of a button - or by writing a shit ton of yaml - without having to worry about the company getting a large bill from our cloud provider like I did before, I don’t need to run everything locally. I can just open my own AWS account.
That being said, I was shocked I only occasionally experience slowness with VSCode, Slack, Outlook, and Chrome running. Occasionally I will have to use PowerPoint or Word. On the rare occasions that I do need a beefier Linux or Windows environment, I use Cloud9 or WorkSpaces.
At home with a 1Gb up/down connection it isn’t too bad. I don’t know how inconvenient this will be if we ever start traveling. I was hired post-Covid.
[1] While I spend most of my time developing, I’m officially a consultant in ProServe. I would hope that the SDEs get something beefier.
8 GB of soldered-down RAM on an Intel- or AMD-based Windows 10 laptop is a cost-cutting measure, though it's plenty for most Linux distros as a daily driver. Don't be Poor(TM): spend the money on 16GB (or go with a manufacturer that offers it at a reasonable price).
Apple M1-based laptops are fine with 8 GB; it's an, errm, Apples to oranges comparison.
I'm not sure about the claim that Win10 itself uses a bunch of memory. On my machine with lots of stuff installed, almost all of the "Microsoft Corporation" processes running are lightweight, in the 2-10MB range. Many of them could be evicted or swapped out. A few of the heavy ones are suspended UWP apps, which the OS will kick out if memory is limited.
The three outliers are MsMpEng (windows defender AV), SearchIndexer, and Explorer. The last one makes sense given how much stuff it's tracking (all my open windows, my open folder views, tray icons, etc) though I feel like it should definitely be lighter.
SearchIndexer and MsMpEng use a ton of RAM and I don't know why, that feels inexcusable. Maybe it's mostly cache pages that are reported wrong and will be evicted if memory is low?
Out of all of those, though, they predate win10 by a while. The same stuff would've been running on 7. So I'm not sure that things have gotten worse - I suspect it's all about the third-party software installed on the machine, and people have tons of third-party software installed now.
(And if you need to, you can disable Defender and SearchIndexer. Maybe you should.)
In the end most of my memory is going to third-party software (Firefox, qBittorrent, Dropbox, Steam, LINE - all 400MB or more) or optional services I have turned on (Windows Update's "delivery optimization" will sometimes chew up a bunch of RAM if you have updates waiting to install - it seems to be caching them in memory to distribute them to other machines on your network. I wouldn't be surprised if this gets thrown out in limited-memory scenarios, but make sure it's turned off!)
Every Electron app you have open is probably 400+MB on Windows at best, and often worse. (Electron on Mac uses different APIs, which may help?)
Maybe the reason all these new M1 laptops feel great with 8gb of RAM is that mac users don't need to install as much third-party software, and Electron apps use less memory on OS X? I don't think most macs have a virus scanner installed, for example.
>As I sit here writing this I have two apps actually open in the foreground on my PC; Firefox and Telegram Desktop. And according to Task Manager, I’m using 55% of my available memory.
The author has a severe misunderstanding about how RAM is "used" vs. actually allocated.
The exact same system he has, but with 8GB of RAM, would probably be at 70-80% RAM usage or so.
The reason he reaches 8GB+ on his machine so fast is because he has 16GB. Nothing is more useless than empty RAM (in a PC, at least).
The OS does, and should, use as much RAM as possible if there is plenty of RAM left. The system is fast when the RAM is used, not when it is free.
I want a laptop with 64 or 128 GiB of Chipkill ECC RAM. All this soldered-on, anti-R2R/anti-consumer Apple trend-copying is BS. The people saying "YAGNI" or "it costs too much" have no idea what they're talking about and don't understand the reasons or use-cases.
At home, I may have a 96 core EPYC with 512 GiB of Reg'd ECC RAM, but it doesn't have the locality or reachability of something that's with you. Network presence and low-latency aren't a given.
DEFCON 19: Bit-squatting: DNS Hijacking Without Exploitation (trivia: at the time of filming, I was excessively-hungover and laying across 6 seats in the middle of an empty area to the extreme right, from the podium's perspective. I did watch the presentation with sunglasses on indoors because my eyes weren't cooperating nicely with photons.)
The author is talking value, the PMs are talking pricing strategy. 8GB is under-powered; the point is to anchor a price and get you to pay more. Sadly, especially with Macs, which are not upgradable.
There is a place for everything. My wife uses an older 4GB Surface Pro for work; my father is also using a laptop with 4 GB of RAM. I am using a company-provided laptop with 8 GB of RAM and a desktop with 128GB of RAM (>75% used most of the time); my daughter is using a laptop with 16 GB of RAM and my son a desktop with 32 GB of RAM. All run Windows 10. All of us have different needs and usage patterns, me being the extreme, where the laptop is a portable email and light-work tool and the desktop is the workhorse; my daughter uses her laptop at school and my son only at home, which is why he has a desktop. I am not making a case for 4GB of RAM, but 8GB is enough for many people who don't use crappy apps or have special needs.
Oh, and my main development VM has 8GB of RAM; I can give it more, but I never used more than 5 or 6.
"Expensive laptop" is a very relative term, and judging the value of an expensive laptop on the amount of RAM is dodgy; I expect build quality, great lifetime (my wife's Surface Pro 3 has had 6-7 years of flawless operation), low weight, a good screen - I don't have RAM on that list. Yes, if you need more RAM you buy a model with more RAM, but if you don't need it then you don't buy it; it is that simple for me - I agree others may see it differently.
TechRadar is a shitty kind of site. First, their site takes 100% of CPU whenever possible, along with 1/2 a gig of memory if you leave it open for more than a few minutes, as if to prove the point of the article.
They talk about technical things incorrectly, and I'm sure they'd say, "Well, we just want to appeal to regular people" instead of admitting they're not trying.
Their statement that laptop makers will get more of a discount buying in bulk isn't false, but it's not really the issue. If laptop base models came with 16 gigs, the associated memory markup would still be there, yet they suggest otherwise, as if the market somehow would rather help people than make money.
As far as the article goes, they discuss the symptoms of the larger problem, but not the problem itself - our apps, particularly browsers, use too much memory, and nobody is talking about ways to punish app makers and web sites that abuse this.
When I reluctantly had to buy a laptop due to covid, I opted for a Dell Precision 7750. I bought the minimum 8GB from Dell, put 128GB in it myself with cheaper after-market RAM, and changed the thermal paste. It's a beast. Unfortunately Dell weren't offering any Ryzen CPUs.
2 of my colleagues run SQL on their laptops; I have a lower spec laptop, so I run it on a desktop with 128 GB of RAM. Not something many people do, but 3 in my team of 8 do it.
> 2 of my colleagues run SQL on their laptops; I have a lower spec laptop, so I run it on a desktop with 128 GB of RAM
If you are a developer you absolutely do not need 128GB of RAM to run SQL Server! But if it convinces your management to buy you top spec kit, fair play to you.
They are developers and I am the manager, so they do more SQL work than I do. Some of the databases we work with are over 1TB; a lot of RAM is needed to run queries fast (in memory). I prefer a good desktop any day to a half-baked, thermally constrained laptop.
I was skeptical too, but my laptop doesn't thermally throttle when running Cinebench R23. I changed the thermal paste to Thermal Grizzly Kryonaut Extreme (I don't have the courage to use liquid metal yet). It's quicker at building our code than my admittedly old 4790K desktop.
Ha-ha, I just bought a Dell 3410 with 4GB of RAM. Of course I upgraded it to 16GB, and I'm considering upgrading to 32GB in a few years if necessary. 8 GB is not that bad; I would consider using 8 GB if not for virtual machines, but that's hardly an ordinary user's use-case.
I don't think the author of this piece understands how RAM works in Windows 10 or in modern apps nowadays; having allocated RAM and actually using it are two very different things.
While in the past Windows seemed to take a fixed ~2GB of RAM after launch regardless of how much you had, nowadays Windows will take a large portion of RAM whether you have 4, 8, 16 or 32; it's not a fixed number, and it's not a required number.
For instance, I am running Windows 10 on a laptop with 8GB of RAM and currently have 3GB available, with Edge and the other apps I use for work consuming a combined 1.5-2GB, so by the author's logic Windows takes less than 3GB in total.
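You can check this yourself without Task Manager. A small sketch against the documented Win32 MEMORYSTATUSEX structure (Python with ctypes, nothing third-party; Windows only, obviously):

    import ctypes

    class MEMORYSTATUSEX(ctypes.Structure):
        # Field layout follows the documented Win32 MEMORYSTATUSEX struct.
        _fields_ = [
            ("dwLength", ctypes.c_ulong),
            ("dwMemoryLoad", ctypes.c_ulong),          # percent in use
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),      # immediately reusable
            ("ullTotalPageFile", ctypes.c_ulonglong),
            ("ullAvailPageFile", ctypes.c_ulonglong),
            ("ullTotalVirtual", ctypes.c_ulonglong),
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]

    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(stat)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
    gib = 2 ** 30
    print(f"{stat.ullAvailPhys / gib:.1f} GiB available "
          f"of {stat.ullTotalPhys / gib:.1f} GiB ({stat.dwMemoryLoad}% load)")

Note that "available" already counts memory Windows can reclaim on demand (the standby list), which is exactly why the headline "used" number overstates what apps actually need.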
Look at performance per dollar on CPUs: even with the slower rate of improvement in process nodes and IPC, the performance compounded over time is still significant.
For DRAM, especially since we moved to LPDDR, the price floor in dollars per GB hasn't moved at all over the past 10 years. This is a challenge when vendors' profit margins are constantly being squeezed.
> It’s unacceptable to charge $1,000 or more for a laptop with only 8GB of RAM now.
Consider that Apple's entry-level laptop is $999, and that most Chromebooks below $400 are absolute junk. I would rate $999 as upper-mid tier.
This article is based on watching Task Manager, which I can't imagine gives an accurate picture of how much RAM you "need". There is no reason to care about your machine swapping sometimes as long as it's subjectively performing well.
> Laptop makers buying RAM in bulk will always get it at a lower price than you or I would by going to Amazon.
This is not true if you bump the RAM in every machine you sell and end up using up the world's supply of cheap RAM. Build-to-order doesn't carry that risk, since most people won't do it.
I've been experimenting with throwaway EC2 dev boxes. My environment setup for a new instance is scripted, but I usually stop and restart the same one.
The instance shuts itself off via a CloudWatch alarm after 5 minutes of idle time. My IDE is VS Code using remote connections, plus an SSH client. Startup time is less than 20 seconds, and I scripted that too.
Cost: about $0.15 for a few hours of work. VS Code aside, it will work from any hardware built within the last 10-15 years that has an SSH client (not just a Linux terminal).
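The idle-shutdown alarm is the only mildly fiddly part. Roughly, it looks like this (a boto3 sketch; the instance ID, region, and thresholds are placeholders for whatever "idle" means to you):

    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
    cloudwatch.put_metric_alarm(
        AlarmName="dev-box-idle-stop",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=60,               # one-minute samples...
        EvaluationPeriods=5,     # ...below threshold for five in a row
        Threshold=2.0,           # under 2% average CPU counts as idle
        ComparisonOperator="LessThanThreshold",
        # Built-in CloudWatch action that stops the instance directly.
        AlarmActions=["arn:aws:automate:us-east-1:ec2:stop"],
    )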
I feel similarly ripped off by expensive laptops with small hard drives. It would have been $800 more for a 1TB 15" Surface Book 2 than a 256GB 15" Surface Book 2 with otherwise identical specs. A 1TB hard drive is like $100. But you can't really swap the hard drive without peeling the screen off.
The hard drive pricing issue is basically the same for the Surface Book 3, several years later. I like the design of the laptop, but I won't be getting another one.
I've always considered the Surface Book line to be Microsoft's answer to Apple's MacBook Pro, and design-wise, it's a pretty good one. Detachable tablets with a real operating system and a dedicated GPU in the base are kind of the ultimate form factor (unless they could fit that GPU into the tablet).
But the specs fall short. MacBook Pro storage starts at 512GB and goes up to 8TB, and they have more RAM at the high end as well. It's just annoying, and I'm probably going to try to build my own laptop in a year or two.
Totally agree, but that's not new; makers have always sold laptops with too little RAM. Two years ago I extended the 8GB of RAM in my Lenovo E330 to 16GB, and now I can keep it several years more. Buying a desktop or laptop PC with only 8 GB of RAM is nonsense nowadays, but unfortunately few people are aware of that. And if you build a brand-new gaming computer, you have to put in 32GB of RAM if you want to keep it for many years.
I think buying new expensive laptops for personal use is a waste of money anyway, unless you do something like heavy video editing or gaming where you really need the power.
For my daily use (programming hobby projects, studying, browsing), a 10-year-old ThinkPad T520 with an i7 is more than adequate after upgrading it with an SSD and 16GB of RAM, especially when using Linux instead of Windows.
If you do something really intensive on CPU and/or RAM, use a desktop instead of a laptop; the expandability and thermals are much better. And a 10-year-old CPU is very slow and inefficient by modern standards.
I had an old Dell Core 2 Duo at 2.66GHz with 4GB of RAM from 2010 that I used mostly as a Plex server until 2019, running Windows 10 with a spinning hard drive. I was surprised it ran at all, let alone decently enough to run any one of Chrome, Word, or Excel. I would use it every now and then when I was working from home while interviewing.
I sometimes wonder whether some regulation here wouldn't be better. I'm not sure. But it goes hand in hand with the whole repairability thing, and unrepairable devices have more externalities...
How much of that RAM is actually in use vs. just cached data that can be purged when memory pressure increases?
Just because Windows Task Manager says 10 GB of RAM is used doesn't mean the machine couldn't easily do with half of it, merely forgoing a bit of the preloading of apps/files that you might perhaps be using.
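If you want to see the split, the cross-platform psutil package exposes it directly. A small sketch, assuming psutil is installed ("pip install psutil"):

    import psutil

    vm = psutil.virtual_memory()
    gib = 2 ** 30
    print(f"total:     {vm.total / gib:5.1f} GiB")
    print(f"used:      {vm.used / gib:5.1f} GiB")
    # "available" = free memory plus cache the OS can drop under pressure,
    # usually far larger than "free" alone.
    print(f"available: {vm.available / gib:5.1f} GiB")
    if hasattr(vm, "cached"):  # exposed on Linux
        print(f"cached:    {vm.cached / gib:5.1f} GiB")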
Meanwhile I'm playing with a Pinebook Pro. Comically underpowered ARM processor and only 4 gigs of RAM. The stock Manjaro installation is fast enough, and I've been able to get started with OS development on it.
My wish list for a reasonably priced laptop: 16GB of RAM, a GPU that can at least play a game or two, and a decent (>1080p) LCD panel. Seems like the line is at about $1,200 for this.
It would also be nice to be able to upgrade to 32 GB a few years into the future. Upgrading an older machine when needed is so much better than buying new ones all the time. I wish more hardware were built with upgradability/repairability in mind.
If I were pointing fingers, I would point more to React, NPM, webpack, and just the crazy world of frontend development in general.
But yeah, a browser is almost like a second OS running on the machine. Only it's built at a higher level (abstraction-wise) with much less focus on optimizing resource usage compared to a proper OS.
What this person is suggesting is that everyone should have to spend another $100 and get worse battery life because the author has an irrational preference for inefficient software. My Pixelbook Go has never once exhibited symptoms of memory pressure, cost $800, runs all day, and I prefer it that way.
Such a generalization cannot possibly be taken seriously. The amount of RAM one requires varies with how the machine is used and even how often; a slow machine once in a while is less annoying than a machine that's slow all day every day.
I can't use an 8GB laptop: one VM, or too many open browser tabs alongside an Electron-based app or two, and after some time the thrashing commences.
My daughter is perfectly happy with the 8GB laptop I gave her. Kind of wish I could put half of that in my laptop. <g>