> I wish companies would go back to building fast apps
It seems fascinating how much more efficient Windows apps were back in the nineties, capable of doing almost everything today's apps do, in a similar manner, on hardware orders of magnitude less powerful, and often performing even faster.
The last time I expressed this, probably also here, somebody suggested the performance drop is the cost of modern security - vulnerability mitigations, cryptography etc.
I think the performance drop probably has more to do with managers & product folks expecting growth and features at all costs, at the expense of keeping performance above some baseline.
I also wonder if it's just harder to continually monitor performance in a way that alerts a team early enough to deal with regressions?
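For what it's worth, the kind of early-warning monitoring you're describing can be as simple as a CI gate that compares current measurements against a stored baseline. A hypothetical sketch in TypeScript (the metric names, file paths, and tolerance are all made up, not any particular team's setup):

    // Hypothetical CI regression gate: compare measured metrics against a
    // stored baseline and fail the build when something drifts past tolerance.
    import { readFileSync } from 'node:fs';

    const TOLERANCE = 1.10; // alert if more than 10% slower than baseline

    // e.g. { "render_ms": 120, "startup_ms": 450 } -- placeholder files
    const baseline = JSON.parse(readFileSync('perf-baseline.json', 'utf8'));
    const current = JSON.parse(readFileSync('perf-current.json', 'utf8'));

    for (const [metric, base] of Object.entries<number>(baseline)) {
      if (current[metric] > base * TOLERANCE) {
        console.error(`Regression: ${metric} ${current[metric]}ms vs baseline ${base}ms`);
        process.exitCode = 1; // surfaces the regression before it ships
      }
    }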
That said, security _can_ impact performance! I work on Stripe frontend surfaces, and one performance bottleneck we have comes from needing to use iframes to sandbox and isolate code for security. Having to load code in iframes adds an additional network load before we can get to page render.
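For readers unfamiliar with the pattern: isolating third-party or untrusted code in a sandboxed iframe means the browser has to fetch and load a separate document before anything useful can render. A minimal sketch of the general technique (the URLs are placeholders, not Stripe's actual setup):

    // Load untrusted code in a sandboxed iframe for isolation. The frame's
    // document is a separate network fetch, which delays first render.
    const frame = document.createElement('iframe');
    frame.sandbox.add('allow-scripts'); // scripts may run, but stay isolated
    frame.src = 'https://isolated.example.com/widget.html'; // extra round trip

    // The parent can only talk to the sandboxed code via postMessage,
    // so nothing useful happens until the frame has finished loading.
    frame.addEventListener('load', () => {
      frame.contentWindow?.postMessage({ type: 'init' }, 'https://isolated.example.com');
    });
    document.body.appendChild(frame);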
I think you need to add that many more developers, or even teams of developers, are building these apps. They have their own fiefdoms and it's less common for devs to have a complete understanding of the code base.
Over time, decisions are made independently by devs/teams, and the code base drifts out of alignment with a performant design. This is exacerbated by the growth pressure. It's then really hard for someone to come in and optimize top to bottom, because the problems range from simple bugs to fundamental design issues. Remediation has significant overhead, so only things around the edges get touched.
Fast forward a couple of years and you have a code base that devs struggle to evolve and add features to, let alone keep performant. The causes are many, and we arrive at the same root cause whether we're complaining about performance or maintainability. You probably don't feel this way, but Stripe and other premier engineering companies are way ahead in their practices compared with the average situation developers face.
Independent mobile development is often where I see the most attention to performance these days. The situation for these devs is a bit closer to what existed in the nineties: they have a bigger span of control, and performance is something they feel directly, so they're incentivized to keep it great.
> It seems fascinating how much more efficient Windows apps were back in the nineties
I remember everyone complaining about how bloated they were at the time. Pretty sure someone in 2055 is going to run today's Office on 2055 computers and marvel at how streamlined it is.
My recollection is completely different: software was really slow on contemporary PCs in the 90s. Spinning disks, single-core CPUs, and a lot more swapping because memory was so much more expensive.
Contemporary software was slow. You could tell you should consider more RAM when the HDD light and chugging noises told you it was swapping. But if you ran the same software with the benefit of 10 years of hardware improvement, it was not slow at all.
Nah, it's about favouring bad programming practices, not thinking about the architecture of the software, and giving developer experience a bigger role than end-user experience. All of this stems from a push to get to market earlier or to make employees replaceable.
I'd guess it's less about "bad programming practices" and more about prioritizing development speed? Mostly in line with your next point. We value getting things released and don't have a solid process for optimizing without adding new things.
Ironically, it seems it was specifically the longer feedback cycle of long builds and scheduled releases that gave us better software?
Oh, I likely fully agree with you on it. I'm just pointing at the hazard that a lot of these practices aren't, intrinsically, bad. Rapidly getting something done is, generally, a good thing. I'm not entirely sure how to make it not the priority, but it does feel that that is the problem.
This depends on whether you want to exchange data with other computers, using even removable media, let alone the Internet. Also whether you want to use Unicode. If you only want to hand-type, edit, and print a plain English paper right away by dumping plaintext/PostScript/PCL to LPT, you're probably fine with any computer you can find. It's just that nobody uses standalone computers anymore; almost every modern computer is a network device.
I mean, as mentioned upthread, Office 97 (and Office 95 before it) was slow to load, so slow that they added the startup accelerator.
You can run Office 97 now and it'll start fast, because disk I/O and CPUs are so much faster now. OTOH, Excel 97 has a maximum of 256 columns and 64k rows. You might want to try Excel 2007 for 16k columns and 1M rows, and/or Excel 2016 for 64-bit memory.
The 90s was a time when computers were doubling in speed every 18 months.
I remember Office 97* being lightning fast on a 366MHz Celeron, a cheap chip in 1998.
You could build fast software today by simply adopting a reference platform, say a 10-year-old 4-core system, then measuring performance there. If it lags, do whatever work is needed to speed it up.
Personally I think we should all adopt the Raspberry Pi Zero as a reference platform.
Edit: * Office 2000 was fast too, with 32 megs of RAM. Seriously, what have we done?
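One way to approximate the reference-platform idea without dedicating old hardware is to throttle the CPU in an automated browser test, so every run is measured as if on a much slower machine. A sketch using Puppeteer's CPU throttling (the URL and the 4x factor are placeholders; this is an approximation, not a real 10-year-old box):

    // Measure page load as if the CPU were several times slower,
    // approximating a fixed, weaker reference machine in CI.
    import puppeteer from 'puppeteer';

    const browser = await puppeteer.launch();
    const page = await browser.newPage();

    await page.emulateCPUThrottling(4); // pretend the CPU is 4x slower

    const start = Date.now();
    await page.goto('https://app.example.com', { waitUntil: 'load' }); // hypothetical app
    console.log(`Load on throttled CPU: ${Date.now() - start}ms`);

    await browser.close();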