Software probably sucks because even the people complaining that it sucks will use a software platform that greets you with multiple pop-ups that obscure 80% of the screen. Can Medium just die off already?
Seems like a lot of words to express the underwhelming hypothesis that "software is bad because young people are overstimulated".
OK - I'm being snarky but I can't see much meat in this essay. Or much novelty. It's fairly standard greybeard fare. As a fellow greybeard I often have similar thoughts - but I'd probably wait until I was struck by a touch more novelty or insight before I chose to post them online.
While I am frustrated by the state of modern software, this article fails to acknowledge what has gotten better. My daily software used to crash several times a day in the late '90s/early 2000s. Hope you saved your work, because there was no autosave or anything like that. Software was so unreliable that the idea of something like a browser or an image editor running for days would have seemed like a fantasy. (Nearly) show-stopping bugs would hang around for months or years.
My hypothesis is that software is facing normal tech/feature pressure around networked features, which are difficult to deliver with local applications, plus economic and social pressure to keep adding features, and that this is leading to enshittification.
I get what you are saying, and I agree with the sentiment of the title. With such a huge increase in CPU speed, why does software run about the same as before, and sometimes slower? Many say it is the GUI/GPU? Then I would rather be using a 256-color terminal screen with cursor support.
Not exactly a true greybeard though, given that the author has "more than 10 years of experience", and judging by the picture on his GitHub page ;) [0]
Good grief. Back in my day, people used to mark an X in Sharpie on the top of their card decks so that when they (ineluctably) got dropped, it'd be easier to re-assemble them.
This article is all controversy with very little substance.
Saying software is lower quality now than ever is pretty insane considering we have been continuously improving it for as long as it has existed. The cherry-picked examples are hand-wavy and totally lacking in nuance. I would love to hear him explain how Linux is worse now, for example.
Finally, blaming focus and Martin Fowler is also confusing. Maybe some engineers struggle with focus or with choosing the right complexity, but clearly a lot of engineers have other issues.
It's just weird that anecdotes and shaking a stick at banking websites qualify as worth reading nowadays.
Does it though? I mean, it's not perfect (what is?), but one has to consider the amount of software that exists, the complexity of the work that software does, and the variety of devices on which we expect it to run almost perfectly.
Also, it's quite a recent thing for people to expect perfectly functioning systems or programs.
Back in the '90s, my user behaviour was built around software/hardware sucking, and it's still with me to this day: pressing Ctrl+S to the point of bordering on OCD, minor anxiety when a game lacks quick save, making backups of backups, and not even getting mad anymore if my system crashes.
But now I can leave my computer (even Linux) on standby/hibernation for years without ever needing to reboot it to "clean things up".
So I don't actually believe software is worse. The best software is better than it used to be, much better I'd say, and the worst is not really that visible anymore. The worst software has changed from having functional issues to being software that works as designed, but was designed by the devil and his minions (like Medium, ironically the platform the author uses to make his point, I'd assume).
> "A really bad example for not fulfilling the functional aspect when it comes to consumer software is Facebook. You need one app to use the platform that serves the news feed, and another one to use the chat function, even though the web app offers both functions at the same url."
This isn't a very compelling example because the division between the Messenger and Facebook mobile apps is an intentional UX choice.
It makes Messenger feel like other messaging apps on phones, e.g. by allowing instant access from the home screen to the conversation list and by reducing the app's memory footprint so it launches faster.
It may not be the UX trade-off you want, but presumably Meta has data that shows mobile Messenger users prefer this. (Personally I do — it would be annoying to launch all of Facebook to reply to someone's message.)
> I would define it having in mind the following dimensions: functionality, reliability, usability and security
(I will pick reliability for the sake of example)
Let's start from a perfect state where every one of those dimensions has 100% quality --> impractical (or even impossible for some dimensions).
OK, now let's aim for six 9s (99.9999%). AWS S3 (which is a simple CRUD API for storage, kidding) has this kind of reliability, backed by hundreds of engineers --> practical, but at very high cost.
What are we left with, 99.9? 99.0? To achieve this level of reliability you need a solid team (>2 people, because of the bus factor) that constantly monitors the situation and makes the necessary improvements to the product, which means you either need additional hands or you sacrifice product development.
Now we are in the 95-99 range. Doable with one person, but again it requires trade-offs to balance new feature development, maintaining the old system, and bug fixes. (See the quick downtime math below.)
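To put those tiers in perspective, here is a back-of-the-envelope downtime budget per year. A minimal sketch in Python, assuming a 365-day year and treating each reliability figure as an uptime percentage:

    # Yearly downtime budget implied by each availability tier.
    MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

    for availability in (99.9999, 99.9, 99.0, 95.0):
        downtime_minutes = MINUTES_PER_YEAR * (1 - availability / 100)
        print(f"{availability}% uptime -> {downtime_minutes:,.1f} min of downtime/year")

Six 9s leaves you roughly half a minute of downtime per year, while 95% leaves you about 26,000 minutes (roughly 18 days). That gap is exactly the cost difference between the tiers above.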
Now let's discuss the current market. Imagine you've just built a new product and people start copying your offering, with more features. What do you sacrifice? If my company's existence depended on feature parity, because customers demand it from every similar offering, I would sacrifice reliability.
It's not a 0/1 decision; good enough really is good enough in some cases, but it is difficult to sustain "good enough" when you are competing with other players in the market.
Games couldn't rely on patches back in the day, because they used to be shipped in boxes, and providing patches was tricky. (The ease of providing updates may indeed incentivize releasing half-baked software nowadays).
I am less sure that those careless young programmers endlessly scrolling TikTok videos are fervently dedicated to Martin Fowler's misled teachings.
And even less sure that the author summarizes those teachings fairly to begin with: "Many such organizations will build microservices even if their software domain complexity is not high and the software itself is not projected to scale that much. They will do it because their God told them to do it", he snarks.
If you go to Fowler's website [0], however (under "Are Microservices the Future?"), he's not dogmatically advocating for microservices: "Despite these positive experiences, however, we aren't arguing that we are certain that microservices are the future direction for software architectures [...] not enough time has passed for us to make a full judgement [...] There are certainly reasons why one might expect microservices to mature poorly [...] One reasonable argument we've heard is that you shouldn't start with a microservices architecture. Instead begin with a monolith, keep it modular, and split it into microservices once the monolith becomes a problem."
Doesn't really fit the image of a God commanding you to build microservices regardless of the domain complexity and other factors. Perhaps he's changed his tune since, I don't know. A quote would help.
On the other hand, observations such as "from the usability perspective, the software must be user-friendly", or "the role of software has become increasingly important, to the point where our world now heavily relies on it", etc., are undeniably correct.
Maybe these articles should be framed differently. The claim that modern software is worse seems wrong (reliability and security, especially, are surely far better now than ever before), but that doesn't mean we shouldn't critically analyze the status quo and find ways to improve.
For once, I would like to see a more positive outlook on the future. Maybe we can learn something from software development practices in the '90s, but fully going back to that time seems impossible. The world has moved on, and many software dev practices are a result of those changed conditions.
Laggy, complex, and unintuitive UIs: I get frustrated every time I have to use them.
Everyone does, but very few people get frustrated enough to stop using the software or to change bank. People will leave if the software isn't available, but they won't leave if it's bad, even if someone else tells them their bank's app is far better.
From a business perspective this means the company is investing exactly the right amount of effort in it. That sucks if you're the end user obviously, but if the user won't leave when their experience is horrible that's on them.
Yeah, I was also surprised, and it is not easy to dispute the flag. I didn't want people to perceive the article as controversial or to be offended by it; I stated multiple times that it is just a sum of anecdotes and personal experiences of mine. No one should feel offended by that, IMHO.
My objection wasn't the controversy - it was the lack of substance and focus. I wouldn't even call it particularly controversial. It's the same broad, general complaining you hear an awful lot around these parts.
If you're going to be negative then you have to go the extra mile to make a particularly compelling, well-argued case.
The following paragraph is a good representation of the whole article:
> Now that I have defined what quality is, I am going to give my personal opinion on what’s broken with software nowadays. I must warn you that you will see many direct references to either people or groups of people in this section. Needless to say that I do not intend to blame anyone, any group or profession for the mess that we’re in. I just want to provide my view, shaped by more than 10 years of experience working in this industry. It’s just a personal opinion based on anecdotes, so don’t get defensive or take this personally in any way.
Software has _never_ been more reliable than it is now. Or, to rephrase less euphemistically: it has _always_ been this bad (except in some areas where higher quality is necessary).
The problem is "just" that more and more "software" is used.
Offshoring is a big part, no doubt. Another big part is that almost no one's comp structure rewards things like bug fixes, quality, or perf work; people only get credit for shipping new features, etc. This is why everything is basically just hollow chocolate rabbits.
I'm not sure offshoring is always to blame. Onshore teams that just don't care will produce equally crappy results, and I think that's even more likely to happen. Or they make software worse in other ways, like overengineering it or wasting system resources...
It is weird how no one, even here, sees this. Peddling the "latest and greatest" crap like Next.js, which is clearly iterated through bug-ridden stages to make sure the parent company stays ahead. Dissing decade-old stuff that is stable and robust in favour of horrible VC-backed dev solutions that are increasingly worse, but that push the stable, easy, nice stuff out of the picture because of GitHub and X likes.
He says games didn't require patches "back then" (when he was in college; I don't know when that was).
But let's take Doom, for example. As much of a technological marvel as it was, it was created by a handful of people. How many people are involved in creating an AAA game these days?
Or cars. Yeah the seat heater never broke in my old, first-gen Mazda 3. Because it didn't have one.
Respectfully, this does not rebut my argument. We can find endless examples where consumers are ripped off with super high prices and get bad quality in return. Staying in the realm of games, Cyberpunk, for example, is unplayable even after more than 10 patches. Many games are unplayable at launch. This never happened before.
It probably has gotten worse. But I'd chalk that up to companies' greed to meet quarterly milestones and whatnot, not to devs being distracted by TikTok.
I count at least five versions identifiable by patches, with version numbers ranging from 1.1v to 1.9v. So patches have been available for a long time, and initial releases were often quite buggy.
This is a hot take. But an interesting one. I applaud the author for publishing.
Passionate product people (designers, PMs, but engineers as well) are hard to find, hard to train, hard to empower in this day and age, in the sea of tech workers they have to collaborate with, who joined purely for money.
Edit: To illustrate the difference, there are teammates who curiously ask “but what if the user does this… or that?” And then there are people who never even care.
I think it's a pretty straightforward culture problem: modern developers think they're more important than they actually are. Everybody solely prioritizes "developer experience", but nobody even pretends to care at all about the actual end user; in fact, most software is now actively and deliberately hostile towards the end user, as well as running like shit.
I mean, it would actually be reasonable to have some statistics about numbers of users, lost data, etc. before drawing conclusions about actual reliability.
> I personally considered storing data in the cloud safer than on a portable drive. It is definitely not!
Pretty sure software is what enables you to store data on that portable drive.