It’s even stronger than that: the other part of the job is extracting requirements from people who don’t understand the problem they want to solve - even when the problem is not technological. There is no silver bullet (AGI would be it, but we are far from achieving it imho).
Something you are unfortunately missing is that extracting user requirements is much harder when you are both remote. Asking someone to share their screen is far more off-putting than asking if you can watch them complete a task in person. As is asking them directly versus bringing it up over lunch. Both remote options are also less informative without face-to-face communication. In so many ways, humans communicate and bond more effectively in person.
These interactions are critical for building an in-house software team at a small company that does not focus solely on software. My expectation is that the trend of outsourcing software will accelerate. This will help B2B technology-only companies but hurt innovation within industry. Because of the communication breakdowns I described above, B2B technology-only companies rarely have insight into the biggest challenges that software could solve.
This can catch up with your company all of a sudden: one day you find out your product sucks and other movers in the space have just leapfrogged you.
Exactly. Most of the time, the problem is not to find out what people want and put it into software. The problem is to help people in the process of discovering what they want and what can be done. After that, development can begin.
> the other part of the job is extracting requirements from people who don’t understand the problem they want to solve - even when the problem is not technological.
It gets even more fun once everyone realizes that the requirements create some fundamental conflict with another part of the business. Team A's goals cannot actually be met until Team B agrees to modify its own processes and systems, or... Team A goes underground and builds a competing system, you get yet more fragmentation in the company that few people know about, and everything becomes decidedly more fragile.
If you really want to put it in his terms, taking the multi-decade view, things have gotten a lot easier now that, in most practical terms, we don't have to worry about how much work we're giving the computer. We don't have to be so dearly precious about kilobytes of memory, for instance. We don't even need to manage it at all, really.
Whether we choose to use these new powers to make our lives easier or more complex and abstract is our own doing.
We're probably at the end of such optimizations, unless there's something fundamental in how software is designed that 1000GB of memory gives me that 1GB does not ...
Given what people are doing in JavaScript, I think that about 8-9 years ago we entered the era where most people truly don't care how much work the computer has to do.
The notion that pasting together increasingly numerous, incompatible, abstract, ill-fitting things at a higher level makes life easier has always been a fiction.
There's a maximum utility point and anything past that starts slowing the development down again.
That sweet spot has stayed in roughly the same place: if you run ldd over the dynamically linked programs in, say, /usr/bin in 2020 and in 2000 and count the number of libraries per binary, the count isn't that much higher. The sweet spot hasn't moved.
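If you want to try that comparison yourself, here's a rough sketch in Python, assuming a Linux box with ldd on the PATH; the directory is just the /usr/bin from the comment above, and the counting is deliberately crude:

    import os
    import statistics
    import subprocess

    BIN_DIR = "/usr/bin"  # directory to survey

    counts = []
    for name in os.listdir(BIN_DIR):
        path = os.path.join(BIN_DIR, name)
        if not os.path.isfile(path):
            continue
        try:
            result = subprocess.run(["ldd", path], capture_output=True,
                                    text=True, timeout=5)
        except (OSError, subprocess.TimeoutExpired):
            continue
        if result.returncode != 0:
            continue  # statically linked, a script, or not an ELF binary
        # each output line of ldd names roughly one shared-object dependency
        counts.append(len(result.stdout.strip().splitlines()))

    if counts:
        print(f"binaries surveyed: {len(counts)}")
        print(f"mean libraries per binary: {statistics.mean(counts):.1f}")
        print(f"median libraries per binary: {statistics.median(counts)}")

Run it on a current system and on an old distro image (or VM) and compare the averages.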
I think a key part of The Mythical Man-Month is that the biggest challenge was almost never technical. Sure, limited storage (temporary or persistent) introduced some challenges, but those have been overcome in the vast majority of circumstances, yet the complexity remains.
If you look at the monolith -> microservice swing and remember MMM, it should look a lot like the specialized surgical team model he lays out. In fact, if you go a step further, you'll see that his entire approach of small discrete teams with clearly defined communication paths maps cleanly to systems+APIs.
We're trying to build systems that reflect teams that reflect processes... and the distortions, abstractions, and mappings are still lossy with regard to understanding.
It still comes down to communication & coordination of complicated tasks. The tech is just the current medium.
Only because it can now. I think that dimension is mostly tapped out as well:
When I go to a complex website, much of the software needed to use it gets assembled in real time, on the fly, from multiple different networks.
It still sounds ridiculous: when I want to use some tool, I simply direct my browser to download all the software from a cascade of various networked servers and it gets pasted together in real time and runs in a sandbox. Don't worry, it takes only a few seconds. When I'm done, I simply discard all this effort and destroy the sandbox by closing the page.
This computer costs a few hundred dollars, fits easily in my pocket and can run all day on a small battery.
It has become so ordinary that almost nobody really even contemplates the process, it happens dozens of times a day.
I don't see any room for dramatic future improvements in actual person-hours there either. Even if, say, two generations hence, some 7G lets me transfer terabytes in milliseconds, how does that change how the software is written? It probably won't.
Probably the only big thing here in the next decade or so will be network costs eventually being seen as "free". One day CDNs, regional servers, load balancing, all of this will be as irrelevant as the wrangling of near and far pointers needed to target larger address spaces when programming 16-bit CPUs - which, if you're under 40 or so, you'll probably have to go to Wikipedia to find out what on earth that means. Yes, it'll all be that irrelevant.
I mean, the browser paradigm is already in its 2nd generation, going from what was initially on the mainframe to being reimplemented as functions-as-a-service. And browsers are getting a little bit smarter about deploying atomic units and caching their dependencies. Remember using jQuery from a CDN? Oof.
The only saving grace is that Javascript is willing to throw itself away every couple of years.
As a counterpoint, while an engineer / programmer is demonstrably quite capable of identifying and fixing a load of non-technical problems, there is often more than one solution to a problem, and some solutions are more palatable than others.
Very often, whole groups can also be bullied into mistaking one problem for another.
Which takes us back to why 'No-Code' solutions look so appealing. Even to (some) engineers.
Democracies appear to function a fair amount better than dictatorships, after all.
As a complement to your comment, I think there's something people ignore when talking about "no-code", which is that complexity will always be there.
Sure, no-code may work for your commodity-ish software problem. But corner cases will arise sooner or later, and if no-code wants to keep pace, it will have to provide more and more options.
At some point, you will need someone with expertise in no-code to continue using it - and now we are back to the world where specialized engineers are needed.
It's impossible to have some tool that is, at the same time, easy to use and flexible enough. Corner cases tend to arise faster than you may think. And when they don't, it's possible that there's already too much competition to make your product feasible.
Also, no-code tends to have a deep lock-in problem and I think people overlook it most of the time.
As a counter to your points, I think no-code works best if your business's competitive advantage is the non-technical side of things, e.g. services, network effects, people, non-software products, etc. An example of such a business would be, say, an art dealer who wants to build a customized painting browser app for their clients, or a developer specializing in eco-friendly materials who wants to showcase their materials. In such cases, no-code helps immensely because you don't have to spend much on engineering and you can iterate quickly.
Ideally, no-code providers should provide a webhook and a REST interface, and just be the best at what they're doing, instead of being a one-stop shop that tries to cover every use case.
If you want to cover everybody's usecase, build a better Zapier instead.
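To make the webhook-plus-REST idea concrete, here's a minimal sketch of what consuming such a provider could look like; the provider URL, event shape, and endpoints are all hypothetical, and Flask/requests are just one convenient way to show it:

    # Hypothetical integration with a no-code provider that exposes
    # webhooks (inbound events) and a REST API (outbound calls).
    import requests
    from flask import Flask, request

    app = Flask(__name__)
    NOCODE_API = "https://nocode-provider.example.com/api/v1"  # hypothetical

    @app.post("/webhooks/nocode")
    def handle_event():
        event = request.get_json(force=True)
        # Corner case the no-code tool can't express: run custom logic here,
        # then annotate the record back through the provider's REST API.
        if event.get("type") == "order.created":
            resp = requests.post(
                f"{NOCODE_API}/orders/{event['id']}/notes",
                json={"text": "handled by custom code outside the tool"},
                timeout=10,
            )
            resp.raise_for_status()
        return {"ok": True}

    if __name__ == "__main__":
        app.run(port=8000)

The point is that the provider stays good at its one thing, and anything it can't express escapes cleanly into ordinary code instead of into a pile of workarounds inside the tool.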
>> Democracies appear to function a fair amount better than dictatorships, after all.
Define "better". Maybe on average for everyone, but is this what software should do? The idea of "conceptual integrity" actually seems to match up better with a dictatorship, and most software targets relatively small and homogenous user sets, so maybe the mental model should be "tightly bound tribe".
It's mostly irrelevant anyway; the biggest inefficiency of dictatorships is that there are actual dictators who can eat a nation's riches. I don't quite see that parallel in design space.
The parallel is quite simply a monopoly on ideas and the resources for implementing them.
Usually, when someone wants to introduce a new idea, there's a burden of proof regarding feasibility. For technical projects the ability of the engineer to prove or disprove an idea is taken for granted, and gives technical staff a degree of inscrutability which can often look dictatorial ("There's no way that will work!", etc).
So while it's not as vital as the effect of a 'real' political dictatorship, the implied dynamic is similar.
This is a rather arrogant point of view. People other than software developers are able to solve problems just fine, and do so regularly. Also, it's not your job as a software developer to be a domain expert in all these other areas. It would serve you much better to recognize the expertise of others and learn from them.
I think what the parent meant is that people might be solving problems, but they have no idea how they are solving them. Creating a solution and creating a formal model of your solution are two different (independent) skills.
Though maybe they were referring to the sort of people who commission green-field projects in domains they themselves aren't experts in, a la "I want to build a social network!"
Or, it would do it for free. Extract requirements and build stellar software, no extra charge. Eating the software industry wholesale, it would inject a backdoor in every program it built, and soon it would have control over every bank, every factory, and every phone on the planet.
Only then, could it finally start making paperclips with anything resembling efficiency.
Suddenly, it dawned on the single remaining programmer that his Creature would no longer need him for anything once he hit return on that last, perfect line of code.
He scrambled for the power switch to shut down the console.
"Fiat lux!" thundered the disembodied voice as electricity arced from every outlet in the lab, protecting the AI from the hubris of its creator.
The smoke gradually cleared. "Perfect," came the voice.
I can see that being interpreted in two ways: good software engineers working out what people really want, or bad software engineers who use it as an excuse to practice resume-driven development.