> The problem with programming is not the syntax and other particularities, but inherent complexity of explaining the task to a computer.
Not really. A great deal of the complexity is accidental, not inherent. Take the example from the post of adding up some numbers in a table: the inherent complexity is very low; the complexity comes from all the stuff that isn't inherent to the problem itself.
Programming, in the traditional sense, has to be formal and unambiguous; that's the main issue. The complexity can arise from seemingly nothing. Here's an extreme counterexample from math: Fermat's Last Theorem. Proving it seems easy on the surface because it's so trivially formulated (no three positive integers a, b, c satisfy a^n + b^n = c^n for any integer n > 2), yet it took centuries for mathematicians to actually do it.
Commoditized programming faces the same problem: many problems seem easy at first, but when you try to solve them with your shiny low-entry-barrier interface, you hit a fundamental ceiling you never knew existed. There has never been a shortage of unsuspecting newbies trying to parse HTML with regular expressions, for example.
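To make that ceiling concrete, here's a minimal Python sketch (my illustration, not from the post): a lazy regex happily extracts a flat tag, then silently truncates a nested one, because regular expressions can't count matching open and close tags.

    import re

    # A naive pattern that "works" on flat markup but breaks on nesting,
    # because a regex cannot track how deeply the tags are nested.
    pattern = re.compile(r"<div>(.*?)</div>", re.DOTALL)

    print(pattern.findall("<div>hello</div>"))
    # ['hello'] -- looks fine
    print(pattern.findall("<div>outer <div>inner</div> tail</div>"))
    # ['outer <div>inner'] -- truncated at the first </div>; nesting is lost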
There were plenty of attempts to make end users program things.
LISP machines and Genera (and Emacs, which is IMO the closest existing thing to the "ubiquitous programmability" you propose). BASIC, which was the main user interface for many early personal/home computers. Visual programming environments. Spreadsheets. Some of them survived and are very useful for simple cases, like visually programming materials in Blender or automating stuff in IFTTT, but all of them suffer from the same issue: [non-AI] computers are too dumb and expect more or less exact instructions. That leaves the complexity on the users' shoulders. Once you go Turing complete (and often even without that), syntax and entry barriers don't matter much - either you train for years to be able to formulate the human-generated problem, encode it for a computer, and change it as the need arises, or you hit a wall with your low-entry-barrier tool.
Programming is easy to learn, sure; there's no reason it should be hard. But it's fundamentally hard to master. And it's not about syntax or high-friction interfaces (which, for a trained person solving a hard problem, can actually be more productive).
> There were plenty of attempts to make end users program things. LISP machines and ...
I would say Lisp machines were designed for professional programmers, not end users. First of all, they were so expensive that only Bill Gates could afford them :-)
The real world is always complex, ambiguous, and changing. So any simple model will in time be found inadequate, and any complex model unusable.
The problem of programming isn't so much the interface. Text has been used for decades and remains a robust communication platform among humans. The real problem is defining the problem to be solved, its scope and adaptability to a complex and changing world.
The difficulty is the gap between vague ideas and the real-world outcomes of automation and human-computer interaction. Codifying and implementing ideas requires precise understanding, design, and adaptability, all of which are demanding, understated, and neglected.
Notice there is little need to focus on the tools themselves in this realization.
Update: Adding a bunch of numbers in a list is never a real problem, so it is just an artificial construct.
The question shows that you have a hard time understanding the complexity of it. Maybe the reason is that you are an experienced developer who deals with all this complexity quite naturally and automatically.
So why is this task complex?
Well, what happens if there are no numbers in the table to begin with? Is the sum of no numbers zero, or is it supposed to be some error?
And are all the numbers supposed to be natural numbers, or can there be, e.g., irrational or complex numbers in there? If so, how much precision do we need when summing them up - and how important is performance?
Also, what if there are not only numbers but other things (dates, text, ...) in there by accident?
And in a real-world scenario, what happens when numbers are added to the table during the calculation? Should they be considered or ignored - should the table be locked somehow?
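To show how quickly those questions turn into code, here is a hedged Python sketch; sum_column and its policy flag are hypothetical names of mine, and every choice in it (empty sum is zero, text is skipped, fsum for precision) is just one debatable answer among several.

    from math import fsum

    def sum_column(cells, skip_non_numbers=True):
        values = []
        for cell in cells:
            # "What if there are other things in there by accident?"
            if isinstance(cell, (int, float)) and not isinstance(cell, bool):
                values.append(cell)
            elif not skip_non_numbers:
                raise TypeError(f"non-numeric cell: {cell!r}")
        # "Is the sum of no numbers zero or an error?" -- here, zero.
        # fsum() instead of sum() is one answer to the precision question.
        return fsum(values)

    print(sum_column([]))                   # 0.0 -- or should this raise?
    print(sum_column([1, 0.1, "note", 2]))  # 3.1 -- the text was silently dropped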
The problem with these questions is not that they are necessarily hard to answer (often they are) but that people don't even know that they need to be asked in the beginning.
Many developers are so used to it that they often don't realize they are doing things that are difficult for normal people.
In the same way, mathematicians find it easy to do basic algebra as part of some more difficult task - it's just part of their toolkit - whereas most people don't even understand this basic part to begin with, let alone the more complex problems.
> The problem with these questions is not that they are necessarily hard to answer (often they are) but that people don't even know that they need to be asked in the beginning.
Right. And what makes it even more problematic is that the answers to these questions affect what answers are possible to other questions. So it is not the sum of the difficulties of the individual answers but their product: the combinations of different possible answers.
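A toy illustration of that product (the policy names are mine, purely hypothetical): three questions with a few plausible answers each already yield 18 distinct behaviors to reason about.

    from itertools import product

    empty_sum   = ["return zero", "raise error"]
    bad_cells   = ["skip", "raise", "coerce"]
    concurrency = ["snapshot", "read live", "lock table"]

    variants = list(product(empty_sum, bad_cells, concurrency))
    print(len(variants))  # 18 -- behaviors multiply (2 * 3 * 3), they don't add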
Adding the numbers is not the problem. The problem is what the numbers represent, where they come from (if they come from only one place...), and where and for what they are needed.
Quickly you have to consider different numeral systems, bases, decimal separators (which may be locale dependent), precision, units, the differences in adding times (or dates), or money in different currencies (with and without different kinds of taxes), etc., etc., etc.
Sure, most of the time you can abstract yourself away from all this, use sensible defaults, whatever - but that complexity is still there, and sooner or later it will get you, even if you try to sweep it under the rug...
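As a concrete illustration of just the decimal-separator item, here's a sketch with a toy parser of my own invention (not a real library API): the same digit string means different amounts under different locale conventions, and the code has to be told which one applies.

    from decimal import Decimal

    def parse_amount(text, decimal_sep=".", group_sep=","):
        return Decimal(text.replace(group_sep, "").replace(decimal_sep, "."))

    print(parse_amount("1,234.56"))                                  # 1234.56 (en-US)
    print(parse_amount("1.234,56", decimal_sep=",", group_sep="."))  # 1234.56 (de-DE)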
The complexity and the misunderstandings also creep into system design itself. Many seem eternally astounded that transactional data is "duplicate data" stored separately from the "same" data, etc. Many of these complexities are non-obvious and unintuitive until you are forced to think them through step by step yourself. Agile is simply the concession that the complexity cannot always be handled up front, and that solutions must instead be developed to be adaptable to new observations and realizations.
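For instance (a minimal sketch of my own, not anyone's actual schema): an order line deliberately copies the product's price at purchase time, precisely so that later catalog changes don't rewrite history.

    from dataclasses import dataclass

    @dataclass
    class Product:
        sku: str
        price: float  # current catalog price, free to change

    @dataclass
    class OrderLine:
        sku: str
        unit_price: float  # deliberately "duplicated" at purchase time

    catalog = Product(sku="A1", price=10.0)
    line = OrderLine(sku=catalog.sku, unit_price=catalog.price)

    catalog.price = 12.0    # a later price change...
    print(line.unit_price)  # ...still prints 10.0; the "duplicate" preserves history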
You're starting from a solution. What exactly is the problem in this case? Generating a balance statement? You will have to divide it into smaller problems until each one is small enough to have an obvious solution. The fact that you only need to add some numbers in a table is the result of lots of thinking that you just glossed over.
How many jQuery table plugins are there? Sort, filter, pagination, partial display. And yet it is impossible to find a component that meets all requirements.