> Arguably, the imperative programming paradigm is a more natural fit with the von Neumann computer architecture
Prolog's lack of popularity suggests that viewing a computer program as a pure (first order predicate) logic construct isn't a powerful way of thinking in general. That is a bit of a blow to all the programmers who seem to secretly want to be mathematicians because that in turn suggests that the logical aspects of programming are subordinate to hardware realities.
I've gotten a lot of joy out of the Neural Networking fad similarly eclipsing the logic-based AI people. Logic is important and it isn't going away, but reality has too much uncertainty for simple logic to work in practice. The statistically grounded approach makes me happier, and again computer hardware's power is overwhelming the efforts of the logicians to tie everything down to certainties.
Great language though, everyone should take a look at it to see what a different programming model might be.
> Prolog's lack of popularity suggests that viewing a computer program as a pure (first order predicate) logic construct isn't a powerful way of thinking in general.
No... it doesn't suggest that. If there's one thing I've learned in all these years of working with different software and languages, it's that popularity does not correlate with power.
I'll take Linux over Windows, Arch Linux over Ubuntu, Haskell over Java, i3 over Gnome, CLI over GUI, vim and the shell over IDEs like Eclipse. I've seen "modern" popular languages adopt features from "obsolete" unpopular languages in ways that pale in comparison to the originals, like Python's lambdas and assignment unpacking, which suck.
The rest of your post notwithstanding, I disagree with the logic of your first sentence.
That seems like a misreading of grandparent. Their point was that thinking of programs as logic isn't a powerful way of thinking because it doesn't match the way computers work. That's different from the power of programming languages per se.
A language could be very powerful in your sense, but abstract away from some important aspect of computation. (For example, Haskell is very powerful, but the space complexity of a Haskell program is hard to determine from the language spec.) Then such a language wouldn't be a "powerful way of thinking" in the sense of grandparent - in thinking about some aspects of computation it would hinder rather than help. The same is true for Prolog, I think.
I feel it's meant to be part of the features that try to make Python more viable for functional programming, but when you compare it to the equivalent in Haskell, for example:
Tuple/list unpacking has been in Python for a long time and likely has nothing to do with catering to the functional crowd, which Guido isn't known for to begin with. Anyway...
> * You can't nest, like it was mentioned
((x1, y1), (x2, y2)) = makeLine ...
This works fine in Python:
((x1,y1), (x2,y2)) = ((1,2),(3,4))
I don't know what "makeLine ..." returns, but if it doesn't exactly match the tuples on the left hand side, it won't fly in Haskell either.
line = p1, p2 = ((x1,y1), (x2,y2)) = ((1,2), (3,4))
and is more readable IMHO.
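For what it's worth, that chained assignment really does bind every target in one statement. A minimal demo (the variable names are just the ones from the example above):

```python
# Chained assignment: the right-most expression is evaluated once,
# then assigned to each target list from left to right.
line = (p1, p2) = ((x1, y1), (x2, y2)) = ((1, 2), (3, 4))

print(line)             # → ((1, 2), (3, 4))
print(p1, p2)           # → (1, 2) (3, 4)
print(x1, y1, x2, y2)   # → 1 2 3 4
```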
Points 3, 4, and 5 don't make sense in Python. This is pattern matching, not tuple/list unpacking. (One could argue that tuple unpacking is a form of pattern matching, but that is a different story...)
First example works fine in both Python 2 and 3. Second example works in Python 2.7 and earlier, but tuple unpacking for arguments was removed in Python 3. [1]
Both of these were intentionally removed around Python 3, IIRC: Python 2.7 allows nested tuple/list destructuring as well as destructuring in function arg lists.
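A quick sketch of the split in Python 3: nested unpacking in assignments still works fine, but tuple unpacking in function parameters was removed by PEP 3113 and is now a syntax error:

```python
# Nested destructuring in an assignment still works in Python 3:
((x1, y1), (x2, y2)) = ((1, 2), (3, 4))
assert (x1, y1, x2, y2) == (1, 2, 3, 4)

# Tuple parameters, legal in Python 2 (e.g. `def f(p, (x, y)): ...`),
# were removed in Python 3 by PEP 3113:
try:
    compile("def midpoint(p, (x, y)): pass", "<test>", "exec")
except SyntaxError:
    print("tuple parameters are a SyntaxError in Python 3")
```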
Considering how fashion-driven the programming world is, the popularity (or lack of it) of a language or technology says very little about how powerful it is. Prolog's way is very powerful, it's just very different from just about every other language out there, which makes it hard to learn and harder to master.
(As an aside, in my experience, people who believe that "programming == math" seem to be mostly attracted to languages like Haskell and OCaml, not so much to Prolog.)
Prolog may suffer from a lack of popularity, but I'd argue that's not because it's lacking in power as a means of expressing an idea. Nor is it due to a hardware paradigm mismatch.
Lisp and its variants (especially Clojure) enjoy increasing popularity for all sorts of general-purpose use. Lisp is merely a particular notation for expressing lambda calculus, and is rather far removed from the realities of von Neumann hardware.
I'd argue Prolog's demise is due to three facts: (a) the sorts of ideas best expressed in Prolog have diminished due to new languages becoming available, (b) the remaining ideas best expressed in Prolog are only applicable to a narrow set of problems, and (c) Prolog itself isn't the most ergonomic language to use, so it isn't often people's first choice when alternatives are available.
You don't even have to look at recent languages... probably the best example of (a) is SQL. SQL arrived a couple of years after Prolog in the '70s, but over the decades SQL has absolutely dominated. Unfortunately for Prolog, SQL provides much of the same functionality while also being a more natural fit for the business applications that drove adoption (and quite likely directly hurt Prolog's prospects).
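As a rough illustration of that overlap: the classic Prolog ancestor rule (`ancestor(X,Y) :- parent(X,Y).` / `ancestor(X,Y) :- parent(X,Z), ancestor(Z,Y).`) can be written as a recursive CTE in SQL. A hypothetical sketch using Python's built-in sqlite3 (the family facts are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE parent (p TEXT, c TEXT)")
con.executemany("INSERT INTO parent VALUES (?, ?)",
                [("tom", "bob"), ("bob", "ann"), ("ann", "joe")])

# Transitive closure of parent/2, analogous to Prolog's ancestor/2.
rows = con.execute("""
    WITH RECURSIVE ancestor(a, d) AS (
        SELECT p, c FROM parent
        UNION
        SELECT parent.p, ancestor.d
        FROM parent JOIN ancestor ON parent.c = ancestor.a
    )
    SELECT d FROM ancestor WHERE a = 'tom' ORDER BY d
""").fetchall()

print([d for (d,) in rows])  # → ['ann', 'bob', 'joe']
```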
Can you elaborate with an example or two? I’ve always been fascinated by Prolog and thought of it as a language that is very well suited to a specific class of problems. It would be interesting to know what class of problem Prolog excels at and why newer languages or new features in existing languages can do what only Prolog did before.
I'm afraid I can't -- sorry. I'm not a Prolog practitioner. My above observations are just culled from what I've read over the years. I suspect some kinds of expert systems remain best implemented using some Prolog.
Perhaps someone who uses Prolog regularly can chime in.
I would argue that lambda calculus without pervasive lazy evaluation is not lambda calculus. Lisps in general don't have it (unlike Haskell), which makes them not so far removed from hardware (once hardware stacks were added to CPUs). For a long time, Lisps didn't even have lexical closures!
Prolog is really a database query language. Similar to SQL or GraphQL. Prolog failed because it never got integrated into a serious enterprise data storage engine. (Most Prolog implementations are just text files and in-memory hash tables.)
I thought Datalog did get integrated into some "proper" storage engines. It is a subset of Prolog, and expressively more like SQL, and designed for query optimisation etc. It was certainly taken seriously for databases for quite a while.
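For flavor, Datalog's bottom-up ("naive") evaluation is simple to sketch: keep applying the rules to the set of known facts until nothing new appears. A toy, hypothetical version in Python (not how a real engine optimizes queries, but it shows the fixpoint idea):

```python
def naive_eval(facts, rule):
    """Apply `rule` (set of facts -> derived facts) until a fixpoint."""
    known = set(facts)
    while True:
        new = rule(known) - known
        if not new:
            return known
        known |= new

# Facts: parent(P, C).  Base case: ancestor(X, Y) :- parent(X, Y).
parent = {("tom", "bob"), ("bob", "ann"), ("ann", "joe")}

def ancestor_rule(known):
    # ancestor(X, Y) :- ancestor(X, Z), parent(Z, Y).
    return {(x, y2) for (x, z) in known for (z2, y2) in parent if z == z2}

ancestor = naive_eval(parent, ancestor_rule)
print(("tom", "joe") in ancestor)  # → True
```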
> Logic is important and it isn't going away, but reality has too much uncertainty for simple logic to work in practice. The statistically grounded approach makes me happier, and again computer hardware's power is overwhelming the efforts of the logicians to tie everything down to certainties.
http://incompleteideas.net/IncIdeas/BitterLesson.html
takes this even further, suggesting that even the statistically grounded, human-enriched approaches are losing ground to simpler methods fueled by hardware advances.
I think we've got about 12 years of that left. There are three hardware waves coming:
1. In about 12 years we will have another "tick" of general-purpose compute doubling. This will happen as sensible, known architectural changes and the slowed progress of Moore's-law-style hardware development come together.
2. We will have a wave of heterogeneous and specialist hardware architectures that will bump things along.
3. The hardware manufacturers will crack and license/sell servers on a core/hour basis. This will allow people to burst compute across hundreds or thousands of cores without the always-on capital commitment. This won't impact the cloud providers, but it will give on-prem a new lease of life and give people who can't migrate to the cloud due to legacy systems, etc., an escape hatch.
After that, I see a real choke on compute progress; expect a doubling every 50 years at best. The current "Moore's" rate is 20 years, but that's based on the current investment-fat industry. Once investors get their heads around the technology realities, I expect all the cash to come out of chip making really quickly. Innovation will crash to a stop.
At that point it's going to be software or bust for AI. I predict software...
> Prolog's lack of popularity suggests that viewing a computer program as a pure (first order predicate) logic construct isn't a powerful way of thinking in general. That is a bit of a blow to all the programmers who seem to secretly want to be mathematicians because that in turn suggests that the logical aspects of programming are subordinate to hardware realities.
Of course it's powerful. The problem is that it's far removed from, and in many cases in conflict with, how computers actually perform computation.