I agree that "easier to understand" could've helped. But things that are hard to understand can get uptake if they're big wins. The bigger problem in my opinion is that the number of cases where Prolog was a big win significantly decreased over time. When it appeared in the '70s, Prolog's declarative-programming approach based on logic had very few peers where you could do even simple textbook examples in as nice a way. But now even SQL (with features like recursive queries) can do a lot of the intro-level Prolog examples. It doesn't have the full logic semantics with unification, but a lot of problems don't need them. SMT solvers, LINQ, and rules engines like Jess/Drools are a few other declarative paradigms that ended up eating into some of what Prolog proponents once saw as its space. If Prolog were up exclusively against FORTRAN77 or K&R C, there would be many problems where it's a big win, but that's not the competition anymore.
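To make the SQL point concrete, here is a minimal sketch of the classic intro-Prolog ancestor example done as a recursive CTE, run through Python's built-in sqlite3. The parent facts and table names are invented for illustration:

```python
# The Prolog version this mirrors:
#   ancestor(X, Y) :- parent(X, Y).
#   ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parent(p TEXT, c TEXT);
    INSERT INTO parent VALUES ('tom', 'bob'), ('bob', 'ann'), ('bob', 'pat');
""")

rows = conn.execute("""
    WITH RECURSIVE ancestor(a, d) AS (
        SELECT p, c FROM parent                    -- base clause
        UNION
        SELECT ancestor.a, parent.c                -- recursive clause
        FROM ancestor JOIN parent ON ancestor.d = parent.p
    )
    SELECT d FROM ancestor WHERE a = 'tom' ORDER BY d
""").fetchall()

print([d for (d,) in rows])  # all descendants of 'tom'
```

No unification, as noted above, but for a query like this none is needed.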
> Arguably, the imperative programming paradigm is a more natural fit with the von Neumann computer architecture
Prolog's lack of popularity suggests that viewing a computer program as a pure (first order predicate) logic construct isn't a powerful way of thinking in general. That is a bit of a blow to all the programmers who seem to secretly want to be mathematicians because that in turn suggests that the logical aspects of programming are subordinate to hardware realities.
I've gotten a lot of joy out of the neural-network fad similarly eclipsing the logic-based AI people. Logic is important and it isn't going away, but reality has too much uncertainty for simple logic to work in practice. The statistically grounded approach makes me happier, and again computer hardware's power is overwhelming the efforts of the logicians to tie everything down to certainties.
Great language though, everyone should take a look at it to see what a different programming model might be.
> Prolog's lack of popularity suggests that viewing a computer program as a pure (first order predicate) logic construct isn't a powerful way of thinking in general.
No... it doesn't suggest that. If there's one thing I've learned over all these years of working with different software and languages, it's that popularity does not correlate with power.
I'll take Linux over Windows, Archlinux over Ubuntu, Haskell over Java, i3 over Gnome, CLI over GUI, vim and shell over IDEs like Eclipse. I've seen "modern" popular languages adopt features from "obsolete" unpopular languages in ways that pale in comparison to the originals, like Python's lambdas and assignment unpacking, which suck.
The rest of your post notwithstanding, I disagree with the logic of your first sentence.
That seems like a misreading of grandparent. Their point was that thinking of programs as logic isn't a powerful way of thinking because it doesn't match the way computers work. That's different from power of programming languages per se.
A language could be very powerful in your sense, but abstract away from some important aspect of computation. (For example, Haskell is very powerful, but the space complexity of a Haskell program is hard to determine from the language spec.) Then such a language wouldn't be a "powerful way of thinking" in the sense of grandparent - in thinking about some aspects of computation it would hinder rather than help. The same is true for Prolog, I think.
I feel it's meant to be a part of the features that try to make Python more viable for functional programming, but when you compare it to the equivalent in Haskell, for example:
Tuple/list unpacking has been in Python for a long time and likely has nothing to do with catering to the functional crowd, which Guido isn't known for to begin with. Anyway...
> * You can't nest, like it was mentioned
((x1, y1), (x2, y2)) = makeLine ...
This works fine in Python:
((x1,y1), (x2,y2)) = ((1,2),(3,4))
I don't know what "makeLine ..." returns, but if it doesn't exactly match the tuples on the left hand side, it won't fly in Haskell either.
line = p1, p2 = ((x1,y1), (x2,y2)) = ((1,2), (3,4))
and is more readable IMHO.
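For concreteness, both forms can be checked in Python 3; make_line is a hypothetical stand-in for the makeLine above:

```python
# Nested and chained tuple unpacking, both valid Python 3.
def make_line():
    return ((1, 2), (3, 4))

((x1, y1), (x2, y2)) = make_line()    # nested unpacking, as in the Haskell example
line = p1, p2 = ((x1, y1), (x2, y2))  # chained assignment: line, p1 and p2 all bind

assert (x1, y1, x2, y2) == (1, 2, 3, 4)
assert line == (p1, p2) == ((1, 2), (3, 4))
```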
Points 3, 4 and 5 don't make sense in Python. This is pattern matching, not tuple/list unpacking. (One could argue that tuple unpacking is a form of pattern matching, but that is a different story...)
First example works fine in both Python 2 and 3. Second example works in Python 2.7 and earlier, but tuple unpacking for arguments was removed in Python 3. [1]
Both of these were intentionally removed around Python 3, IIRC: Python 2.7 allows nested tuple/list destructuring as well as destructuring in function arg lists.
Considering how fashion-driven the programming world is, popularity (or the lack of it) of a language or technology says very little about how powerful it is. Prolog's way is very powerful, it's just very different from just about every other language out there, which makes it hard to learn and harder to master.
(As an aside, in my experience, people who believe that "programming == math" seems to be mostly attracted to languages like Haskell and OCaml, not so much to Prolog.)
Prolog may suffer from a lack of popularity, but I'd argue that's not because it's lacking in power as a means of expressing an idea. Nor is it due to a hardware paradigm mismatch.
Lisp and its variants (especially Clojure) enjoy increasing popularity for all sorts of general-purpose use. Lisp is merely a particular notation for expressing lambda calculus, and is rather far removed from the realities of von Neumann hardware.
I'd argue Prolog's demise is due to three facts: (a) the sorts of ideas best expressed in Prolog have diminished due to new languages becoming available, (b) the remaining ideas best expressed in Prolog are only applicable to a narrow set of problems, and (c) Prolog itself isn't the most ergonomic language to use, so it isn't often people's first choice when alternatives are available.
You don't even have to look at recent languages... probably the best example of (a) is SQL. SQL arrived a couple of years after Prolog in the '70s, but over the decades SQL has absolutely dominated. Unfortunately for Prolog, SQL provides much of the same functionality while also having the advantage of being a more natural fit for business applications, which drove adoption (and quite likely directly hurt Prolog's prospects).
Can you elaborate with an example or two? I’ve always been fascinated by Prolog and thought of it as a language that is very well suited to a specific class of problems. It would be interesting to know what class of problem Prolog excels at and why newer languages or new features in existing languages can do what only Prolog did before.
I'm afraid I can't -- sorry. I'm not a Prolog practitioner. My above observations are just culled from what I've read over the years. I suspect some kinds of expert systems remain best implemented using some Prolog.
Perhaps someone who uses Prolog regularly can chime in.
I would argue that lambda calculus without pervasive lazy evaluation is not lambda calculus. Lisps in general don't have it (unlike Haskell), which makes them not so far removed from hardware (once hardware stacks were added to CPUs). For a long time, Lisps didn't even have lexical closures!
Prolog is really a database query language. Similar to SQL or GraphQL. Prolog failed because it never got integrated into a serious enterprise data storage engine. (Most Prolog implementations are just text files and in-memory hash tables.)
I thought Datalog did get integrated into some "proper" storage engines. It is a subset of Prolog, and expressively more like SQL, and designed for query optimisation etc. It was certainly taken seriously for databases for quite a while.
> Logic is important and it isn't going away, but reality has too much uncertainty for simple logic to work in practice. The statistically grounded approach makes me happier, and again computer hardware's power is overwhelming the efforts of the logicians to tie everything down to certainties.
http://incompleteideas.net/IncIdeas/BitterLesson.html
takes this even further, suggesting that even the statistically grounded, human-enriched approaches are losing ground to simpler methods fueled by hardware advances.
I think we've got about 12 years of that left. There are three hardware waves coming:
1. In about 12 years we will have another "tick" of general-purpose compute doubling. This will happen as sensible, known architectural changes and the slowed progress of Moore's-Law-style hardware development come together.
2. We will have a wave of heterogeneous and specialist hardware architectures that will bump things along.
3. The hardware manufacturers will crack and license / sell servers on a core/hour basis. This will allow people to burst compute on hundreds or thousands of cores without the always-on capital commitment. This won't impact the cloud providers, but it will give on-prem a new lease on life, and give people who can't migrate to the cloud due to legacy systems etc. an escape hatch.
After that, I see a real choke on compute progress; expect a doubling every 50 years at best. The current "Moore's" rate is 20 years, but that's based on the current investment-fat industry. Once investors get their heads around the technology realities, I expect all the cash to come out of chip making really quickly. Innovation will come to a crashing halt.
At that point it's going to be software or bust for AI. I predict software...
> Prolog's lack of popularity suggests that viewing a computer program as a pure (first order predicate) logic construct isn't a powerful way of thinking in general. That is a bit of a blow to all the programmers who seem to secretly want to be mathematicians because that in turn suggests that the logical aspects of programming are subordinate to hardware realities.
Of course it's powerful. The problem is that it's far removed from, and in many cases in conflict with, how computers actually perform computation.
>> That this process of computation is difficult to grok is especially noticeable when you try to debug a Prolog program. Computations get undone when attempts at satisfying a goal fail; other computations get retried down different branches, resulting in different unifications, and worst of all, the order in which you wrote your clauses in the program makes a difference to how it gets executed and, indeed, whether any part of the program is reachable.
Actually, Prolog's clause selection rule that relies on clause ordering in the database is a boon when it comes to understanding backtracking during debugging ("tracing", please). You know that if you have two clauses of the predicate p/2:
p(a).
p(b).
And you make the query:
?- p(A).
The interpreter will first find the result p(a) and then backtrack to p(b). You know the order in which choice points will be created, that is. This makes it infinitely easier to trace a Prolog program than a hypothetical (and very impractical) "purely" declarative language where clause order wouldn't matter.
Now, tracing a complex program with lots of recursive calls: that can be difficult. But that's not because of backtracking. It's because of the way Prolog "unfolds" recursion, which is something I'd have trouble explaining even after ten-ish years of coding in Prolog. It's something you have to develop a feeling for, after tracing a sufficient number of recursive programs. Now _that_ I'd agree is a difficulty that may keep programmers from using the language. But backtracking? I don't agree.
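The clause-order point can be sketched outside Prolog too. A toy Python model (not real Prolog, just an illustration) where each clause is a yield and backtracking is resuming a generator:

```python
# Clauses are tried top to bottom; backtracking resumes the generator.
def p():      # p(a).  p(b).
    yield 'a'
    yield 'b'

def q():      # q(b).
    yield 'b'

def goal():   # ?- p(X), q(X).
    for x in p():
        for y in q():
            if x == y:   # "unification" reduced to equality on atoms
                yield x

assert list(p()) == ['a', 'b']   # p(a) first, then backtrack to p(b)
assert list(goal()) == ['b']     # the binding X = a is undone when q(a) fails
```

The point is that the solution order is fully determined by the textual order of the clauses, which is what makes tracing predictable.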
That's not the fault of the language though. If you get taught, say, C# and you're never taught how to debug it, you'd have the same problem.
The four-port debugger takes some explaining, but it's not the end of the world. It's actually a very conceptually simple way to understand Prolog's execution model. It's a shame that it's not taught more often.
I've been reading so much about the death of symbolic AI and the complete victory of statistical approaches lately, that I suspect that a new symbolic hype-cycle is about to start.
I know this sounds counter-intuitive, but if you've been following such trends for a while, you might agree with me.
I think more pure logical languages have had an impact since Prolog, particularly variants of Datalog.
The thing with "Datalog" is that it is really a level of functionality implemented in various database query systems, not a well-defined language in and of itself. 15 years ago I remember searching for papers about it and not finding many; now it is hot.
The painful thing about Prolog, I think, is the mashup of declarative and imperative, it just doesn't come across as natural.
> The thing with "Datalog" is that it is really a level of functionality implemented in various database query systems, not a well-defined language in and of itself. 15 years ago I remember searching for papers about it and not finding many; now it is hot.
Datalog as a language is really just one very specific form of rules (first-order Horn implications containing just constants and variables, where each variable in the conclusion also occurs in the premise), and every Datalog program (i.e., every set of rules) is guaranteed to have a finite, universal model.
[0] Ceri, Gottlob, Tanca. (1989) What you always wanted to know about Datalog (and never dared to ask). IEEE Transactions on Knowledge and Data Engineering.
[1] Abiteboul, Hull, Vianu. (1994) Foundations of Databases: The Logical Level. Pearson.
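Since every Datalog program has a finite universal model, it can be evaluated bottom-up to a fixpoint. A toy naive evaluator in Python for the usual two ancestor rules (the parent facts are invented for illustration):

```python
# Naive bottom-up evaluation of:
#   ancestor(X,Y) :- parent(X,Y).
#   ancestor(X,Y) :- parent(X,Z), ancestor(Z,Y).
# Rules only derive facts over constants already present, so iteration
# reaches a fixpoint: the finite universal model.
parent = {('tom', 'bob'), ('bob', 'ann'), ('bob', 'pat')}

ancestor = set(parent)                       # apply the first rule
while True:
    derived = {(x, y)                        # apply the second rule
               for (x, z) in parent
               for (z2, y) in ancestor
               if z == z2}
    if derived <= ancestor:                  # nothing new: fixpoint reached
        break
    ancestor |= derived

assert ('tom', 'ann') in ancestor
assert len(ancestor) == 5
```

Real engines use semi-naive evaluation (only joining against newly derived facts each round), but the termination argument is the same.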
Prolog is a terrible "programming language" - and I would never try to use it as such. But it is (at least conceptually) an excellent query language. In that regard, it is much closer to SQL, as is its domain of reasonably applicable problems.
Clarification: I don't actually think Prolog is terrible (I credit it as the most exciting language I have ever learned); I just mean it's not intuitive to "program" with in the imperative sense of telling a computer what to do. What I mean by "excellent as a query language" is: given a set of data, it is great for drawing conclusions from that data (but not in the same way as a "traditional" query language like SQL).
It is a declarative language like SQL. If you come from traditional imperative programming side, it is very difficult for you to get used to. I forgot who said it : "With imperative style, you code the precise steps to get what you want. With declarative style, you describe what you want and you let the language/runtime/OS/etc get it for you."
If you're used to C/C++/Java, then SQL or Prolog seems frustrating, unnatural and non-intuitive. But once you have the eureka moment and realize they are declarative, you appreciate the elegance and power of SQL or Prolog.
I like Prolog, I just think that trying to use it as a traditional programming language for an average task is like using a sponge to bail water out of a boat - you can do it, but it is hardly the most effective tool. :) That said, there are very specific problem domains in which Prolog excels.
Prolog is not a good language for querying databases (if that's what you mean by "query language"). For one thing, it lacks a SELECT statement or list comprehensions and the like. If you want all results of a goal, you have to use one of the bagof/setof/findall predicates, or roll your own.
On the other hand, Prolog programs are logic theories (as are Prolog queries) and their execution is a proof. The range of programs that can be expressed in Prolog is the set of programs computable by a Turing machine. So yes, Prolog is a programming language. Whether it's "terrible" or not is up to personal taste.
I mean, I don't know of an objective measure of what makes a programming language "terrible".
Software has evolved in a direction where sending sets around is natural. That is no evidence that sending theorems around is problematic; it's just that current communication implementations happen to be very data-friendly and logic-unfriendly.
There is probably a very human reason for that, but mathematically both representations are perfectly replaceable.
I've used Datalog (which is essentially a stripped down version of Prolog) in the past for a pointer analysis. Essentially, you literally print out the inference rules and then the basic facts to a text stream, and then you use interactive queries on the text stream.
You might look at how datomic or datascript use a limited subset of prolog (datalog, with a lispy syntax) to express queries: you end up doing things like:
[[?person :name "Bob"]
[?person :state :tx]]
To query for “the person named Bob who lives in Texas”
You might find https://github.com/graknlabs/grakn of interest. Datalog was one of our main sources of inspiration. The query mentioned in this thread (the person named Bob who lives in Texas) would be something like:
$x has name 'Bob';
($x, $y) isa lives; $y has name 'tx';
Programming is about building abstraction upon abstraction.
If you don't understand the abstractions upon which you're building, you'll have trouble building upon them.
Unification and backtracking take some effort to grok. If you grok them you can put them to use in a clean and efficient way. If not then Prolog remains a mystery.
I think the complaint given about Prolog is about as applicable to SQL, which is widely used. Oddly, I could imagine it being easier to use as a query language than SQL in some circumstances.
> ...by far the biggest cognitive problem that they have with this language is understanding what the interpreter is doing at any point in time. Prolog’s attempt at being declarative ... is the problem: how to get a computer to do something without telling it what to do?
I mean, people use SQL just fine without understanding how the DB is going to accomplish their queries.
Maybe Prolog just doesn't have the same level of tooling as SQL for deducing "what's going to happen", e.g. an equivalent to SQL's `EXPLAIN ANALYZE`?
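For comparison, here is what that tooling looks like on the SQL side, via sqlite3's EXPLAIN QUERY PLAN. The exact plan text varies by SQLite version, so this is only a sketch:

```python
# Ask SQLite to describe how it would execute a query, without running it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t(x INTEGER PRIMARY KEY, y TEXT)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT y FROM t WHERE x = 1"
).fetchall()

for row in plan:
    print(row)  # the detail column describes the chosen access strategy
```

This is the kind of "what is the engine actually going to do" introspection that declarative languages need; Prolog's four-port tracer is the closest analogue, but it shows execution after the fact rather than the plan up front.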
> The traversal of a search space in which choice-points are introduced whenever multiple clauses match the current computational goal and a process of (possibly partial) variable instantiation ... and worst of all, the order in which you wrote your clauses in the program makes a difference to how it gets executed and, indeed, whether any part of the program is reachable.
Some people (more than use Prolog) use Erlang—even the parts like chained binary pattern-matching†—just fine. And some people (still more than use Prolog) use the MLs just fine, too, including functional combinators and passing around monadic bindings, despite this playing utter hell on determining "whether any part of the program is reachable."
† `foo(<<A:32, B:A/binary, Rest/binary>>)` — an Erlang clause-head which takes a binary string, and attempts to unify the variable A with the first four bytes of it (read as a uint32), B with the next A bytes of it, and Rest with, well, the rest of it. I.e. A is used to calculate the bounds of a slice on B, all during the attempt to pick a clause to execute. This is common, idiomatic code.
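For readers who don't know Erlang, roughly the same length-prefixed slicing can be sketched in Python, assuming A is read as a big-endian uint32 byte count (the function name and framing are invented for illustration):

```python
# Length-prefixed slicing: read a uint32 length A, slice the next A bytes
# as B, keep the rest. A short payload is a match failure, mirroring how
# an Erlang clause-head that fails to unify falls through to the next clause.
import struct

def foo(data: bytes):
    (a,) = struct.unpack_from(">I", data, 0)
    b, rest = data[4:4 + a], data[4 + a:]
    if len(b) != a:
        raise ValueError("no matching clause")
    return a, b, rest

a, b, rest = foo(struct.pack(">I", 3) + b"abcXYZ")
assert (a, b, rest) == (3, b"abc", b"XYZ")
```

The difference is that Erlang does all of this declaratively in the clause-head, during clause selection, rather than in the function body.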
> That this process of computation is difficult to grok is especially noticeable when you try to debug a Prolog program. Computations get undone when attempts at satisfying a goal fail...
People write Solidity code for the Ethereum VM just fine. (In fact, this one is kind of hilarious; Prolog is less popular than even an arcane programming environment like the EVM—where all function calls are implicitly nested MVCC transactions that roll back any side-effects upon their Turing-machine substrate upon any trap or fault, including even rolling back the emission of logging statements and the reservation or nullification of memory.)
Declarative programming in general is notorious for being difficult to debug. SQL is the only language I can think of where this notoriety doesn't come up, and I'm honestly not sure if that's merely because I've never interacted with anyone who has had to deal with sufficiently complex queries. But the insanity that's involved with complex regular expressions, parser generators, even Python's decorator system in complex declarative projects (e.g., some build system tooling I've worked on) is commonly complained about.
The standard 'debugging' problem in SQL is query optimization, namely figuring out how to make a query return in seconds instead of a week. Many engineers simply can't solve it, so a typical solution is to hand it over to a DBA and go have coffee.
> I've never interacted with anyone who has had to deal with sufficiently complex queries
This might be a big part of it. SQL is a limited domain, with limited tools available. Doing "general purpose" programming with SQL requires advanced trickery and is implementation-dependent.
Another example might be Excel, which has a declarative interface (i.e. a GUI) and a semi-declarative formula syntax.
The difference between your Erlang example and Prolog is that Prolog "throws away" computation. The whole predicate has to match, not just the first pattern. This difference leads to very hard to predict control flow.
Theoretically, Erlang does this as well; it just restricts the set of functions you can use in a clause-head to a language-defined set† of pure (and almost entirely O(1)‡) functions, such that you don't really worry about how much or little computation is going on during clause-head unification.
Ah, sorry, yeah. I did a few good Erlang projects years ago, and am an Elixir developer now, but I never used the binary pattern matching syntax while I was writing Erlang, only needing to learn it once I was already over in Elixir-land, so I was trying to map backwards from the Elixir variation on the syntax.
>"worst of all, the order in which you wrote your clauses in the program makes a difference to how it gets executed and, indeed, whether any part of the program is reachable"
That's because Prolog's clause-selection algorithm uses committed choice - if the logic could be run using efficiently grounded answer sets, then the behaviour could be made consistent, and that would make the semantics a whole lot clearer. Especially if ! was done away with as well.