And changes it to this with a consolidated .PHONY, but still no install target. Surely it should complain “there is no install target, what are you marking it as phoney³ for?”
I will not say that this way of writing it is unreasonable; users unfamiliar with the available targets can benefit from such a thing up the top (though in practice there’s normally a block of other stuff first, and other ways of achieving it). But this is also a reasonable way of writing it, with strong advantages of its own:
Treating .PHONY as more an inline attribute like this rather than a target of its own makes it much more obvious that the install target is missing, and makes desynchronisation of .PHONY and the actual targets much less likely.
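A minimal sketch of that style (target names and recipes are placeholders; recipe indents shown as spaces must be real tabs):

    .PHONY: build
    build:
        ./build.sh

    .PHONY: install
    install:
        ./install.sh

With each .PHONY declaration sitting directly above its rule, a declaration without a rule under it (or a rule without a declaration above it) sticks out immediately.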
—⁂—
¹ The first atrocity is that it’s not even a legal makefile: barring a .RECIPEPREFIX override, recipes must be indented with one tab. Running make on this file produces “Makefile:10: *** missing separator.  Stop.” Taking invalid code and making it valid is not the domain of a linter or formatter. The Markdown is also abysmal: instead of tabs, the output has a single space. Given that tabs are structural in makefiles, this is an astonishing lapse.
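For reference, the two legal spellings of the same rule (a sketch; the second needs GNU Make 3.82+, and the recipe indent in the first is a literal tab):

    greet:
        echo hello

    .RECIPEPREFIX = >
    greet:
    > echo hello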
The second atrocity is the mismatching of .PHONY targets, despite joining the lines (by not having a blank line after the .PHONY line), which is completely unrealistic. Or at least I find it so.
² I have normalised whitespace so we can focus on the .PHONY changes. Assume tabs are where they need to be. Unfortunately HN hates interesting whitespace.
³ Ah, the number of times I’ve written .PHONEY… stupid americentric software.
A while back I attended an open-source conference (which was a lot of fun). After the presentations, people would "set up shop" at tables and jam on whatever was their fancy.
One evening there was a person using make as a SAT solver[0]. That blew my mind to be honest. I had used make for years as a build tool and never thought of it in that problem space.
This memory isn't relevant to this project. I was just reminded of the experience is all.
It’s almost comical to see “why Python” comments after all these years. I would’ve chosen Go to write this, but that’s beside the point.
Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But even then, some Python libraries have bigger communities than the combined communities of all these “better, faster, more awesome” languages.
Python is here to stay. Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts. That’s fine. LLMs write better Python than Go (my preferred language, or whatever yours is). And if you know anything about the AI research community, it’s C++, C, Python or GTFO.
Going forward, a lot more tools will be written in Python, mostly by people entering the field. On top of that, there’s a huge number of active Python veterans churning out code faster than ever. The network effect keeps on giving.
So whatever language you have in mind, it’s going to be niche compared to Python or JS. I don’t like it either. But if languages and tools were chosen on merit instead of tribalism, we wouldn’t be in this JS clusterfuck on the web.
I love python, have used it for years. I hate the dependency and multiple interpreter situation.
A great PL should stand on its own without the need for external tooling.
At this point I have given up on python except for if it’s a little script that only uses standard libraries. Otherwise I’m choosing a compiled language.
I use python without any dependencies on web servers. Pip is cool, but you don't need to get pulled into the node-like dependency hell.
For example, instead of requests you can use http.client; instead of flask, http.server, socketserver.TCPServer, or just socket. If you want sqlite, don't jump to pip install; the sqlite3 module is already in the standard library.
And if you ever need a C library that has no stdlib wrapper, you can trivially call its C API from python directly with the ctypes builtin (or compile a python extension module against the C API).
> It’s almost comical to see “why Python” comments ... Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But ... Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts.
I'm not sure if this is news to you or if you already know it, but, just to be explicit -- you know that the overwhelming majority of end users aren't gonna have `pip` installed on their systems, right? And that any project with "Installation instructions" that begin with a `pip` command aren't really gonna work in the general case?
Just wanna make sure that's well-understood... it's fine if you wanna build a tool in Python, but if you expect it to be practically usable, you need to do distribution of binaries, not `pip` targets...
This point has been pummeled to death for decades. Before Python, people did the same with Ruby and “gem.” Literally nothing is new here.
One of the reasons I write my tools in Go is exactly this. But if the tool was written in Go, people would complain about why not Rust and such. The point wasn’t to convey that Python doesn’t have its fair share of flaws, but to underscore that the HN crowd doesn’t represent any significant majority. The outside world keeps on using Python, and the number of Go or Rust users is most likely less than PyTorch or Scikit-learn users.
Shipping Python is hard and the language is slow. Also, tooling is bad. The newfangled ones are just a few in the long stream of ad hoc tooling over the past 20 years. Yet people write Python and will continue to do so. JS has a similar story, but it’s just a 10x worse language than Python.
Let me be even more explicit: if your installation instructions are `pip install ...` -- or `npm install ...` for that matter -- then you are automatically excluding a super-majority of potential users.
I don’t even write python these days. I just wrote my own version of a terminal llm-caller[^1] in Go for this exact same reason.
There’s a famous one that does the same thing but is written in Python. So it has its issues.
My point is, pip exists on most machines. pip install sucks, but it’s not the end of the world. The HN crowd (including myself) has a tendency to obsess over things that the majority don’t care about IRL.
Some Makefiles use indentation or variable placement as semantic cues. If a tool rewrites them mechanically, it might clean things up while killing meaning. Is structural correctness enough, or do we need formatters that preserve human context too?
Ideally, we'd have linters that preserve the human context as well. But human context may be too ambiguous and too high-variance for that to be practical.
It's hard to say what's intent and what isn't; maybe linters with many custom rules would work best.
> The implicit rule search (see Using Implicit Rules) is skipped for .PHONY targets. This is why declaring a target as .PHONY is good for performance, even if you are not worried about the actual file existing.
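A minimal illustration of that (target names arbitrary; recipe indents are tabs):

    # `check` groups other targets and has no recipe of its own; without
    # the .PHONY line, make would run its implicit rule search (check.c,
    # check.sh, SCCS/s.check, ...) trying to find one.
    .PHONY: check lint test
    check: lint test
    lint:
        ./lint.sh
    test:
        ./test.sh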
Makefile is “make file”. It has been abused into being a task runner. The wrong tool for the job.
The “make file” is all about file dependencies based on last-modified dates. Outdated target files can be rebuilt from their source files. It is dependency management and the essence of an incremental compiler, but all revolving around files, not tasks.
What gives make a bad name is the same thing that gave javascript or m4 a bad name - these things are their own exotic birds - doing them well requires new concepts and new behaviors.
You can indeed shoehorn them into what you know but really you need to fully embrace their weird world.
See also forth, dc, awk, jq ...
It'd be nice to have a dedicated crash course on these things for people who understand conventional programming and have been doing the normal stuff for a number of years.
Also see supercollider, prolog, haskell, apl...
I think the most mainstream exotic bird people learn is Lisp. Doing all these things well is as different as Lisp is from say conventional python.
I’m confused. Are you saying that python is less exotic than javascript, jq, awk, m4, haskell, lisp, dc, prolog, apl, and supercollider therefore it’s bad, but the least bad out of these?
It may suck for it, but it’s better than a collection of random scripts and commands baked into CI configuration that evolve to become unrunnable in normal dev environments.
Getting a feature added to POSIX usually requires that it already be commonly available, if not literally then at least in substantially similar form that would not be too difficult to adapt to a POSIX-compliant mode. Fortunately this has become easier to achieve now that most of the more esoteric Unix-like systems have died off.
POSIX-2024 added at least three new utilities: readlink, realpath, and timeout.[1] (It also added the gettext localization framework, which includes new C APIs and a few additional shell utilities.) Many utilities gained new flags and features, including Make. For example, POSIX Make now supports "!=" variable assignment for capturing the output of shell command invocations, as well as guaranteed computed macro name evaluation (i.e. `$($(FOO).$(BAR))`). The latter actually gives a standards-compliant way to implement conditionals, and interestingly had already been supported by all common Make implementations for a very long time.
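To illustrate both (macro names invented; recipe indent is a tab):

    # "!=" runs a shell command at parse time and captures its output:
    VERSION != git describe --tags 2>/dev/null || echo unknown

    # Computed macro names as a portable conditional:
    CFLAGS.debug = -O0 -g
    CFLAGS.release = -O2
    MODE = release
    CFLAGS = $(CFLAGS.$(MODE))

    all:
        @echo "building $(VERSION) with $(CFLAGS)"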
These days anybody can meaningfully participate in POSIX. You can get started by checking out https://austingroupbugs.net/, where both errata and suggestions for new features can be filed. There's also a mailing list you can join, though most of the discourse happens in the ticket tracker. But please don't spam new feature requests, a la GitHub drive-bys. Spend a good amount of time understanding the specification and the existing review processes, e.g. by reading a lot of ticket discussions.
Why not Python? I primarily program in C++, but I see Python as a decent choice since it's available on almost all recent machines. Windows is a notable exception, of course, but given that it's a tool for developers, I'd guess Python will be present.
The number of issues we've had with pre-commit because it's written in Python and Python tooling breaks constantly...
In fairness, the latter point may be finally solved by using `uv` and `uv tool install`. Performance is still a major issue though. Yamllint is easily the slowest linter we use.
(I'm tempted to try rewriting it in Rust with AI.)
Performance only matters if you're doing something compute- or disk-intensive, and then only if the libraries you're using are Python all the way down. AI programming (at least the kind that most of us do; I don't know about places like OpenAI) is generally done in Python with libraries that use some compiled language under the hood.
And in this case--a linter--performance is almost certainly never an issue.
Then remove it? There are always tradeoffs to adding tooling - I'm assuming you have it in your workflow to catch downstream issues because it saves more time in the long run.
It definitely is a problem when the tool you're going to use a few times a week takes an extra hundred milliseconds compared to a native solution. Especially when you need to process huge data files like hand crafted makefiles. I can totally feel your pain - extra effort would've been made to avoid that at the cost of development speed. /s
I find that writing anything substantially complex in python sacrifices development speed. That isn't its strong suit. It's just that a lot of people prefer to write their code in it.
Yeah if only it was an extra 100 milliseconds a few times a week. We have yamllint (also written in Python) in our pre-commit (also written in Python) and it definitely adds a second or two.
`pip install ...` is not a reliable or appropriate mechanism for distribution of any kind of tool like this one. Table stakes is pre-compiled architecture-specific binaries.
I would've chosen Java because it's faster than Python and is good for string manipulation. My cousin would've chosen Brainfuck because he's really good at it. Alas, this discussion is useless because none of us is the one who spent the effort to write the Makefile formatter and linter; we can only bikeshed all day about what decisions we would've taken.
For the 100th time, I was responding to "[...] python and it is easy to do string manipulation with.". Perl is way better to do string processing in than Java, too, FWIW.
My comment has absolutely nothing to do with this project or its author, nor the language he has chosen. See the other comments in this thread.
I am not saying it is weird, I was just responding to the parent with regard to "string manipulation", and someone mentioned "performance", so I stated two facts about Perl.
I do not care whether or not this project is written in Python. Sure, he chose Python because he is more familiar with it. That is fair enough to me.
It really is though. When I was on a journey to find the fastest interpreted scripting languages, Perl and LuaJIT were among the fastest, meaning Python is slower than both of those languages.
It does not invalidate anything I have said, and this reasoning of yours is so flawed.
For example: I have thought this or that piece of music or movie sucked. I do not need to know how to make a song or a movie to be able to criticize it, let alone have made one in a similar vein; same with books. I can criticize a book, or an article, without having written one myself on related topics.
All that said, where did I criticize? I did not criticize anything, at all.
I stated facts. Perl is indeed faster than Python, and Perl was indeed made with string manipulation in mind. I made no comment about this project or its author, thus, it was not a criticism of any sort.
I do not care about movie makers and singers in general, and for the most part I do not have direct contact with them, so it would be futile to offer any advice. I did offer advice to a couple of singers before though. What is your point besides being unnecessarily defensive over two simple, stated facts? As I said, it was not a criticism of the author or the project, it was a response to your comment. Since this is going to lead nowhere, I am going to stop responding to this thread.
Remember, you cannot criticize (even though it was not criticism) unless you have something to show for it! Next time someone critiques an article, we have to make sure to let them know it is wrong to criticize unless they have written an article themselves on the same topic.
FWIW it really was just about his comment, and I made two statements: Perl is faster than Python, and that Perl is especially good for string manipulation. I do not mind that he chose Python, good for him.
What opinions? Perl is faster than Python regardless of my opinion, and it is great for string manipulation, which again, is not a matter of opinion. It really is the case.
None of it has anything to do with the author choosing Python. How many times do I need to repeat myself?
You are the one who is being extremely defensive about it and making it seem like I give a damn about whether or not this project is written in Python. I told you I do not care. I was responding to your comment. No matter how much you insist that my reply (to you) was a criticism of the author or his project, it will not make it so.
This is incoherent no matter what two programming languages you are talking about. It's like trying to say the road is fast, as opposed to the car.
But in practice, for anything where performance matters, Perl and Python are effectively just convenient ways to wrap a C inner loop anyway. I don't just mean gluing to some third-party library, either. I mean something inside the interpreter. Once you've built the state machine for the regex, the language bytecode doesn't matter any more.
Which is why it's dozens of times slower in both Python and Perl to run a do-nothing for loop than to iterate through a string with a dummy regex, or to use built-ins to search for a character, etc. (Or to create the string in the first place.)
> it is great for string manipulation, which again, is not a matter of opinion
This is incoherent no matter what programming language you are talking about. Of course it's a matter of opinion. Other people aren't going to like the same syntax you do.
> This is incoherent no matter what two programming languages you are talking about.
Yet everyone talks about performance when it comes to programming languages. Why? And what does coherence have to do with it?
Plus I explained what I meant by that: there is a huge difference in the startup time of Perl and Python, for one, and you can check benchmarks. The implementations of the two languages are not identical, and if they are not identical, there must be a difference in performance one way or another, I assume.
> This is incoherent no matter what programming language you are talking about. Of course it's a matter of opinion. Other people aren't going to like the same syntax you do.
I do not think anyone is going to argue that Brainfuck was made with easy string manipulation in mind, or Forth.
The point is that Perl helps you manipulate strings with relative ease. Assuming you know the language, yeah.
In fact, please read the other comments, because I have already said this: "For the 100th time, I was responding to "[...] python and it is easy to do string manipulation with."". It is easy to do so in Python, but even easier (and faster) to do in Perl. In both cases there is an assumption of knowing the language, and I thought that goes without saying.
I said nothing about liking the syntax or whatever you are trying to read into it. It is easy to do string manipulation, regardless of your like or dislike of the syntax. Thus, not an opinion.
Oh you're still here, I thought you were done 5 times already. All I said was "it is easy to do string manipulation with", not it's the absolute best thing ever and the fastest at everything!
> Yet everyone talks about performance when it comes to programming languages. Why?
I know this may come as a shock to the average HN reader, but most people do not think very clearly. Even programmers. And among those who do, it is quite rare that people speak precisely without putting in considerable effort.
> but even easier (and faster) to do in Perl.
You have no objective way to substantiate this claim (because such cannot exist), therefore it is in fact a matter of opinion.
Oh but there is an objective way to substantiate this claim. How familiar are you with Perl? It will not work (to convince you) if you do not know the language; you need to be quite familiar with it. There are extremely short ways to do string manipulation, for one; think 3 characters short. So, assuming you know Perl, it is definitely easier AND faster than, say, Python.
Trivial example: replace all numbers in a string (faster and easier in Perl).

Perl:

    $string =~ s/\d+/NUM/g;

Python:

    import re
    string = re.sub(r'\d+', 'NUM', string)
Perl's design history is text processing as a first-class goal. It has syntax-level efficiency, and expressive power (e.g. s///e, tr///, inline regex flags, etc.). Do you want me to go on? These are all objective ways to substantiate the claims I made.
Perl is implemented in C with a runtime that aggressively optimizes text processing. The core primitives such as split, join, substr, index, s///, etc. are tight wrappers around C functions with minimal object overhead. In Python, even simple string manipulations go through Python's object system, and since Python strings are immutable, every manipulation creates a new string object, which incurs extra memory allocation and garbage collection.
Perl is typically 30-80% faster in similar regex-heavy workloads because there is no import overhead, there is internal opcode dispatch for regex, and no allocation of new strings until match is confirmed.
Additionally, Perl frequently wins by eliminating boilerplate, as shown above. It has advanced features with simple syntax, too, meaning you get fewer lines of code, fewer concepts to juggle, and faster iteration time. Python, by contrast, forces the developer to switch between the re module for non-trivial regex, multiple string method calls, and so forth.
Shall I go on? I can objectively substantiate my claims with regard to Perl vs. Python, as I have done above, but if it is not enough for you, I could easily go on. I can give you benchmarks, too, if you so wish. FWIW, my claim was: you can achieve string manipulation / text processing faster (both execution speed and development time) and easier in Perl vs. Python.
---
The bottom line is that, for text-heavy workloads, Perl remains more efficient and developer-friendly than Python; a conclusion that can be quantified, measured and defended objectively. It is not an opinion, nor a matter of taste.
Just to reiterate, across a wide range of text processing tasks, Perl consistently offers:
- Fewer lines of code
- More powerful and concise idioms
- Better CLI support
- Faster execution for regex-heavy operations
- Less boilerplate and setup overhead
This list is non-exhaustive, and I can get more into the implementation details (I touched on them above), too, if you so wish: for example, how Perl's interpreter is string-first, and how Perl uses SVs that are internally optimized to handle both numbers and strings interchangeably without conversion overhead, as opposed to Python's PyObject-based immutable strings, which incur copy penalties on every mutation plus method dispatch overhead.
Make allows you to specify dependencies for your targets, which are themselves targets. As such you do not need to rely on brittle string concatenation approaches. It is a tool built for this.
I personally like going to a project folder and running "make run", no matter what language or setup I have, to run the project. It lets me unify access to projects.
I also take great care to make these runs reproducible, using lock files and other things of the ecosystems I am using, whenever possible. I work on another machine? git clone, make run. Or perhaps git clone, make init, make run.
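A sketch of what that looks like (the underlying commands are placeholders for whatever the ecosystem provides; recipe indents are tabs):

    .PHONY: init run
    # one-time setup, pinned by the ecosystem's lock file
    init:
        python -m venv .venv && .venv/bin/pip install -r requirements.txt
    # uniform entry point, identical across projects
    run:
        .venv/bin/python main.py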
I'm not so sure most people would agree with you. Though I think plenty would.
I dare say that developers like environment variables more than before. Consider that Docker images, and hence Helm charts, are entirely controlled via environment variables. These very popular dev tools suffer from the same problem of having near-zero easy discoverability of what those environment variables might be. Yet they are very popular.
But I don't think Make usually uses all that many environment variables. You're usually specifying build targets as the command line arguments. Automake and Autoconf usually generate these makefiles with everything hard-coded.
Also, it's very easy to get started with, and it is universally available. Makes it very easy to like.
Unless your company forces you to use Windows, which is still much more common than many would like to admit. And yes, WSL exists, but in my experience, if a company is unwilling to allow macOS, there’s a good chance they either don’t allow enabling Hyper-V, or the security software they use is such garbage that it renders a Hyper-V enabled system effectively unusable.
> Interacting with CLI tools using only env vars as arguments is cartoonishly bad dev experience.
Make excels at what it's designed to do: specify a configurable DAG of tasks that generate artifacts, execute them, and automatically determine which subgraphs require updates and which can be skipped by reusing their artifacts.
I wonder: which tool do you believe does this better than Make?
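For anyone who hasn't seen it spelled out, the baseline behaviour in question (a toy C example; recipe indents are tabs):

    # After editing util.c, `make app` recompiles util.o and relinks app;
    # main.o is reused because none of its prerequisites changed.
    app: main.o util.o
        cc -o app main.o util.o
    main.o: main.c util.h
        cc -c main.c
    util.o: util.c util.h
        cc -c util.c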
I don't think Tup managed to present any case. Glancing at the page, the only conceivable synthetic scenarios where they can present Tup in a positive light are build times for > 10k files, and only in a synthetic scenario involving recompiling partially built projects. And what's the upside of those synthetic scenarios? Shaving a couple of seconds off rebuilds? That's hardly compelling.
Abuse? Running linters, code analysers, configuration tools, template engines, spellcheckers, pulling dependencies, building dependencies with different build systems.
Sufficiently complex projects need to involve a lot of weird extra scripts, and if a build system cannot fulfil that... then it needs to be wrapped in a complex bash script anyway.
Dunno, there are other aspects of environment variables that deteriorate the dev experience. They're very conducive to spooky action at a distance, since they're silently being passed along from parent-process to child-process (except when they aren't).
They can cause a lot of annoying bugs, and sometimes it's hard to track down where they are coming from (especially when dealing with stuff running in containers).
You don't have to write Make invocations by hand... It's just a tool that can be called from any editor or IDE (or by automatic file watchers). Environment variables aren't really relevant to Make either, unless you really want to misuse it as a command runner.
It's 80% of what you want and it's installed everywhere.
You could go for something closer to exactly what you want, but now you've got an extra set up step for devs and something else for people to learn if they want to change it.
I would say if you're looking for cli args then you shouldn't be using any wrapper like make at all. Just call the underlying tool directly. Make is for doing big high level things in the standard way, nowadays quite often in CI pipelines.
Yep, that's how I used it on the job before. "make test" would run tests locally and in CI pipeline, keeping the CI file refreshingly short at that point.
I often prefer to work in in-extremis environments where there is no internet access, and hence no easy way to get ahold of make; it's given me a bad habit of just writing a build.bash script to do what make does most of the time. I haven't really found myself missing it that much.
If you can install bash on your airgapped dev box, why wouldn't you install make on it? Make is part of the core dev environment on just about every distro under the sun.
I am confused, because this means that you won't be able to install anything. No compiler, no 3rd party libraries and no text editor that isn't preinstalled