Hacker News
Mbake – A Makefile formatter and linter, that only took 50 years (github.com/ebodshojaei)
224 points by rainmans 2 days ago | hide | past | favorite | 113 comments





I am quite certain I used this kind of tooling during the 1990s, with ads in The C/C++ Users Journal and Dr. Dobb's developer magazines.

It wouldn't have been Mattias Andrée's makel, then.

* https://git.maandree.se/makel

Or unmake.

* https://crates.io/crates/unmake

Or checkmake.

* https://github.com/checkmake/checkmake (https://news.ycombinator.com/item?id=32460375)

Or make-audit.

* https://github.com/david-a-wheeler/make-audit

Or the Sublime linter for makefiles.

* https://github.com/giampaolo/SublimeLinter-contrib-makefile

It hasn't taken quite the 50 years that we are told, has it? (-:


Consolidation of .PHONY targets is an anti-feature, the .PHONY decl is supposed to be adjacent to its target...

Possible opinion detected. Explain why.

Look at https://github.com/EbodShojaei/bake#basic-formatting. It starts with a horribly contrived example¹ including this²:

  .PHONY: clean
  all: $(TARGET)
      $(CC) $(CFLAGS) -o $@ $^

  .PHONY: install
  clean:
      rm -f *.o
And changes it to this with a consolidated .PHONY, but still no install target. Surely it should complain “there is no install target, what are you marking it as phoney³ for?”

  .PHONY: all clean install

  all: $(TARGET)
      $(CC) $(CFLAGS) -o $@ $^

  clean:
      rm -f *.o
I will not say that this way of writing it is unreasonable; users unfamiliar with the available targets can benefit from such a thing up the top (though in practice there’s normally a block of other stuff first, and other ways of achieving it). But this is also a reasonable way of writing it, with strong advantages of its own:

  .PHONY: all
  all: $(TARGET)
      $(CC) $(CFLAGS) -o $@ $^

  .PHONY: clean
  clean:
      rm -f *.o

  .PHONY: install
  # Um… where’s the `install:` line?
Treating .PHONY as more an inline attribute like this rather than a target of its own makes it much more obvious that the install target is missing, and makes desynchronisation of .PHONY and the actual targets much less likely.

—⁂—

¹ The first atrocity is that it’s not even a legal makefile: barring a .RECIPEPREFIX override, recipes must be indented by one tab. Running make with this file produces “Makefile:10: *** missing separator. Stop.” Taking invalid code and making it valid is not the domain of a linter or formatter. The Markdown is also abysmal because it doesn’t use tabs, but rather a single space in the output. Given that tabs are structural in makefiles, this is an astonishing lapse.

The second atrocity is the mismatching of .PHONY targets, despite joining the lines (by not having a blank line after the .PHONY line), which is completely unrealistic. Or at least I find it so.

² I have normalised whitespace so we can focus on the .PHONY changes. Assume tabs are where they need to be. Unfortunately HN hates interesting whitespace.

³ Ah, the number of times I’ve written .PHONEY… stupid americentric software.


Thanks for taking the time! I had to write some demos out to test, but you are thoroughly correct.

Nice, would be good to package this as a https://pre-commit.com/ hook.
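Until the project ships one, a local hook can fill the gap; a hypothetical .pre-commit-config.yaml sketch (assumes the tool is already installed and its CLI is invoked as `mbake`; the file pattern is a guess):

```yaml
repos:
  - repo: local
    hooks:
      - id: mbake
        name: mbake (format Makefiles)
        entry: mbake format
        language: system
        files: (^|/)(GNUmakefile|[Mm]akefile)$|\.mk$
```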

Speaking of make...

A while back I attended an open-source conference (which was a lot of fun). After the presentations, people would "set up shop" at tables and jam on whatever was their fancy.

One evening there was a person using make as a SAT solver[0]. That blew my mind to be honest. I had used make for years as a build tool and never thought of it in that problem space.

This memory isn't relevant to this project. I was just reminded of the experience is all.

0 - https://en.wikipedia.org/wiki/SAT_solver


Thanks for the tool. This is pretty neat.

It’s almost comical to see “why Python” comments after all these years. I would’ve chosen Go to write this, but that’s beside the point.

Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But even then, some Python libraries have bigger communities than the combined communities of all these “better, faster, more awesome” languages.

Python is here to stay. Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts. That’s fine. LLMs write better Python than Go (my preferred language, or whatever yours is). And if you know anything about the AI research community, it’s C++, C, Python or GTFO.

Going forward, a lot more tools will be written in Python, mostly by people entering the field. On top of that, there’s a huge number of active Python veterans churning out code faster than ever. The network effect keeps on giving.

So whatever language you have in mind, it’s going to be niche compared to Python or JS. I don’t like it either. But if languages and tools were chosen on merit instead of tribalism, we wouldn’t be in this JS clusterfuck on the web.


I love python, have used it for years. I hate the dependency and multiple interpreter situation.

A great PL should stand on its own without the need for external tooling.

At this point I have given up on python except when it’s a little script that only uses standard libraries. Otherwise I’m choosing a compiled language.

Some more thoughts: http://calvinlc.com/p/2025/06/10/thank-you-and-goodbye-pytho...


I use python without any dependencies on web servers. Pip is cool, but you don't need to get pulled into the node-like dependency hell.

For example, instead of requests you can use http.client; instead of flask, http.server, socketserver.TCPServer, or just socket. If you want sqlite, don't jump to pip install sqlite or whatever, use sockets to talk to it.
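As a sketch of the stdlib-only approach (http.server standing in for flask, http.client for requests; hostnames and ports here are illustrative):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    # Minimal handler: every GET returns "hello"
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port
server = HTTPServer(("127.0.0.1", 0), Hello)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Stdlib client instead of requests
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
body = resp.read().decode()
conn.close()
server.shutdown()
print(resp.status, body)  # 200 hello
```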


How do you use only sockets to talk to sqlite?

Why do you need to do that when Python has built in sqlite3 support?

Right, my bad, I was thinking of psql or mysql.

If you had to use sqlite without the library, you could trivially call the C API from Python directly with the ctypes builtin (or compile a Python module with the C API).
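For completeness, the built-in module mentioned above needs no ctypes at all; a minimal sketch:

```python
import sqlite3

# Stdlib-only SQLite: no pip, no sockets, no C glue
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES (?)", ("alice",))
rows = conn.execute("SELECT name FROM users").fetchall()
conn.close()
print(rows)  # [('alice',)]
```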


> It’s almost comical to see “why Python” comments ... Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But ... Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts.

I'm not sure if this is news to you or if you already know it, but, just to be explicit -- you know that the overwhelming majority of end users aren't gonna have `pip` installed on their systems, right? And that any project with "Installation instructions" that begin with a `pip` command aren't really gonna work in the general case?

Just wanna make sure that's well-understood... it's fine if you wanna build a tool in Python, but if you expect it to be practically usable, you need to do distribution of binaries, not `pip` targets...


This point has been pummeled to death for decades. Before Python, people did the same with Ruby and “gem.” Literally nothing is new here.

One of the reasons I write my tools in Go is exactly this. But if the tool was written in Go, people would complain about why not Rust and such. The point wasn’t to convey that Python doesn’t have its fair share of flaws, but to underscore that the HN crowd doesn’t represent any significant majority. The outside world keeps on using Python, and the number of Go or Rust users is most likely less than PyTorch or Scikit-learn users.

Shipping Python is hard and the language is slow. Also, tooling is bad. The newfangled ones are just a few in the long stream of ad hoc tooling over the past 20 years. Yet people write Python and will continue to do so. JS has a similar story, but it’s just a 10x worse language than Python.


Let me be even more explicit: if your installation instructions are `pip install ...` -- or `npm install ...` for that matter -- then you are automatically excluding a super-majority of potential users.

I don’t even write python these days. I just wrote my own version of a terminal llm-caller[1] in Go for this exact same reason.

There’s a famous one that does the same thing but is written in Python. So it has its issues.

My point is, pip exists on most machines. pip install sucks but it’s not the end of the world. The HN crowd (including myself) has a tendency to beat around the bush about things that the majority don’t care about IRL.

[1]: https://github.com/rednafi/q


You only have to install pip once. It’s a one-time, set-and-forget operation. And with conda, it’s even easier: just click and install.

It’s harder to distribute software written in Python via e.g. a package manager compared to compiled languages.

Does this support inline ignoring specific rules with some syntax? Couldn’t find this from the README. Would be good to have as an escape hatch.

Some Makefiles use indents or var placement as semantic cues. If a tool rewrites them mechanically, it might clean things while killing meaning. Is structural correctness enough, or do we need formatters that preserve human context too?

Ideally, we'd have linters that preserve the human context as well. But human context may be too ambiguous and high-variance for that to be practical.

It's hard to say what's intent and what's not; maybe linters with many custom rules would work best.


That doesn't sound any different from any other programming language, but many people prefer automatic formatting anyway.

It really doesn’t have to be complicated for it to be useful. Many thanks for sharing this.

Anyone else not bother maintaining a list of .PHONY targets? Always felt like a chore that adds noise for a rare edge case.

> The implicit rule search (see Using Implicit Rules) is skipped for .PHONY targets. This is why declaring a target as .PHONY is good for performance, even if you are not worried about the actual file existing.

https://www.gnu.org/software/make/manual/html_node/Phony-Tar...
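The performance point aside, the classic correctness motivation is a stray file shadowing a target; a minimal sketch:

```makefile
# If a file named `clean` exists in the working directory, a plain
# `clean:` rule is considered up to date and its recipe is skipped.
# Declaring it phony makes make run the recipe unconditionally.
.PHONY: clean
clean:
	rm -f *.o
```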



ewwww. consolidated phony lines. everyone knows these should be right before each rule declaration.

From the README, it seems this can be disabled:

  group_phony_declarations = false

I think the visual clutter of .PHONY on each recipe declaration is better since there’s always a lot of copy-paste coding.

How does this differ from checkmake, the older Makefile linter and formatter?

If only it accepted POSIX syntax...

Makefile is “make file”. It has been abused into being a task runner. The wrong tool for the job.

The “make file” is all about file dependencies based on last-modified date. Outdated target files can be rebuilt using source files. It is dependency management and the essence of an incremental compiler, but all revolving around files, not tasks.


Right. It kinda sucks for that purpose too, which gives Make a bad name.

What gives make a bad name is the same thing that gave javascript or m4 a bad name - these things are their own exotic birds - doing them well requires new concepts and new behaviors.

You can indeed shoehorn them into what you know but really you need to fully embrace their weird world.

See also forth, dc, awk, jq ...

It'd be nice to have a dedicated crash course on these things for people who understand conventional programming and have been doing the normal stuff for a number of years.

Also see supercollider, prolog, haskell, apl...

I think the most mainstream exotic bird people learn is Lisp. Doing all these things well is as different as Lisp is from, say, conventional Python.


I’m confused. Are you saying that python is less exotic than javascript, jq, awk, m4, haskell, lisp, dc, prolog, apl, and supercollider therefore it’s bad, but the least bad out of these?

Forth is far easier than dc.

Lisp, exotic? It's damn easy. Haskell is far worse.


It may suck for it, but it’s better than a collection of random scripts and commands baked into CI configuration that evolve to become unrunnable in normal dev environments.

The nice thing about make is that it is ubiquitous, and that it offers nice things out of the box.

This is so unfortunate about these ubiquitous tools.

At a certain point we seem to have stopped adding tools to POSIX.

Does anyone know why exactly?


To be added to POSIX usually requires common existing availability of a feature, if not literally at least substantially similar and not too difficult to add a POSIX-compliant mode. Fortunately this has become easier to achieve now that most of the more esoteric Unix-like systems have died off.

POSIX-2024 added at least three new utilities: readlink, realpath, and timeout.[1] (It also added the gettext localization framework, which includes new C APIs and a few additional shell utilities.) Many utilities gained new flags and features, including Make. For example, POSIX Make now supports "!=" variable assignment for capturing shell command invocations, as well as guaranteed computed macro name evaluation (i.e. `$($(FOO).$(BAR))`). The latter actually gives a standards-compliant way to implement conditionals, and interestingly it had already been supported on all common Make implementations for a very long time.
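A sketch of that computed-macro conditional idiom (variable names here are illustrative):

```makefile
# Pick a flag set by composing the variable name at expansion time:
# $(CFLAGS.$(MODE)) expands MODE first, then looks up CFLAGS.debug
# or CFLAGS.release.
MODE = debug
CFLAGS.debug = -g -O0
CFLAGS.release = -O2
CFLAGS = $(CFLAGS.$(MODE))

all: prog.c
	cc $(CFLAGS) -o prog prog.c
```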

These days anybody can meaningfully participate in POSIX. You can get started by checking out https://austingroupbugs.net/, where both errata and suggestions for new features can be added. There's also a mailing list you can join, though most of the discourse happens in the ticket tracker. But please don't spam new feature requests, a la GitHub drive-bys. Spend a good amount of time understanding the specification and existing review processes, e.g. by reading a lot of ticket discussions.

[1] See https://sortix.org/blog/posix-2024/


It works fine for PHONY targets.

But most people don’t realize in many cases they can do better than that.


I've always wanted this. I'm going to give it a go!

Does this happen to support IDEs like VSCode?


From the readme:

  VSCode Extension
    1. Open VSCode
    2. Go to Extensions (Ctrl+Shift+X)
    3. Search for "mbake Makefile Formatter"
    4. Click Install

Thanks! Apologies, I was on mobile and missed this. I'm excited to try it out.

Simple: use rake

or any other task runner



Why not Python? I primarily program in C++ but I see it as a decent choice, as Python is available on almost all recent machines. Of course Windows is a notable exception, but given it's a tool for developers I guess Python should be present.

1. Terrible performance.

2. Terrible installation UX.

The number of issues we've had with pre-commit because it's written in Python and Python tooling breaks constantly...

In fairness, the latter point may be finally solved by using `uv` and `uv tool install`. Performance is still a major issue though. Yamllint is easily the slowest linter we use.

(I'm tempted to try rewriting it in Rust with AI.)


> 1. Terrible performance

Performance only matters if you're doing something compute- or disk-intensive, and then only if the libraries you're using are Python all the way down. AI programming (at least the kind that most of us do--I don't know about places like OpenAI) is generally done with Python using libraries that use some compiled language under the hood.

And in this case--a linter--performance is almost certainly never an issue.


The only thing computers do is compute and disk.

Performance only matters if you care about performance, and I do care about performance. If you don't, fine I guess.


Then remove it? There's always tradeoffs adding tooling - I'm assuming you have it in your workflow to catch downstream issues because it saves more time in the long run.

It definitely is a problem when the tool you're going to use a few times a week takes an extra hundred milliseconds compared to a native solution. Especially when you need to process huge data files like hand crafted makefiles. I can totally feel your pain - extra effort would've been made to avoid that at the cost of development speed. /s

I find that writing anything substantially complex in python sacrifices the development speed. That isn't its strong suit. It's that a lot of people want to write their code in it by preference.

Yeah if only it was an extra 100 milliseconds a few times a week. We have yamllint (also written in Python) in our pre-commit (also written in Python) and it definitely adds a second or two.

Also format-on-save is a common workflow.


Terrible portability across platforms, especially with dependencies.

`pip install ...` is not a reliable or appropriate mechanism for distribution of any kind of tool like this one. Table stakes is pre-compiled architecture-specific binaries.

Presumably because the author is comfortable with python and it is easy to do string manipulation with.

Perl is much faster than Python, and it is especially good for string manipulation. Thus, I would have chosen Perl.

I would've chosen Java because it's faster than Python and is good for string manipulation. My cousin would've chosen Brainfuck because he's really good at it. Alas, this discussion is useless because none of us is the one who spent the effort to write the Makefile formatter and linter; we can only bikeshed all day about what decisions we would've taken.

For the 100th time, I was responding to "[...] python and it is easy to do string manipulation with.". Perl is way better to do string processing in than Java, too, FWIW.

My comment has absolutely nothing to do with this project or its author, nor the language he has chosen. See the other comments in this thread.


Almost everybody knows python these days, almost nobody knows Perl. It's not weird that OP chose a language they already knew.

I am not saying it is weird, I was just responding to the parent with regard to "string manipulation", and someone mentioned "performance", so I stated two facts about Perl.

I do not care whether or not this project is written in Python. Sure, he chose Python because he is more familiar with it. That is fair enough to me.


Perl is absolutely not faster than Python, not even for regexes.

It really is, though. When I was on a journey to find the fastest interpreted scripting languages, Perl and LuaJIT were among the fastest, meaning Python is slower than both.

Perl is always faster than Python. Python is just absurdly bad in performance.

Would you be willing to share the link to your repo?

I have not written a Makefile formatter and linter in any language.

Yes, I thought so. It's much easier to criticize than to create.

It does not invalidate anything I have said, and this reasoning of yours is so flawed.

For example: I thought this or that music or movie sucked. I do not need to know how to make a song or a movie to be able to criticize it, let alone have made one in a similar vein; same with books. I can criticize a book, or an article, without having written one myself on related topics.

All that said, where did I criticize? I did not criticize anything, at all.

I stated facts. Perl is indeed faster than Python, and Perl was indeed made with string manipulation in mind. I made no comment about this project or its author, thus, it was not a criticism of any sort.


Do you regularly offer your directorial opinions to movie makers and singers? Or is it just programmers?

I do not care about movie makers and singers in general, and for the most part I do not have direct contact with them, so it would be futile to offer any advice. I did offer advice to a couple of singers before though. What is your point besides being unnecessarily defensive over two simple, stated facts? As I said, it was not a criticism of the author or the project, it was a response to your comment. Since this is going to lead nowhere, I am going to stop responding to this thread.

I will remember that the next time someone voices an opinion at the sports bar.

Remember, you cannot criticize (even though it was not that) unless you have something to show for it! Next time someone provides a critique of an article, we have to make sure to let them know it is wrong to criticize unless they have written an article themselves on the same topic.

FWIW it really was just about his comment, and I made two statements: Perl is faster than Python, and that Perl is especially good for string manipulation. I do not mind that he chose Python, good for him.


For somebody who does not mind that the author chose python, you defended your opinions robustly.

What opinions? Perl is faster than Python regardless of my opinion, and it is great for string manipulation, which again, is not a matter of opinion. It really is the case.

None of it has anything to do with the author choosing Python. How many times do I need to repeat myself?

You are the one who is being extremely defensive about it and making it seem like I give a damn about whether or not this project is written in Python. I told you I do not care. I was responding to your comment. No matter how much you insist on my reply (to you) being a criticism of the author or his project, it will not make it so.

I am done talking to you, you are seeing ghosts.


> is faster than

This is incoherent no matter what two programming languages you are talking about. It's like trying to say the road is fast, as opposed to the car.

But in practice, for anything where performance matters, Perl and Python are effectively just convenient ways to wrap a C inner loop anyway. I don't just mean gluing to some third-party library, either. I mean something inside the interpreter. Once you've built the state machine for the regex, the language bytecode doesn't matter any more.

Which is why it's dozens of times slower in both Python and Perl to run a do-nothing for loop than to iterate through a string with a dummy regex, or to use built-ins to search for a character, etc. (Or to create the string in the first place.)

> it is great for string manipulation, which again, is not a matter of opinion

This is incoherent no matter what programming language you are talking about. Of course it's a matter of opinion. Other people aren't going to like the same syntax you do.


> This is incoherent no matter what two programming languages you are talking about.

Yet everyone talks about performance when it comes to programming languages. Why? And what does coherence have to do with it?

Plus, I explained what I meant by that; there is a huge difference in the startup time of Perl and Python, for one, and you can check benchmarks. The implementations of the two languages are not identical. If they are not identical, there must be a difference in performance one way or another, I assume.

> This is incoherent no matter what programming language you are talking about. Of course it's a matter of opinion. Other people aren't going to like the same syntax you do.

I do not think anyone is going to argue that Brainfuck was made with easy string manipulation in mind, or Forth.

The point is that Perl helps you manipulate strings with relative ease. Assuming you know the language, yeah.

In fact, please read other comments, because I have already said this: "For the 100th time, I was responding to "[...] python and it is easy to do string manipulation with".". It is easy to do so in Python, but even easier (and faster) to do in Perl. In both cases there is an assumption of knowing the language, and I thought that goes without saying.

I said nothing about liking the syntax or whatever you are trying to read into it. It is easy to do string manipulation, regardless of your like or dislike of the syntax. Thus, not an opinion.


Oh you're still here, I thought you were done 5 times already. All I said was "it is easy to do string manipulation with", not it's the absolute best thing ever and the fastest at everything!

Yeah, you are everywhere, too. I am done talking to you, however, so this was the last response you have gotten from me.

Have fun reading https://news.ycombinator.com/item?id=44359539.


> Yet everyone talks about performance when it comes to programming languages. Why?

I know this may come as a shock to the average HN reader, but most people do not think very clearly. Even programmers. And among those who do, it is quite rare that people speak precisely without putting in considerable effort.

> but even easier (and faster) to do on Perl.

You have no objective way to substantiate this claim (because such cannot exist), therefore it is in fact a matter of opinion.


Oh but there is an objective way to substantiate this claim. How familiar are you with Perl? It will not work (to convince you) if you do not know the language; you need to be quite familiar with it. There are extremely short ways to do string manipulation, for one, think 3 characters short. So, assuming you know Perl, it is definitely easier AND faster than, say, Python.

Trivial example:

Replace all numbers in a string (faster and easier in Perl):

Perl:

  $string =~ s/\d+/NUM/g;
Python:

  import re
  string = re.sub(r'\d+', 'NUM', string)
Perl's design history has text processing as a first-class goal. It has syntax-level efficiency and expressive power (e.g. s///e, tr///, inline regex flags, etc.). Do you want me to go on? These are all objective ways to substantiate the claims I made.

Perl is implemented in C with a runtime that aggressively optimizes text processing. The core primitives such as split, join, substr, index, s///, etc. are tight wrappers around C functions with minimal object overhead. In Python, even simple string manipulations go through Python's object system, and in Python strings are immutable, so every manipulation creates a new string object, which incurs extra memory allocation and garbage collection.

Perl is typically 30-80% faster in similar regex-heavy workloads because there is no import overhead, there is internal opcode dispatch for regex, and no allocation of new strings until match is confirmed.

Additionally, Perl frequently wins by eliminating boilerplate, as shown above. It has advanced features with simple syntax, too, meaning you get fewer lines of code, fewer concepts to juggle, and faster iteration time. Python, by contrast, forces the developer to switch between the re module for non-trivial regex, multiple string method calls, and so forth.

Shall I go on? I can objectively substantiate my claims with regard to Perl vs. Python, as I have done above, but if it is not enough for you, I could easily go on. I can give you benchmarks, too, if you so wish. FWIW, my claim was: you can achieve string manipulation / text processing faster (both execution speed and development time) and easier in Perl vs. Python.

---

The bottom line is that for text-heavy workloads, Perl remains the more efficient and developer-friendly tool compared to Python, a conclusion that can be quantified, measured, and defended objectively. It is not an opinion, nor a matter of taste.

Just to reiterate, across a wide range of text processing tasks, Perl consistently offers:

- Fewer lines of code

- More powerful and concise idioms

- Better CLI support

- Faster execution for regex-heavy operations

- Less boilerplate and setup overhead

This list is non-exhaustive, and I can get further into the implementation details (I touched on them above), too, if you so wish: for example, how Perl's interpreter is string-first and how Perl uses SVs that are internally optimized to handle both numbers and strings interchangeably without conversion overhead, as opposed to Python, which uses PyObject-based immutable strings that incur copy-on-write penalties and method-dispatch overhead.
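The Python side of claims like the regex-compilation overhead is at least easy to measure with the stdlib timeit module; a sketch (absolute numbers vary by machine, and this says nothing about Perl itself):

```python
import re
import timeit

text = "order 66, aisle 7, item 1234"
pattern = re.compile(r"\d+")

# Module-level call: pattern text goes through re's internal cache each time
per_call = timeit.timeit(lambda: re.sub(r"\d+", "NUM", text), number=10_000)

# Precompiled pattern object: skips the cache lookup
precompiled = timeit.timeit(lambda: pattern.sub("NUM", text), number=10_000)

print(f"re.sub each call: {per_call:.4f}s, precompiled: {precompiled:.4f}s")
```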


I don't understand why people like make so much. Interacting with CLI tools using only env vars as arguments is cartoonishly bad dev experience.

Make allows you to specify dependencies for your targets, which are also targets. As such you do not need to rely on brittle string-concatenation approaches. It is a tool built for this.

I personally like going to a project folder and run "make run", no matter what language or setup I have, to run the project. It enables me to unify access to projects.

I also take great care to make these runs reproducible, using lock files and other things of the ecosystems I am using, whenever possible. I work on another machine? git clone, make run. Or perhaps git clone, make init, make run.
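A hypothetical skeleton of that convention (target names follow the comment above; the commands are illustrative stand-ins for whatever the ecosystem provides):

```makefile
.PHONY: init run test

# One-time setup: install pinned dependencies from a lock file
init:
	pip install -r requirements.lock

# Unified entry point: `make run` works the same in every project
run: init
	python -m myapp

test: init
	pytest
```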


I'm not so sure most people would agree with you. Though I think plenty would.

I dare say that developers like environment variables more than before. Consider that Docker images, and hence Helm charts, are entirely controlled via environment variables. These very popular dev tools suffer from the same problem of having near-zero easy discoverability of what those environment variables might be. Yet they are very popular.

But I don't think Make usually uses all that many environment variables. You're usually specifying build targets as the command line arguments. Automake and autogen usually generate these makefiles with everything hard-coded.

Also, it's very easy to get started with, and it is universally available. That makes it very easy to like.


Make is in POSIX, so it's generally available. Same reason people write shell scripts (even if the scripts are not generally POSIX-only).

Unless your company forces you to use Windows, which is still much more common than many would like to admit. And yes, WSL exists, but in my experience, if a company is unwilling to allow macOS, there’s a good chance they either don’t allow enabling HyperV, or the security software they use is such garbage that it results in a HyperV enabled system being effectively unusable.

Windows 11 requires Hyper-V to be turned on; virtualization-based security is one of the reasons for the forced hardware upgrades.

Unless the company also forcefully prevents installation of git bash, you'll have make as well.

I know, I know, what developer on windows uses git via the official git windows client, right? /s


I like it because it's language and tooling agnostic, declarative, fast and ubiquitous.

Where it's less great is complicated recipes and debugging.


Make is not language agnostic; it has implicit rules for compiling C.

It also has implicit rules for other languages; why would that make it non-agnostic?

> Interacting with CLI tools using only env vars as arguments is cartoonishly bad dev experience.

Make excels at what it's designed to do: specify a configurable DAG of tasks that generate artifacts, execute them, and automatically determine which subgraph requires updates and which can be skipped by reusing their artifacts.

I wonder: which tool do you believe does this better than Make?


Tup. https://gittup.org/tup/ https://gittup.org/tup/make_vs_tup.html

But the Internet’s make mind-share means you still have to know make.

Edit: and make lets you use make to essentially run scripts/utils. People love to abuse make for that. Can’t do that with tup.


> Tup.

I don't think Tup managed to present any case. Glancing at the page, the only conceivable synthetic scenarios where they can present Tup in a positive light are build times of > 10k files, and only in a synthetic scenario involving recompiling partially built projects. And what's the upside of those synthetic scenarios? Shaving a couple of seconds off rebuilds? That's hardly a compelling scenario.


Abuse? Running linters, code analysers, configuration tools, template engines, spellcheckers, pulling dependencies, building dependencies with different build systems.

Sufficiently complex projects need to involve a lot of weird extra scripts, and if a build system cannot fulfill that... then it needs to be wrapped in a complex bash script anyway.


> Tup

`tup` relies on a stateful database, which makes it incomparable to `make`.


You can pass arguments via the command line (it is, after all, a CLI tool): https://stackoverflow.com/questions/2826029/passing-addition...

I mean there's really not much difference between "VAR=val make x" and "make x VAR=val" now is there?

Syntactically? No. Semantically? Yes.
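Concretely (standard make precedence rules): a command-line assignment overrides assignments made inside the makefile, while a plain environment variable does not:

```makefile
# Suppose the makefile contains:
CC = gcc

# make x CC=clang    -> CC is clang (command line beats the makefile)
# CC=clang make x    -> CC is gcc   (the makefile beats the environment)
# CC=clang make -e x -> CC is clang (-e lets the environment win)
```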

I'm guessing the syntax was the part the poster was complaining about when they complained about the "dev experience".

Dunno, there are other aspects of environment variables that deteriorate the dev experience. They're very conducive to spooky action at a distance, since they're silently being passed along from parent-process to child-process (except when they aren't).

They can cause a lot of annoying bugs, and sometimes it's hard to track down where they are coming from (especially when dealing with stuff running in containers).


> except when they aren't

Like sudo for example.

So many problems related to that.


Because make is a prolog in disguise.

You don't have to write Make invocations by hand... It's just a tool that can be called from any editor or IDE (or by automatic file watchers). Environment variables aren't really relevant to Make either, unless you really want to misuse it as a command runner.

It's 80% of what you want and it's installed everywhere.

You could go for something closer to exactly what you want, but now you've got an extra set up step for devs and something else for people to learn if they want to change it.

I would say if you're looking for cli args then you shouldn't be using any wrapper like make at all. Just call the underlying tool directly. Make is for doing big high level things in the standard way, nowadays quite often in CI pipelines.


Yep, that's how I used it on the job before. "make test" would run tests locally and in CI pipeline, keeping the CI file refreshingly short at that point.

I often prefer to work in in-extremis environments where there is no internet access, and hence no easy way to get ahold of make; it's given me a bad habit of just writing a build.bash script to do what make does most of the time. I haven't really found myself missing it that much.

If you can install bash on your airgapped dev box, why wouldn't you install make on it? Make is part of the core dev environment on just about every distro under the sun.

I've worked on payment terminals running Linux that had a shell like busybox but no make.

Of course it's a damn payment terminal, no one is ever going to actually build the app on the terminal so it has no compiler either.

I'm wondering what sort of dev machine GP has that is used to build software but doesn't have make...


most minimal setups nowadays have bash, make and even perl.

I am confused, because this means that you won't be able to install anything: no compiler, no third-party libraries, and no text editor that isn't preinstalled.

you don't do it naked, you write and use wrapper scripts to make it ergonomic


