
What's the rationale for using makefiles as script runners over just having a directory with scripts inside?

Not for compiling, just as script runner.

I see this practice often and I haven't found a good reason for it.



If scripts need particular arguments, the Makefile is a good place to record them.

I use it quite a lot for automating deployments - if you want to Terraform up a VM:

  make colo1-fooserver01.vm
Then if you want to run Ansible against it:

  make colo1-fooserver01
You don’t have to remember or type all of the flags or arguments - just type `make` and hit tab for a list of the targets that you can build.
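A trimmed-down sketch of what that looks like (the terraform/ansible flags here are illustrative, not my real ones):

```make
# Each target records the full command line once, so nobody has to retype it.
colo1-fooserver01.vm:
	terraform apply -target=module.colo1-fooserver01 -auto-approve

# Ansible run depends on the VM existing first.
colo1-fooserver01: colo1-fooserver01.vm
	ansible-playbook site.yml --limit colo1-fooserver01
```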


Most shells will tab complete after `./scripts/` too. In fact that's probably more common than make completion.

I think the real reason is you have it all in one file rather than multiple scripts which makes it easier to edit and maintain.


Quite - one makefile rather than dozens of scripts which all do practically the same things.


But this can literally just be done in a simple shell script as well. The makefile ends up just being a redundant way to run a shell script.


> But this can literally just be done in a simple shell script as well.

Only if there are no dependencies. It's unusual for GP's type of usage to have no dependencies.


When my shell scripts depend on another script ... they run the other script. Make definitely has its place, especially when dependencies get complex and parallel, but it's hardly necessary for simple cases. Once Make is needed, it's trivial to drop in and have it wrap the standalone scripts.


> When my shell scripts depend on another script ... they run the other script.

I hear you, but you're running the other script unconditionally. If it downloads something, it will download it every time you run the first script.

In this simple case, make runs the other script conditionally, so it need not run every time.
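For example (file name and URL are made up): a file target's recipe only runs when the file is missing or older than its prerequisites:

```make
# Runs the download only if data.tar.gz doesn't exist yet.
data.tar.gz:
	curl -fsSL -o data.tar.gz https://example.com/data.tar.gz

# Depends on the file, not on re-running the downloader.
build: data.tar.gz
	./build.sh
```

The first `make build` fetches the archive; subsequent runs skip straight to the build because data.tar.gz is already there.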


I build .tf files from parameters for each host in the Makefile (plus a script which knows the vSphere topology) for one-shot execution (it only creates the VM; it doesn’t manage the lifecycle), and I also template config that needs to be in place before deployment - there are plenty of dependencies.


Dependency management, definitely. Loads of scripts don't work until X has been done, and X, Y, Z, and sometimes QWERTY have to be done first, and they take minutes and a ton of bandwidth so you don't want to do them unless you have to...

... and if your scripts do all that, they've basically rebuilt make, but it's undocumented and worse.

(I say this as someone with LOTS of experience with make, and am not really a fan because I know too much and it's horrifying. But I dislike custom crippled versions even more.)


It can help abstract the differences you may have across projects. If you're on a team with many projects/repositories, having conventions across them all improves onboarding and cross-team work and promotes better dev UX. A really simple way to do this is make: you expose the same common targets in every repo and map each one to the relevant project-specific command. This becomes even more useful as you write automation for CI and deployments across all your projects.
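A sketch of the convention (target names and underlying commands are just examples):

```make
# Same interface in every repo; the tooling behind it varies per project.
test:
	go test ./...

lint:
	golangci-lint run

build:
	./scripts/build.sh
```

Then CI and developers alike can run `make test` or `make lint` in any repo without knowing which toolchain sits underneath.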


Likely a declarative way to specify dependencies - though I'm not sure make is the best tool for that in general.


Dependency management and automatic parallelization (via `make -j`).
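For instance (hypothetical scripts), with `make -j2` the two independent fetches run concurrently and `all` waits for both:

```make
# fetch-a and fetch-b have no dependency on each other,
# so `make -j2 all` runs them in parallel.
all: fetch-a fetch-b

fetch-a:
	./scripts/fetch-a.sh

fetch-b:
	./scripts/fetch-b.sh
```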



