Ninja is a small build system with a focus on speed (github.com)

by tosh 64 comments 112 points

[−] ozgrakkurt 46d ago
Ninja is one of the best tools I have used. It is extremely simple and always works flawlessly.

Some blog posts from the creator of ninja:

https://neugierig.org/software/blog/2018/07/options.html

https://neugierig.org/software/blog/2011/04/complexity.html

Also, there was a post about why just generating ninja files from Python can be a good option. I do this in my project and it has been very productive so far. I couldn't find the post now, but it suggested using ninja_syntax.py from the ninja codebase and doing something minimal for your project.
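To give a flavor of the approach: a minimal sketch of generating a build.ninja from Python. This emits the format as plain strings (the rule names, flags, and `write_ninja` helper are made up for illustration); the real ninja_syntax.py from ninja's misc/ directory additionally handles escaping of paths and variables for you.

```python
def write_ninja(sources):
    # Build up a build.ninja as plain text: variables, rules, build edges.
    lines = []
    lines.append("cflags = -O2 -Wall")
    lines.append("")
    lines.append("rule cc")
    lines.append("  command = cc $cflags -c $in -o $out")
    lines.append("  description = CC $out")
    lines.append("")
    # One compile edge per source file.
    for src in sources:
        obj = src.rsplit(".", 1)[0] + ".o"
        lines.append(f"build {obj}: cc {src}")
    lines.append("")
    lines.append("rule link")
    lines.append("  command = cc $in -o $out")
    objs = " ".join(s.rsplit(".", 1)[0] + ".o" for s in sources)
    lines.append(f"build app: link {objs}")
    return "\n".join(lines) + "\n"

print(write_ninja(["main.c", "util.c"]))
```

Write the result to build.ninja and run ninja; all dependency checking and parallelism comes for free.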

[−] fainpul 46d ago
OMG, so many open source projects need to read this:

https://neugierig.org/software/blog/2018/07/options.html

(hello FreeCAD ;)

[−] hulitu 43d ago

> Options are bad user experience

> A good tool solves its purpose so directly that it's not even noticed. Great designs need fewer options because they just work as is. The reason the user opened the software is to achieve some end, not poke around in the software's options. Maybe you can come up with a better design that addresses the use cases of the option?

This is the opinion of people who don't use the SW they wrote.

And no, gray on gray is not an option.

Unfortunately, we are forced to use SW bought by other departments, else nobody would use that crap.

You can't achieve an end when the crap you use isn't working.

[−] seba_dos1 46d ago
Seems like we so often get projects made by people who either need to read this, or who stubbornly ignore its "But first" paragraph.

It's good to be critical about options, but ultimately people and their needs are diverse and good tools recognize that too.

[−] zem 46d ago
we used ninja as a parallel task runner in pytype - had our whole-project analyser generate a ninja file with a task graph, and then just invoke ninja to run it, taking care of dependencies and parallel execution. it worked very nicely indeed.
[−] woctordho 47d ago
If someone sees this: the ninja package on PyPI [0] is currently stuck at version 1.13.0. There is an issue in 1.13.0 that prevents it from building projects on Windows. The issue was fixed in 1.13.1 almost a year ago, but the PyPI package hasn't been updated, see [1], and many downstream projects have to stay at 1.11. I hope it gets updated soon.

[0] https://pypi.org/project/ninja/

[1] https://github.com/scikit-build/ninja-python-distributions/i...

[−] endgame 47d ago
Why is a C++ project being distributed on PyPI at all?
[−] grim_io 47d ago
Probably for the same reason other binaries are distributed by npm: lack of cross-platform general package managers and registries
[−] mikepurvis 46d ago
Also for cases where a python project needs to depend on it.
[−] Ferret7446 46d ago
Kinda weird to have the language toolchain wrap the build system, should be the other way around.
[−] mikepurvis 45d ago
Yes, but I mean... this is Python we're talking about. There are several build systems / coordinators written in Python (scons, colcon, etc) not to mention Python packages that themselves contain compiled bits written in other languages.

I know nowadays we have formalized, cross-platform ways to build bindings (scikit-build-core, etc), but that is a relatively recent development; for a long-ass time it was pretty commonplace to have a setup.py full of shell-outs to native toolchains and build tools. It's not hard to imagine a person in that headspace feeling like being able to pull that stuff directly from PyPI would be an upgrade over trying to detect it missing and instruct the user to install it before trying again.

[−] verdverm 46d ago
Or the lack of a tool like Goreleaser in the language ecosystem that handles that
[−] zahlman 46d ago
You may be interested in this discussion: https://discuss.python.org/t/use-of-pypi-as-a-generic-storag...
[−] hulitu 43d ago
You need to bundle a supply chain attack with it. /s
[−] tadfisher 46d ago
Because the development world either hasn't heard of nix or has collectively decided to not use nix.
[−] j1elo 46d ago
What a messy and, frankly, absurd situation to be left in: forking a project in order to provide a tool through PyPI, only to then stop updating it on a broken version. That's more a disservice than a service to the community... If you're going to stay stuck, better to drop the broken release and stay stuck on the previous working one.
[−] bsimpson 46d ago
Evan Martin (evmar) started Ninja when he was working on Chrome at Google:

https://neugierig.org/software/chromium/notes/2011/02/ninja....

Hence, it's used in a lot of Google projects.

[−] shevy-java 47d ago
All the main build tools (cmake, meson/ninja and GNU configure) have different benefits. For instance, I expect "--help" to work, but only really GNU configure supports it as-is. I could list more advantages and disadvantages in general here, but by and large I prefer meson/ninja. To me it feels by far the fastest and I also have the fewest issues usually (excluding python breaking its pip stack but that's not the fault of meson as such). ninja can be used via cmake too but most uses I see are from meson.
[−] flohofwoe 47d ago

> ninja can be used via cmake too but most uses I see are from meson

How do you know though when the choice of cmake-generator is entirely up to the user? E.g. you can't look at a cmake file and know what generator the user will select to build the project.

FWIW I usually prefer the Ninja generator over the Makefile generator since ninja better 'auto-parallelises' - e.g. with the Makefile generator the two 'simple' options are either to run the build single-threaded or completely grind the machine to a halt because the default setting for 'parallel build' seems to heavily overcommit hardware resources. Ninja just generally does the right thing (run parallel build, but not enough parallelism to make the computer unusable).

[−] plq 47d ago
ninja supports separate build groups and different max number of parallel jobs for each. CMake's ninja generator puts compilation and linking steps in their own respective groups. End result is by default nproc parallel jobs for compilation but 1 job for linking. This helps because linking can be way more memory intensive or sometimes the linker itself has support for parallelism. Most projects have only a handful of linking steps to run anyway.
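The grouping mechanism is ninja's "pool" feature; a hand-written sketch of what such a build.ninja can look like (rule names and commands here are illustrative, not CMake's actual generated output):

```ninja
# A pool caps how many jobs from its member rules run concurrently.
pool link_pool
  depth = 1

rule cc
  command = cc -O2 -c $in -o $out

rule link
  command = cc $in -o $out
  pool = link_pool

# Compiles run at the full parallelism level; links serialize via the pool.
build a.o: cc a.c
build b.o: cc b.c
build app: link a.o b.o
```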
[−] Sesse__ 46d ago
I find Meson's --help fairly useful, at least compared to the disaster that is CMake's. (Try to find out, as a user not experienced with either, how you'd make a debug build.) I agree that configure --help is more useful for surfacing project-specific options, though.
[−] aidenn0 46d ago
Ninja is possibly the best example of the "Do one thing and do it well" philosophy. All it does is execute commands based on a static build graph.

Its syntax is simple enough that it's trivial to e.g. write a shell script to generate the build items if you need dynamic dependencies.
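As a sketch of that approach (the gzip rule and the .txt file set are made up for illustration), a few lines of shell are enough to emit a valid ninja file:

```shell
#!/bin/sh
# Emit one compression job per .txt file found at generation time;
# ninja then handles staleness checking and parallel execution.
{
  echo 'rule gzip'
  echo '  command = gzip -c $in > $out'
  for f in *.txt; do
    echo "build $f.gz: gzip $f"
  done
} > build.ninja
```

Running `ninja` afterwards rebuilds only the .gz outputs whose sources changed; re-run the script whenever the set of inputs changes.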

[−] mizmar 46d ago
Similar to make, it compares the mtimes of dependencies against the target to determine whether the dependencies changed. This is flawed and simple to fool with filesystem operations that do not change mtime (move, rename):

1) pick a source file and make a copy of it for later
2) edit the selected source file and rebuild
3) move the copy back to its original location
4) try to rebuild: nothing happens
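The sequence above can be reproduced without ninja itself; a small Python sketch of a make-style mtime check (file names and the `needs_rebuild` helper are hypothetical):

```python
import os
import shutil
import tempfile

def needs_rebuild(target, dep):
    # make/ninja-style staleness check: rebuild only if the
    # dependency's mtime is newer than the target's.
    return os.path.getmtime(dep) > os.path.getmtime(target)

tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "main.c")
obj = os.path.join(tmp, "main.o")
bak = os.path.join(tmp, "main.c.bak")

# 1) create a source file and keep a copy (copy2 preserves the mtime)
with open(src, "w") as f:
    f.write("int main() { return 0; }\n")
shutil.copy2(src, bak)

# 2) "edit" the source and rebuild: both files get later timestamps
later = os.path.getmtime(src) + 10
with open(src, "w") as f:
    f.write("int main() { return 1; }\n")
os.utime(src, (later, later))
open(obj, "w").close()
os.utime(obj, (later + 1, later + 1))

# 3) move the old copy back over the source; the move keeps the copy's
#    older mtime, so the content changed but the check says "clean"
shutil.move(bak, src)

print(needs_rebuild(obj, src))  # → False: the stale build goes undetected
```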

[−] evmar 46d ago
[ninja author] I did some thinking about this problem and eventually revisited with what I think is a pretty neat solution. I wrote about it here: https://neugierig.org/software/blog/2022/03/n2.html
[−] actionfromafar 46d ago
Imagine if filesystems had exposed the file hash next to its mtime.
[−] oftenwrong 46d ago
I might be missing your sarcasm, but this is a common approach for large scale builds. Virtual filesystems are used to provide a pre-computed tree hash as an xattr. In a more typical case, you can read the git tree hash.
[−] actionfromafar 46d ago
Not sure it was meant as sarcasm really. I just think so many build (and other) problems could have been avoided if a file hash was available on every file by default.
[−] sagarm 45d ago
That hash would be expensive to maintain, and the end result would still be racy since the file could be modified after the hash was read.
[−] actionfromafar 45d ago
In the current POSIX paradigm yes, it would be expensive. But if the hash was defined as the hash of fixed blocks, it wouldn't be expensive. The raciness depends, a lot, on the semantics we would define. (In the context of a build system, it's no different than that the file could get a new mtime after we read the mtime.)
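A sketch of that fixed-block idea in Python (purely hypothetical; no mainstream filesystem exposes this today): only the blocks touched by a write need re-hashing, and a top-level hash over the per-block hashes stands in for the file hash.

```python
import hashlib

BLOCK = 4096  # fixed block size; an edit only invalidates the blocks it touches

def block_hashes(data: bytes) -> list:
    # One digest per fixed-size block of the file contents.
    return [hashlib.sha256(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]

def file_hash(data: bytes) -> bytes:
    # Top-level hash over the block hashes (a one-level Merkle tree).
    h = hashlib.sha256()
    for bh in block_hashes(data):
        h.update(bh)
    return h.digest()
```

A one-byte edit then changes exactly one block hash plus the cheap top-level hash, instead of forcing a full re-read of the file.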
[−] amavect 46d ago
By not tracking file metadata through an index file, mtime-only incremental build systems trade a lot of reliability for only slightly more simplicity. https://apenwarr.ca/log/20181113
[−] loeg 46d ago
Copy (1) and edit (2) both bump mtime, usually. It's not obvious that in the workflow you describe ninja is problematic, rather than the workflow itself (which is atypical).
[−] jasonpeacock 46d ago
My guess is that it's for drop-in compatibility with make.

There is (at least) one open issue about this - the solution/alternatives are not trivial:

https://github.com/ninja-build/ninja/issues/1459

[−] jhasse 46d ago
Good point. I think it would be fixable by using the Change Time instead of the Modify Time, because that changes when moving the copy over the original.
[−] hrmtst93837 46d ago
[flagged]
[−] Conscat 46d ago
An under-noticed ninja feature I adore, which was implemented relatively recently, is the ability to configure how its build progress is printed. In my fish config, I have the NINJA_STATUS envvar:

    set -x NINJA_STATUS "STEP: %f/%t  
    [%p / %P] 
    [%w + %W]
    "
This prints the elapsed and projected time in a readable multi-line format.
[−] jbonatakis 46d ago
Postgres uses Meson+Ninja in their builds. That seems like a pretty big endorsement.
[−] jonstewart 47d ago
The absolute best thing about coding agents is not having to waste time on build systems. I had Claude code port my autotools scripts to meson (which uses ninja) and it’s been a huge quality of life improvement.
[−] throwaway2046 46d ago
Ninja is great and feels natural coming from Make. What it lacks in features it makes up for with speed, which is what ultimately matters.

Also worth mentioning is samurai[1], a pure C implementation of Ninja that's almost as fast yet easier to bootstrap, needing only a C compiler.

[1] https://github.com/michaelforney/samurai

[−] Svoka 46d ago
Ninja's religious treatment of timestamps (mtime) as the 'modified' marker makes it useless with Git and large projects.

You switched some branches back and forth? Enjoy your 20-minute rebuild.

* https://github.com/ninja-build/ninja/issues/1459

[−] HiPhish 46d ago
Serious question: how can a build tool be fast or slow? From my understanding all it does is delegate the build steps to other tools, so wouldn't those be the bottleneck? Is it the resolution of order of build steps that takes so much time that a different build system can make a difference?
[−] sluongng 47d ago
My teammate had a great time reimplementing Ninja (slop-free) in Go here: https://github.com/buildbuddy-io/reninja to make it even faster with Remote Build Execution.
[−] p4bl0 46d ago
I first used ninja only a few years ago, when contributing to KDE software (Dolphin, Kate, KTextEditor, etc.). I had no prior experience with it and it was easy to pick up, so a rather good experience.
[−] throwaway613746 46d ago
[dead]
[−] chmorgan_ 47d ago
[dead]
[−] amelius 46d ago
I can remember having to uninstall ninja temporarily because it messed with building packages. I only use it because other packages need it.