We used ninja as a parallel task runner in pytype - we had our whole-project analyser generate a ninja file with a task graph, and then just invoked ninja to run it, taking care of dependencies and parallel execution. It worked very nicely indeed.
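A rough sketch of that pattern, with made-up task names and commands (not pytype's actual setup): emit a small ninja file describing the task graph, then let ninja schedule it in parallel.

    # Sketch: ninja as a parallel task runner (task names/commands are hypothetical).
    import subprocess

    tasks = {
        "a.out": (["a.src"], "analyze a.src > a.out"),
        "b.out": (["b.src"], "analyze b.src > b.out"),
        "report": (["a.out", "b.out"], "merge a.out b.out > report"),
    }

    with open("tasks.ninja", "w") as f:
        for i, (target, (deps, cmd)) in enumerate(tasks.items()):
            f.write(f"rule r{i}\n  command = {cmd}\n")
            f.write(f"build {target}: r{i} {' '.join(deps)}\n")

    # ninja works out the ordering and runs independent tasks in parallel.
    subprocess.run(["ninja", "-f", "tasks.ninja", "report"], check=True)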
If someone sees this: the ninja package on PyPI [0] is currently stuck at version 1.13.0. There is an issue in 1.13.0 that prevents it from building projects on Windows. The issue was fixed in 1.13.1 almost a year ago, but the PyPI package hasn't been updated, see [1], and many downstream projects have to stay at 1.11. I hope it gets updated soon.
[0] https://pypi.org/project/ninja/
[1] https://github.com/scikit-build/ninja-python-distributions/i...
Yes, but I mean... this is Python we're talking about. There are several build systems / coordinators written in Python (scons, colcon, etc.), not to mention Python packages that themselves contain compiled bits written in other languages.
I know nowadays we have formalized, cross-platform ways to build bindings (scikit-build-core, etc.), but that is a relatively recent development; for a long-ass time it was pretty commonplace to have a setup.py full of shell-outs to native toolchains and build tools. It's not hard to imagine a person in that headspace feeling like being able to pull that stuff directly from PyPI would be an upgrade over trying to detect that it's missing and instructing the user to install it before trying again.
What a messy and, frankly, absurd situation to be left in: forking a project in order to provide a tool through PyPI, only to then stop updating it on a broken version. That's more of a disservice than a service to the community... If you're going to stay stuck, better to drop the broken release and stay stuck on the previous working one.
All the main build tools (cmake, meson/ninja, and GNU configure) have different benefits. For instance, I expect "--help" to work, but only GNU configure really supports it as-is. I could list more advantages and disadvantages here, but by and large I prefer meson/ninja. To me it feels by far the fastest, and I usually have the fewest issues with it (excluding Python breaking its pip stack, but that's not meson's fault as such). ninja can be used via cmake too but most uses I see are from meson.
> ninja can be used via cmake too but most uses I see are from meson
How do you know, though, when the choice of CMake generator is entirely up to the user? E.g. you can't look at a CMake file and know which generator the user will pick to build the project.
FWIW I usually prefer the Ninja generator over the Makefile generator since ninja 'auto-parallelises' better - e.g. with the Makefile generator, the two 'simple' options are either to run the build single-threaded or to completely grind the machine to a halt, because the default 'parallel build' setting seems to heavily overcommit hardware resources. Ninja generally just does the right thing (run a parallel build, but with not so much parallelism that the computer becomes unusable).
ninja supports separate build groups (pools), each with its own maximum number of parallel jobs. CMake's ninja generator puts compilation and linking steps in their own respective groups. The end result is, by default, nproc parallel jobs for compilation but 1 job for linking. This helps because linking can be much more memory intensive, or sometimes the linker itself supports parallelism. Most projects only have a handful of linking steps to run anyway.
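For what it's worth, here is a rough illustration of a depth-1 pool for link steps, emitted with ninja_syntax.py (toy rules and file names, not what CMake actually generates):

    # Sketch: a build.ninja where at most one link job runs at a time.
    # Toy rules/files; ninja_syntax.py is vendored from the ninja repo.
    from ninja_syntax import Writer

    with open("build.ninja", "w") as f:
        n = Writer(f)
        n.pool("link_pool", depth=1)               # serialize link steps
        n.rule("cc", command="cc -c $in -o $out")
        n.rule("link", command="cc $in -o $out", pool="link_pool")
        n.build(["foo.o"], "cc", inputs=["foo.c"])
        n.build(["bar.o"], "cc", inputs=["bar.c"])
        n.build(["app"], "link", inputs=["foo.o", "bar.o"])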
I find Meson's --help fairly useful, at least compared to the disaster that is CMake's. (Try to find out, as a user not experienced with either, how you'd make a debug build.) I agree that configure --help is more useful for surfacing project-specific options, though.
Similar to make, it compares the mtimes of dependencies against the target to determine whether the dependencies changed. This is flawed and simple to fool with filesystem operations that do not change the mtime (move, rename), as the sketch after this list shows:
1) pick a source file and make a copy of it for later
2) edit the selected source file and rebuild
3) move the copy to its original location
4) try to rebuild: nothing happens
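A hedged sketch of those four steps (assumes an existing ninja build that compiles foo.c; file names are made up):

    # Sketch of the scenario above against a hypothetical foo.c -> foo.o build.
    import os, shutil, subprocess, time

    shutil.copy2("foo.c", "foo.c.bak")        # 1) copy2 preserves the old mtime
    time.sleep(1)

    with open("foo.c", "a") as f:             # 2) edit the source ...
        f.write("/* edited */\n")
    subprocess.run(["ninja"], check=True)     #    ... and rebuild: foo.o is now newer

    os.replace("foo.c.bak", "foo.c")          # 3) rename keeps the backup's old mtime

    subprocess.run(["ninja"], check=True)     # 4) "ninja: no work to do." even though
                                              #    foo.c's contents just changed back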
By not tracking file metadata through an index file, mtime-only incremental build systems trade a lot of reliability for only slightly more simplicity. https://apenwarr.ca/log/20181113
Copy (1) and edit (2) both bump the mtime, usually. It's not obvious that ninja is the problem in the workflow you describe, rather than the workflow itself (which is atypical).
Good point. I think it would be fixable by using the change time (ctime) instead of the modification time, because that does change when the copy is moved over the original.
There is (at least) one open issue about this - the solution/alternatives are not trivial:
https://github.com/ninja-build/ninja/issues/1459
An under-noticed ninja feature I adore, which was implemented relatively recently, is the ability to configure how its build progress is printed. In my fish config I set the NINJA_STATUS envvar to a format that prints the time elapsed and projected in a readable multi-line format.
Some blog posts from the creator of ninja:
https://neugierig.org/software/blog/2018/07/options.html
https://neugierig.org/software/blog/2011/04/complexity.html
Also, there was a post about why just generating ninja files using Python can be a good option. I do this in my project and it has been very productive so far. I couldn't find that post now, but it suggested using ninja_syntax.py from the ninja codebase and just doing something minimal for your project.
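A minimal sketch of that approach for a tiny C project, assuming ninja_syntax.py has been copied next to the script (file names and flags are illustrative):

    # Sketch: generate build.ninja with ninja_syntax.py (vendored from the ninja repo).
    from ninja_syntax import Writer

    with open("build.ninja", "w") as f:
        n = Writer(f)
        n.variable("cflags", "-O2 -Wall")
        # depfile/deps let ninja track the header dependencies the compiler reports
        n.rule("cc",
               command="cc $cflags -MMD -MF $out.d -c $in -o $out",
               depfile="$out.d", deps="gcc")
        n.rule("link", command="cc $in -o $out")
        for src in ["main.c", "util.c"]:
            n.build([src.replace(".c", ".o")], "cc", inputs=[src])
        n.build(["app"], "link", inputs=["main.o", "util.o"])
        n.default(["app"])

Re-run the script whenever the file list changes, and a plain ninja invocation does the rest.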
https://neugierig.org/software/blog/2018/07/options.html
(hello FreeCAD ;)
> Options are bad user experience
> A good tool solves its purpose so directly that it's not even noticed. Great designs need fewer options because they just work as is. The reason the user opened the software is to achieve some end, not poke around in the software's options. Maybe you can come up with a better design that addresses the use cases of the option?
This is the opinion of people who don't use the SW they wrote.
And no, gray on gray is not an option.
Unfortunately, we are forced to use SW bought by other departments, otherwise nobody would use that crap.
You can't achieve an end when the crap you use isn't working.
It's good to be critical about options, but ultimately people and their needs are diverse and good tools recognize that too.
https://neugierig.org/software/chromium/notes/2011/02/ninja....
Hence, it's used in a lot of Google projects.
Its syntax is simple enough that it's trivial to e.g. write a shell script to generate the build items if you need dynamic dependencies.