Yes, I will do that. It takes a couple of hours to build them all so I plan to start a batch job when I call it a day here and push them in the morning.
I’ve pushed a new Docker image for Ubuntu 18.04 that is used in the pipeline to build the portable source and triggered a new pipeline run. It came back green.
FYI the autotools setup does not check for the presence of ps2pdf. So if you clone the current head and try to build without it, you end up staring at the following confusing error:
The error itself makes sense, but if it’s late in the day and your coffee buzz is waning, you (me) spend a lot of time thinking to yourself, “well, piping something into /bin/false is obviously going to fail, isn’t it?” Then you bisect, stare at the diff, and realize a missing ps2pdf is actually your problem.
I think we need to check for Ghostscript somewhere during configuration.
The check for ps2pdf is done from the configure script, which is generated by autogen.sh and run during the first phase of generating the portable sources.
The errors from “false” started appearing yesterday because I removed the preceding “-” character in the Makefile, which had been suppressing them.
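For anyone following along, this is the shape of the change; a sketch with a hypothetical target, not the exact rule from the repo:

```makefile
# Sketch only: hypothetical target names, not the exact rule in the repo.
# PS2PDF is filled in by configure: "ps2pdf" if found, otherwise "false".
dot.pdf: dot.ps
	$(PS2PDF) $< $@
# The recipe line previously began with "-", i.e. "-$(PS2PDF) $< $@";
# a leading "-" tells make to ignore the command's exit status, which is
# what was silently suppressing the failures.
```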
The cause of the errors was that Ghostscript was not installed, so ./configure didn’t find ps2pdf and instead substituted “false” for @PS2PDF@, which correctly produced the errors that had previously been incorrectly suppressed.
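Roughly, the configure side works like this (a sketch; the actual check in configure.ac may be spelled differently):

```m4
# Sketch of the configure.ac logic (the real check may differ): look for
# ps2pdf on PATH, and fall back to "false" when it is absent.
AC_CHECK_PROG([PS2PDF], [ps2pdf], [ps2pdf], [false])
# @PS2PDF@ in Makefile.in then expands to either "ps2pdf" or "false".
```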
Yes, I think I followed this. However, my confusion was that I expected ./configure to fail if it found something missing that would result in a subsequent make failing. Is this just an incorrect assumption on my part?
So if you clone the current head and try to build without it, you end up staring at the following confusing error
Does “it” refer to autogen.sh?
You have to always run autogen.sh after a git clone or git pull. Otherwise your Makefiles are likely to be out of date.
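In other words, the usual sequence after a fresh clone or pull is something like the following (a sketch; depending on the setup, autogen.sh may also run ./configure for you):

```sh
./autogen.sh    # regenerate configure, Makefile.in, etc. from the checkout
./configure     # probe the system and substitute @PS2PDF@ and friends
make
```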
Hope I’m not coming across as coffee deficient … just living under a quarantine and a curfew here in Florida tonight. I’m missing my walks to my favorite coffee shop for peaceful hacking sessions on my laptop.
No, sorry, “it” referred to ps2pdf. Right now (commit f38013efd5833c2638b6a7b0f24754a3bd55bb88) if you clone Graphviz into a new directory and you don’t have Ghostscript installed, you experience this:
Yes, correct, ps2pdf from the ghostscript package is required.
Previously, the errors resulting from it not being installed were suppressed, and as a result many .pdf files in the graphviz-doc package were zero-sized.
Yes, I follow that. Sorry, I feel like we’re talking at cross purposes here. I agree that these commands should fail the build if they fail, and I’m not sure why their failure was previously ignored. However, I was expecting ./configure to notice the absence of ps2pdf and either (a) fail itself or (b) conditionally disable that part of the build. Am I expecting the wrong outcome here? I don’t disagree with the changes made prior, but I’m wondering if we should make ./configure check for ps2pdf.
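For option (b), the usual autotools idiom would be an Automake conditional, something like this sketch (the conditional name is invented for illustration):

```m4
# Hypothetical sketch of option (b): detect ps2pdf, then let Makefile.am
# wrap the PDF rules in "if HAVE_PS2PDF ... endif" so they are skipped
# cleanly instead of failing.
AC_CHECK_PROG([PS2PDF], [ps2pdf], [ps2pdf], [false])
AM_CONDITIONAL([HAVE_PS2PDF], [test "$PS2PDF" != "false"])
```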
I don’t think we have any, or at least not many, hard errors from ./configure. Mostly ./configure tries to see what is there and select alternatives if possible.
I guess we can debate whether the error for a missing ps2pdf should be generated by configure or by something else. To me the problem was that the error wasn’t being reported at all, it was silent, and the generated graphviz-doc package was essentially useless.
We could probably add some clearer warning from configure for missing ps2pdf … but at least it isn’t silent now.
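Something like this in configure.ac would probably do; a sketch, untested:

```m4
# Sketch of a clearer configure-time warning for a missing ps2pdf (untested):
AC_CHECK_PROG([PS2PDF], [ps2pdf], [ps2pdf], [false])
if test "$PS2PDF" = "false"; then
  AC_MSG_WARN([ps2pdf not found (install ghostscript); PDF docs will not build])
fi
```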
Fair enough. I guess that’s sort of the underlying reason behind graphviz#1707 too.
By the way, I asked somewhere else but never really got a definitive answer: why do we have an autotools build system as well as a CMake one? Is one deprecated and we should be aiming to remove it?
Autotools is the older one, the one I’ve mostly maintained for the last 15? years. I introduced it at a time when we had just Makefiles.
CMake is more recent, and not of my creation or maintenance. If it’s better in some way then I’m not resisting, but I do want autotools maintained until something else is fully capable on all distros. Autotools is still fairly common in the Red Hat world, I think.
I have a preference for CMake, but only because I don’t know autotools well. However, my main concern is that we seem to be maintaining two build systems in parallel. The CMake one seems untested and, judging by Stephen’s experience, partially broken. Should we remove it and retain it in a branch for the future? It doesn’t feel like we have the resources (or a reason?) to maintain two functioning build systems at once.
FYI: CMake was introduced by Erwin Janssen in late 2016 for use on Windows, where we also have two build systems, CMake and MSBuild. We test both in CI on AppVeyor.
I think the reason you seem to be talking at cross purposes above may partly be that it is somewhat unclear what the overall philosophy of the build system should be. Should it always try to build as much as possible, given what is installed on the system it runs on, and keep quiet (not fail) about the rest, or should it fail when it cannot build something? There seems to be a mixture of the two here, which may also be fine if it’s clear what should be considered essential and cause a failure when it cannot be built.
That said, during my work with the Dockerfiles I would have much preferred it to fail on everything it couldn’t build, because then it would have been much easier to detect when something was missing. What I had to do instead was diff the log files from the (then) existing builds on John’s private runners against those from my Docker runs to see whether the same things were built or not built (per platform). I even made a small tool to do it for me. I still find this cumbersome when making changes to the Dockerfiles.
That said (again), I do realize that such a methodology would not serve end users well. They wouldn’t be able to build anything without errors if they couldn’t build everything. Which nicely leads up to the question you were discussing: should a missing ps2pdf fail the build or not, and if so, at what stage? If I’m an end user I would say no; if I’m a maintainer of Dockerfiles I would say yes, as early as possible.
We have to decide (or make it clear, if it’s already decided) on the overall philosophy before going into details. If there was a way (maybe there is) to have the end-user way be the default and get the maintainer way as an option, that would be awesome.
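For what it’s worth, autoconf can express exactly that split. A hypothetical sketch, with the option name invented for illustration:

```m4
# Hypothetical --enable-strict-deps switch (name invented for illustration):
# the default stays permissive for end users, while maintainers can opt in
# to hard failures on missing optional tools.
AC_ARG_ENABLE([strict-deps],
  [AS_HELP_STRING([--enable-strict-deps],
    [fail configure when optional tools such as ps2pdf are missing])],
  [strict_deps=$enableval], [strict_deps=no])

AC_CHECK_PROG([PS2PDF], [ps2pdf], [ps2pdf], [false])
if test "$PS2PDF" = "false"; then
  if test "x$strict_deps" = "xyes"; then
    AC_MSG_ERROR([ps2pdf not found; required with --enable-strict-deps])
  else
    AC_MSG_WARN([ps2pdf not found; PDF documentation will not be built])
  fi
fi
```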