Requesting new Docker images with ghostscript

Magnus,

Would you be so kind as to build some new Docker images at your earliest convenience?

I added ghostscript as a BuildRequires to the spec files. So now the builds are all failing … but at least not silently any longer :wink:

As mentioned separately, I also removed the requirement for glade3-libgladeui-devel from the spec files for Fedora 33 and later.

Cheers

Yes, I will do that. It takes a couple of hours to build them all so I plan to start a batch job when I call it a day here and push them in the morning.

They are not failing because of that. It’s because of your previous commit. See https://forum.graphviz.org/t/error-unable-to-find-a-match-glade3-libgladeui-devel/146/17

I’ve pushed images for CentOS 7 & 8 now and triggered a new pipeline run.

Still not getting PDFs in graphviz-doc

Most PDFs are built during the portable-source stage. Did you update the Ubuntu Docker images?

No sorry. I will do it now.

I’ve pushed a new Docker image for Ubuntu 18.04, which is used in the pipeline to build the portable source, and triggered a new pipeline run. It all became green :sunglasses:

Thanks. That was it. Now the installed PDF files from the graphviz-doc package have non-zero length, e.g. /usr/share/doc/graphviz-doc/pdf/acyclic.1.pdf


FYI the autotools setup does not check for the presence of ps2pdf. So if you clone the current head and try to build without it, you end up staring at the following confusing error:

groff -Tps -man ./cdt.3 | false - - >cdt.3.pdf
make[3]: *** [Makefile:1113: cdt.3.pdf] Error 1
make[3]: Leaving directory '/tmp/tmp.BcKHhQOZU0/graphviz/lib/cdt'
make[2]: *** [Makefile:591: all-recursive] Error 1
make[2]: Leaving directory '/tmp/tmp.BcKHhQOZU0/graphviz/lib'
make[1]: *** [Makefile:835: all-recursive] Error 1
make[1]: Leaving directory '/tmp/tmp.BcKHhQOZU0/graphviz'
make: *** [Makefile:643: all] Error 2

The error itself makes sense, but if it’s late in the day and your coffee buzz is waning you (me) spend a lot of time thinking to yourself, “well piping something into /bin/false is obviously going to fail, isn’t it?” Then you bisect, stare at the diff, and realize a missing ps2pdf is actually your problem.

I think we need to check for Ghostscript somewhere during configuration.

The check for ps2pdf is done from the configure script, which is generated by autogen.sh and run during the first phase that generates the portable sources.

The errors from “false” started appearing yesterday because I removed the preceding “-” character in the Makefile, which had been suppressing them.

The cause of the errors was that ghostscript was not installed, so ./configure didn’t find ps2pdf and instead substituted “false” for @PS2PDF@, which correctly produced the errors that had previously been incorrectly suppressed.
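The “-” suppression John describes is generic make behaviour: prefixing a recipe line with “-” makes make ignore that line’s exit status. A minimal, self-contained demonstration with a toy makefile (not the real Graphviz one; the .RECIPEPREFIX trick needs GNU make 3.82+):

```shell
# Toy makefile illustrating '-' error suppression; '>' stands in for
# the leading recipe tab so this survives copy/paste (GNU make 3.82+).
cat > /tmp/demo.mk <<'EOF'
.RECIPEPREFIX = >
suppressed:
> -false
> @echo still reached
strict:
> false
> @echo never reached
EOF

# With '-', make reports the failure as "(ignored)" and carries on, exiting 0:
make -f /tmp/demo.mk suppressed

# Without '-', make stops with an error, just as the cdt.3.pdf rule did:
make -f /tmp/demo.mk strict || echo "strict target failed, as expected"
```

This is exactly why the zero-length PDFs went unnoticed: the failing ps2pdf line was being “(ignored)”, so the build stayed green.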

John

Yes, I think I followed this. However, my confusion was that I expected ./configure to fail if it found something missing that would result in a subsequent make failing. Is this just an incorrect assumption on my part?

So if you clone the current head and try to build without it, you end up staring at the following confusing error

Does “it” refer to autogen.sh?

You have to always run autogen.sh after a git clone or git pull. Otherwise your Makefiles are likely to be out of date.

Hope I’m not coming across as coffee deficient … just living under a quarantine and a curfew here in Florida tonight. I’m missing my walks to my favorite coffee shop for peaceful hacking sessions on my laptop.

John

No, sorry, “it” referred to ps2pdf. Right now (commit f38013efd5833c2638b6a7b0f24754a3bd55bb88) if you clone Graphviz into a new directory and you don’t have Ghostscript installed, you experience this:

$ git clone https://gitlab.com/graphviz/graphviz
...
$ cd graphviz
$ ./autogen.sh
...
$ echo $?
0
$ ./configure
...
$ echo $?
0
$ make
...
groff -Tps -man ./cdt.3 | false - - >cdt.3.pdf
make[3]: *** [Makefile:1113: cdt.3.pdf] Error 1
make[3]: Leaving directory '/tmp/tmp.FGrbRGWdJV/graphviz/lib/cdt'
make[2]: *** [Makefile:591: all-recursive] Error 1
make[2]: Leaving directory '/tmp/tmp.FGrbRGWdJV/graphviz/lib'
make[1]: *** [Makefile:835: all-recursive] Error 1
make[1]: Leaving directory '/tmp/tmp.FGrbRGWdJV/graphviz'
make: *** [Makefile:643: all] Error 2

This is on Debian Buster, but I would imagine the experience is the same across Linux distributions.

Yes, correct, ps2pdf from the ghostscript package is required.

Previously the errors resulting from it not being installed were suppressed, and as a result many .pdf files in the graphviz-doc package were zero-sized.

Those errors are no longer suppressed.

Yes, I follow that. Sorry, I feel like we’re talking at cross purposes here. I agree that these commands should fail the build if they fail, and I’m not sure why their failure was previously ignored. However, I was expecting ./configure to notice the absence of ps2pdf and either (a) fail itself or (b) conditionally disable that part of the build. Am I expecting the wrong outcome here? I don’t disagree with the changes made prior, but I’m wondering if we should make ./configure check for ps2pdf.

I don’t think we have any, or at least not many, hard errors from ./configure. Mostly ./configure tries to see what is there and select alternatives if possible.

I guess we can debate whether the error for a missing ps2pdf should be generated by configure or by something else. To me the problem was that the error wasn’t being reported at all; it was silent, and the generated graphviz-doc package was essentially useless.

We could probably add a clearer warning from configure for a missing ps2pdf … but at least it isn’t silent now.
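For what it’s worth, a warning like that could be a one-liner. A hypothetical configure.ac fragment, sketched here without having checked how Graphviz currently wires up @PS2PDF@, so treat the variable and macro usage as assumptions:

```m4
# Hypothetical sketch: substitute "false" when ps2pdf is absent,
# as the thread describes, but warn loudly instead of staying silent.
# AC_CHECK_PROG also AC_SUBSTs the variable for use as @PS2PDF@.
AC_CHECK_PROG([PS2PDF], [ps2pdf], [ps2pdf], [false])
AS_IF([test "x$PS2PDF" = "xfalse"],
      [AC_MSG_WARN([ps2pdf (ghostscript) not found; PDF documentation will fail to build])])
```

With this, a plain ./configure run would still succeed, but the missing dependency would at least be visible in the configure output.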

Fair enough. I guess that’s sort of the underlying reason behind graphviz#1707 too.

By the way, I asked somewhere else but never really got a definitive answer: why do we have an autotools build system as well as a CMake one? Is one deprecated and we should be aiming to remove it?


Autotools is the older one, the one I’ve mostly maintained for the last 15? years. I introduced it at a time when we had just Makefiles.

CMake is more recent, and not of my creation or maintenance. If it’s better in some way then I’m not resisting, but I do want autotools maintained until something else is fully capable on all distros. Autotools is still fairly common in the Red Hat world, I think.


I have a preference for CMake, but only because I don’t know autotools well. However, my main concern is that we seem to be maintaining two build systems in parallel. The CMake one seems untested and, judging by Stephen’s experience, is partially broken. Should we remove it and retain it in a branch for the future? It doesn’t feel like we have the resources (or a reason?) to maintain two functioning build systems at once.


FYI: CMake was introduced by Erwin Janssen late 2016 for use on Windows where we also have two build systems, CMake and MSBuild. We test both in CI in Appveyor.

I think the reason you seem to be talking at cross purposes above may partly be that it is somewhat unclear what the overall philosophy of the build system should be. Should it always try to build as much as possible, given what is installed on the system it runs on, and keep quiet (not fail) about the rest, or should it fail when it cannot build something? There seems to be a mixture of both here. That may also be fine, provided it’s clear what is considered essential and causes a failure when it cannot be built.

That said, during my work with the Dockerfiles, I would much have preferred it to fail on everything it couldn’t build, because then it would have been far easier to detect when something was missing. What I had to do instead was diff the log files from the (then) existing builds on John’s private runners against those from my Docker runs to verify that the same things were, or were not, built (per platform :cold_sweat:). I even made a small tool to do it for me. I still find this cumbersome when making changes to the Dockerfiles.

That said (again), I do realize that such a methodology would not serve end users well. They wouldn’t be able to build anything without errors if they couldn’t build everything. Which nicely leads to the question you were discussing: should a missing ps2pdf fail the build or not, and if so, at what stage? As an end user I would say no; as a maintainer of Dockerfiles I would say yes, as early as possible.

We have to decide (or make it clear if it’s already decided) on the overall philosophy before going into details. If there were a way (maybe there is) to have the end-user way be the default and get the maintainer way as an option, that would be awesome.
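One way to get that split could be a configure switch. A hypothetical sketch (the option name --enable-strict-deps is invented for illustration, and it assumes the convention from earlier in the thread of PS2PDF being set to “false” when the tool is missing):

```m4
# Hypothetical: the default (end-user) behaviour degrades gracefully,
# while maintainers can opt in to hard failures at configure time.
AC_ARG_ENABLE([strict-deps],
  [AS_HELP_STRING([--enable-strict-deps],
    [fail configure when an optional tool such as ps2pdf is missing])])
AS_IF([test "x$enable_strict_deps" = "xyes" && test "x$PS2PDF" = "xfalse"],
  [AC_MSG_ERROR([ps2pdf (ghostscript) not found and --enable-strict-deps was given])])
```

The Docker/CI builds would then pass --enable-strict-deps and stop at configure time, while end users would keep the current best-effort behaviour.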