Hi,

When compiling gnupdf I noticed that features are enabled or disabled based on whether the corresponding libraries are present on the system.
I don't like this, because users usually don't know whether they compiled the library the recommended way or not. For example, some time ago I recompiled Gtk because I wanted to try a newer Gimp and my distro's Gtk was too old. So I got the tarball and ran ./configure; make; make install, etc. Well, it was not quite that easy: a number of base dev dependencies had to be installed first. But at the next reboot my desktop was a mess, because suddenly Gtk didn't have SVG support anymore! I want Gtk to require SVG unless I explicitly disable it.

So in my projects (namely FreeDink), 'configure' enables or disables dependencies based on --enable/--disable options only. Library detection produces errors about missing dependencies; it doesn't silently change the default build configuration based on the absence (or misdetection) of a library. Autoconf tests are still used normally to replace missing host POSIX features - that is, autodetection is used where it doesn't change the behavior of the program.

Similarly, FreeDink supports two Zip libraries (zziplib and libzip, used to load resources embedded in the executable) and automatically compiles against the first one it finds - this has no functional impact. And this can be explicitly disabled when there's no need for it (e.g. for distros that strip packaged executables).

Cf. http://git.savannah.gnu.org/cgit/freedink.git/tree/configure.ac

Having a clear and static recommended build was welcomed by packagers, who sometimes (mistakenly) view autoconf with suspicion because a lot of 'configure' scripts don't behave consistently. It also saves people time by avoiding a few recompilations to enable features that were disabled due to a missing -dev package.

What do you think? I can provide patches for this; it isn't much work :)

-- Sylvain
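P.S. Here is a minimal sketch of the pattern I mean, for a hypothetical SVG feature (illustrative names, not taken from FreeDink's actual configure.ac): the feature defaults to enabled, and a missing library is a hard configure error rather than a silent downgrade.

```m4
dnl Hypothetical configure.ac fragment (illustrative only).
dnl SVG support is on by default; absence of the library is an error,
dnl never a silent change of the default build configuration.
AC_ARG_ENABLE([svg],
  [AS_HELP_STRING([--disable-svg], [disable SVG support (default: enabled)])],
  [], [enable_svg=yes])

AS_IF([test "x$enable_svg" != "xno"],
  [PKG_CHECK_MODULES([RSVG], [librsvg-2.0],
     [AC_DEFINE([HAVE_SVG], [1], [Define if SVG support is enabled])],
     [AC_MSG_ERROR([librsvg not found; install its -dev package or pass --disable-svg])])])
```

The point is that the only way to get a build without SVG is an explicit --disable-svg on the command line, so packagers and users always know which configuration they built.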
