Re: How to force libtool to use CXX mode?

2024-05-13 Thread Bob Friesenhahn

On 5/13/24 16:37, Karl Berry wrote:

 convince Automake to force libtool to link using the C++ compiler

When there are no C++ sources? Why? Just trying to understand ...
There are no C++ sources in my project, but there is a C++ compiler 
used for the overall build. As clarification, this is for oss-fuzz's 
clang ubsan build, which is an almost completely static build, so there 
are no shared libraries to supply their dependencies by default.  When 
clang++ is enabled for ubsan, it adds additional dependency libraries 
(presumably -lubsan). The theory is that if linking is done using the 
same compiler, then the dependency libraries brought in by the compiler 
mode will be applied automatically.

I'm sorry Bob, but I just don't know.  Maybe the just-released
libtool-2.5.0 alpha offers some new help?

I don't think so.

If there is some bug in or feature for Automake that would help, I'm
open to suggestions (and patches). It kind of sounds like more on the
libtool side? --sorry, karl.


Automake does have a critical bug: a target which only *optionally* 
contains C++ sources is always linked using C++, even when no C++ 
source is actually included. Without this issue, the trick of adding an 
empty optional C++ source file to the build would work. But I do not 
want GraphicsMagick to require a C++ compiler.


I have been working to supply replacements for Automake's target 
definitions, but it is very messy and not future-safe.


Without support for this in Automake, I feel that there is too much 
effort and too much future risk.


Bob

--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt




How to force libtool to use CXX mode?

2024-05-12 Thread Bob Friesenhahn
I have expended quite a few days already (over a six-month span) 
attempting to convince Automake to force libtool to link using the C++ 
compiler.  I tried optionally adding an empty C++ source file to the 
target build, but this does not work because Automake then always 
assumes C++ linkage, even if no C++ source files were actually 
included. I previously encountered a libtool-related bug where libtool 
determines that the C compiler needs -lm but the C++ compiler does not, 
strips the -lm, and produces a failed link.


Today I read this Automake manual page:

https://www.gnu.org/software/automake/manual/html_node/Libtool-Flags.html

and so I used the program_LIBTOOLFLAGS or library_LIBTOOLFLAGS to add 
--tag=CXX to the options.  I thought that this must be working.


I found that it is not working as desired.  I end up with this bad 
command for compiling C code (I did not want --tag=CXX here!):


    /bin/bash ./libtool  --tag=CC --tag=CXX  --mode=compile clang ...

and this command to link the C code, which should have used the C++ 
compiler (clang++!):


    /bin/bash ./libtool  --tag=CC --tag=CXX  --mode=link clang

Libtool also misreports the tag it is using: it reports CCLD even when 
the tag is presumably overridden to CXX.


The end result fails.

Is there a way to solve this problem short of replicating many 
modified Makefile.in hunks into Makefile.am?


I am faced with backing out my attempt for yet another time. :-(

Bob





Re: GNU Coding Standards, automake, and the recent xz-utils backdoor

2024-04-02 Thread Bob Friesenhahn

On 4/2/24 16:42, Richard Stallman wrote:


[[[ To any NSA and FBI agents reading my email: please consider]]]
[[[ whether defending the US Constitution against all enemies, ]]]
[[[ foreign or domestic, requires you to follow Snowden's example. ]]]

   > My first thought was that Autoconf is a relatively trivial attack vector
   > since it is so complex and the syntax used for some parts (e.g. m4 and
   > shell scripts) is so arcane.  In particular, it is common for Autotools
   > stuff to be installed on a computer (e.g. by installing a package from
   > an OS package manager) and then used while building.  For example, there
   > are large collections of ".m4" files installed.  If one of the m4 files
   > consumed has been modified, then the resulting configure script has been
   > modified.

Can anyone think of a feasible way to prevent this sort of attack?
A common way would be to use PGP signing to bless a set of files. 
Perhaps a manifest specifying the file names/paths and their sha256 
checksums would be sufficient.  But there needs to be a way to augment 
this in case there are multiple collections of blessed files, including 
those blessed by the user.
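As a rough illustration of the manifest idea, here is a minimal shell 
sketch.  The directory and file names are my own invention, and the gpg 
signing of the manifest itself is only noted in a comment:

```shell
# Sketch of a sha256 manifest for a collection of "blessed" files.
# A real deployment would additionally sign MANIFEST.sha256 with gpg
# (e.g. gpg --detach-sign MANIFEST.sha256) so the manifest is blessed too.
mkdir -p blessed-m4
printf 'AC_DEFUN([MY_MACRO], [:])\n' > blessed-m4/my_macro.m4

# Record the sha256 of every blessed file in a manifest.
(cd blessed-m4 && sha256sum ./*.m4 > MANIFEST.sha256)

# Later, before the files are consumed, verify them against the manifest;
# sha256sum -c exits non-zero if any file was modified.
(cd blessed-m4 && sha256sum -c MANIFEST.sha256)
```

The same check could run both at bootstrap time (while copying files 
into a project) and afterwards, to detect later modification.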

   > It may be that an OS package manager

What is an "OS package manager"?


A popular OS package manager is Debian's 'apt'. Well-designed package 
managers provide a way to test whether the files installed on the 
system have been modified.


But I only use this as an example, since I don't think that any GNU 
build system should depend on something specific to an operating system.



Could you say concretely what this would do?  Which files do you have
in mind?  The m4 files discussed above?


M4 files, scripts, templates, and any other standard files which may be 
assimilated as part of the build process.



   > If installed files were themselves independently signed (or sha256s of
   > the files are contained in a signed manifest), and Autotools was able to
   > validate them while copying into a project ("bootstrapping"), then at
   > least there is some assurance that the many files which were consumed
   > have not been subverted.

Is this a proposal to deal with the problem described above?  I think
maybe it is, but things are not concrete enough for me to tell for
certain.


I do not think that it would solve the specific issues which led to 
the xz-utils backdoor, but it may solve a large class of issues which 
have been ignored up until now.  People preparing operating system 
distributions solve such issues via the extensive (and repeatable) 
processes that they use.


GNU software developers are less likely (or able) to solve issues via 
extensive processes.  They expect that 'make distcheck' will prepare a 
clean distribution tarball.


Bob





Re: GNU Coding Standards, automake, and the recent xz-utils backdoor

2024-04-02 Thread Bob Friesenhahn

I'm also wondering whether the GNU system should recommend using zstd
instead of or in addition to xz for compression purposes.  Automake
gained support for dist-zstd back in 2019 [1], but I'm not sure how
many projects are using it yet.

[1] https://git.savannah.gnu.org/cgit/automake.git/commit/?id=5c466eaf


For several years, GraphicsMagick distributed a tarball compressed to 
zstd format.  This started before Automake offered support for it.


I used these rules:

# Rules to build a .tar.zst tarball (zstd compression)
dist-zstd: distdir
    tardir=$(distdir) && $(am__tar) \
      | ZSTD_CLEVEL=$${ZSTD_CLEVEL-22} zstd --ultra -c > $(distdir).tar.zst
    $(am__post_remove_distdir)

With these options, the zst tarball came within a hair's breadth of 
the xz-compressed file size.  I did not find any drawbacks.


I also had good experience with 'lzip', which has the benefit of a very 
small implementation and more compact coding than xz uses.


I stopped distributing anything but xz format since that is what almost 
everyone was choosing to download.


Bob





Re: GNU Coding Standards, automake, and the recent xz-utils backdoor

2024-03-31 Thread Bob Friesenhahn

I think it is pretty clear by now. [1][2][3]

[1] https://boehs.org/node/everything-i-know-about-the-xz-backdoor
[2] https://news.ycombinator.com/item?id=39865810
[3] https://www.youtube.com/watch?v=Kw8MCN5uJPg


There is not much one can do when a maintainer with signing/release 
power does something intentionally wrong.


My GraphicsMagick oss-fuzz builds include xz and are still working 
(though with a few security issues open due to problems in xz). The URL 
used is https://github.com/xz-mirror/xz. When I visit that URL, I see 
the message "This repository has been archived by the owner on Aug 28, 
2023. It is now read-only.", so it seems that this is a stale 
repository.  Its upstream repository has been disabled.


Regardless, how can Autotools-based projects be more assured of 
security given how they are selectively assembled from "parts"? I have 
already been concerned about using any Autotools packages provided by 
the operating system, since they are likely dated but may also have 
been modified by the distribution package maintainers.


Besides GNU Autoconf, Automake, and libtool, there are also several 
popular Autoconf macro archives. Sometimes components are automatically 
downloaded via build scripts. This is not at all a "safe" situation. 
There is quite a lot of trust, which may be unwarranted.


Should the GNU project itself perform an independent file verification 
of included Autotools files (Autoconf .m4 files, scripts, libtool, etc.) 
for all of the packages it distributes? Besides verifying the original 
files which are re-distributed, it might be necessary to verify that 
generated files are correct, and are in fact based on the files which 
are re-distributed.


Bob





Re: GNU Coding Standards, automake, and the recent xz-utils backdoor

2024-03-31 Thread Bob Friesenhahn

On 3/30/24 19:00, Alexandre Oliva wrote:


Bluntly, I don't think it would help with security.  The attacker would
just have to disable or adjust the distcheck target to seemingly pass.

Relying on something in a code repository to tell whether the repository
is secure is akin to tying a dog with sausage.

For security proper, the verification code needs to be held elsewhere,
not compromisable along with the thing it's supposed to verify.

Analogously, you don't run a rootkit checker on the system that's
potentially compromised, because the rootkit may hide itself; you boot
off secure media and then use the tools in it to look for the rootkit in
the potentially-compromised system, *without* handing control over to
it.


I am on the oss-security mailing list where this issue was perhaps first 
publicly reported, and has been discussed/analyzed furiously.


My first thought was that Autoconf is a relatively trivial attack vector 
since it is so complex and the syntax used for some parts (e.g. m4 and 
shell scripts) is so arcane.  In particular, it is common for Autotools 
stuff to be installed on a computer (e.g. by installing a package from 
an OS package manager) and then used while building.  For example, there 
are large collections of ".m4" files installed.  If one of the m4 files 
consumed has been modified, then the resulting configure script has been 
modified.


It may be that an OS package manager has the ability to validate already 
installed files, but this is not likely to be used.


If installed files were themselves independently signed (or sha256s of 
the files are contained in a signed manifest), and Autotools was able to 
validate them while copying into a project ("bootstrapping"), then at 
least there is some assurance that the many files which were consumed 
have not been subverted.  The same signed data could be used to detect 
if the files are modified after the initial bootstrap.


It seems common for OS distributions to modify some of the files 
(especially libtool related) so they differ from the original GNU versions.


The problem which happened with the xz-utils software is that the 
maintainer signed a release package with his PGP key, but there were 
subtle changes in the released product.  It is not yet clear whether 
the maintainer did this intentionally, or the changes were introduced 
via a compromise of his computer.


Bob





Re: C library promoted to C++ linkage due to optional C++ source module

2024-03-09 Thread Bob Friesenhahn




Hi Bob,

 It is my opinion that if a library or program
 is linked with C++ libraries (and especially if it is a static build!)
 that it should be linked using the C++ linker.  Likewise, if a library
 or program does not depend on, or contain any C++ code, it should be
 linked with the C linker.

I surely agree, in principle. But that doesn't mean I have the faintest
idea what to change in Automake to make that be the case. You probably
know better than me :). Can you provide (as simple as possible) test cases?


I have since backed away from this approach because I saw that the 
clever way Automake handles conditionals is not going to be smart 
enough to make language decisions based on whether a file is added or 
removed.  The compiler/linker type to use is already baked into 
Makefile.in, and the choice is based on the optional source files as 
well.  If a C++ file may appear, the language is promoted from C to 
C++ (--tag=CXX) and that choice is baked into Makefile.in.


I see code like this appearing in Makefile.in:

Magick___lib_libGraphicsMagick___la_LINK = $(LIBTOOL) $(AM_V_lt) \
    --tag=CXX $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=link \
    $(CXXLD) $(AM_CXXFLAGS) $(CXXFLAGS) \
    $(Magick___lib_libGraphicsMagick___la_LDFLAGS) $(LDFLAGS) -o \
    $@

As such, it seems that this would be a major feature request 
requiring a significant redesign, and is not a simple bug.


It may be possible to replace this Makefile hunk with an override, but 
that may not be reliable.
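For reference, a hypothetical sketch of such an override in 
Makefile.am, with an illustrative library name.  It mirrors the 
generated variable shown above but substitutes the C tag and linker; 
as noted, it replicates undocumented Automake internals:

```makefile
# Hypothetical Makefile.am override forcing C linkage for one libtool
# library target (libfoo.la is illustrative).  This replicates an
# Automake-internal variable and may break with future Automake releases.
libfoo_la_LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC \
	$(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=link \
	$(CCLD) $(AM_CFLAGS) $(CFLAGS) $(libfoo_la_LDFLAGS) $(LDFLAGS) -o $@
```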


It does seem important to link with the C++ compiler when depending on 
C++ libraries, but modern Linux apparently works "fine" if you don't.



 I have a continuing problem that when libtool tests

bug-libt...@gnu.org for that one, I think ... --best, karl.
There is already a bug open in libtool for the issue of dropping -lm 
when mixing C and C++.  It is not clear that the C++ linker provides 
-lm by default, since it seemed to be pulled in by implicit 
dependencies of the library's implementation.


Bob





C library promoted to C++ linkage due to optional C++ source module

2024-03-09 Thread Bob Friesenhahn
GraphicsMagick (which is primarily C code) supports optional linkage 
with some C++ libraries.  It is my opinion that if a library or program 
is linked with C++ libraries (and especially if it is a static build!) 
that it should be linked using the C++ linker.  Likewise, if a library 
or program does not depend on, or contain any C++ code, it should be 
linked with the C linker.


GraphicsMagick has several major build options, which include putting 
everything in its major library or separating certain code into 
loadable modules.  Automake conditionals are used to support the 
several major build options.  When loadable modules are used, the C++ 
dependency moves to the loadable modules and away from the major 
library.


The method that I am using to coax Automake into linking using the C++ 
linker is to include a technically empty 'acpplus.cpp' file as a 
component when building modules, libraries, or an executable.  This 
approach works, but it has a problem.  The problem is that if Automake 
sees that a C++ source file may be *optionally* included in a target, it 
promotes the linkage for that target to C++.
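A minimal Makefile.am sketch of the approach (the conditional and file 
names are illustrative), showing where the promotion happens:

```makefile
# Hypothetical Makefile.am fragment: an empty C++ translation unit is
# added only when C++ libraries are linked in.
if WITH_CXX_DEPS
libfoo_la_SOURCES += acpplus.cpp
endif
# Even when WITH_CXX_DEPS is false, Automake emits C++ link rules for
# libfoo.la, because acpplus.cpp *might* be among its sources.
```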


In my builds I see a library of C files being linked together with the 
C++ linker, even though no C++ code was actually built!


Is there a better solution or fix for this problem?

I have a continuing problem: when libtool tests which libraries are 
supplied automatically by the C++ linker, it detects that the C++ 
linker provides -lm, and thus it strips -lm from the library 
dependencies stored in the libtool .la file, causing static linkage of 
C programs against that .la file to fail. This means that when the 
library linked with libtool's C++ mode is installed, C programs will 
fail to link with it by default. This issue has already been reported 
as a bug to the libtool project.


Mixing C and C++ files together in a project in conjunction with libtool 
seems very hard!


Bob







Re: Generating missing depfiles by an automake based makefile

2023-02-10 Thread Bob Friesenhahn

On Fri, 10 Feb 2023, Edward Welbourne wrote:


Dmitry Goncharov (10 February 2023 00:24) wrote:

When a depfile is missing (for any reason) the current automake
makefile creates a dummy depfile.


This seems misguided.
Better to

 include $(wildcard $(DEPFILES))


This sounds like a GNU make feature.  Automake-generated Makefiles do 
not rely on GNU make.


Bob



Re: How to speed up 'automake'

2022-05-02 Thread Bob Friesenhahn

On Mon, 2 May 2022, Jan Engelhardt wrote:


Indeed, if a source code package consists of 10k files, then configure
produces another 10k files for the stuff in the ".deps" directories.
There is not much autotooling can do here, as I believe pregenerating
those 10k files all with "# dummy" content is to support the least common
denominator of /usr/bin/make.


In a configure script I see the option:

  --enable-dependency-tracking
  do not reject slow dependency extractors
  --disable-dependency-tracking
  speeds up one-time build

so what tangible benefits does --disable-dependency-tracking actually 
provide?


If empty files are ok (assuming they are needed at all), can they be 
produced with a minimum number of executions of a 'touch' command?
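On the batching question, a sketch (with made-up depfile names) of how 
many empty files can be created with only a handful of process spawns, 
assuming empty files would be acceptable at all:

```shell
# Create many dummy depfiles cheaply: xargs batches as many paths as
# fit into each touch invocation instead of one touch per file.
# (The .deps layout and .Po names are illustrative.)
mkdir -p .deps
printf '.deps/src%d.Po\n' $(seq 1 500) | xargs touch
```

Note that the depfiles Automake currently generates contain "# dummy" 
rather than being empty, so whether truly empty files suffice is the 
open question above.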


Bob



Re: Should Autotools integrate with VCS ignore files?

2022-04-11 Thread Bob Friesenhahn

On Mon, 11 Apr 2022, Горбешко Богдан wrote:


So, should Autotools be improved to detect a known VCS being used and add 
ignore lines for it automatically? Many modern project generators and build 
systems, including Meson and create-react-app, do that to some extent, but I 
doubt if this is actually in the scope of Autotools and follows its ideas.


The decision of which files to commit to the VCS varies 
project-by-project.  Some projects commit all generated files, some 
projects commit certain types of generated files (e.g. omitting ones 
due to Autotools), and others avoid committing any generated content.


If Automake were to provide such a facility, it should be under 
control of the user and/or the project and not completely automatic. 
For example, there could be a Make target which produces the ignore 
file.


The tools would need to work given manually generated ignore content.

Generated documentation is a special class of file which needs to be 
considered since not all recipients of the repository will be able to 
deal with it.  Generated documentation may depend on particular 
software which needs to be at a particular version or configured a 
particular way.


Automake produces a Makefile.  I don't think Automake is specifically 
aware of the purpose of many of the rules which create/update files in 
the source tree (assuming building outside of the source tree) and 
sometimes it is not aware at all due to simply passing through 
user-provided Make fragments.


Lastly, sometimes the files from one VCS are stored within another 
(e.g. git in Mercurial) and the tool would need to know the file and 
the desired syntax (e.g. glob vs regex).


Bob


Re: Need better release validation documentation/strategy.

2022-04-09 Thread Bob Friesenhahn

On Fri, 8 Apr 2022, ckeader wrote:


The key server network as we knew it is dead and buried, and I would not
expect any of them to provide complete or indeed reliable information.
This article explains why:
https://gist.github.com/rjhansen/67ab921ffb4084c865b3618d6955275f.
There was some discussion at the time over on gnupg-users also.


This was fascinating reading, and I was not aware of any of it before. 
Unfortunately, I have not figured out how to follow its advice yet.


Everything related to OpenPGP is extremely obtuse with massive amounts 
of documentation.


OpenSSH 8 and later offer a facility which allows validating a file's 
origin and integrity given a certificate (see 
https://www.agwa.name/blog/post/ssh_signatures). I gave this a try and 
it was remarkably simple.  It is several orders of magnitude less 
complex than OpenPGP, and many people use OpenSSH.  Unfortunately, not 
all systems have OpenSSH 8 yet (or will ever have OpenSSH).  Another 
issue is that users could be confused by ".sig" files and won't know 
whether to validate them with OpenSSH or gpg without looking at the 
content.
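A minimal sketch of that OpenSSH signature workflow (the key and file 
names here are illustrative):

```shell
# Generate a signing key (no passphrase, for the demo only).
ssh-keygen -t ed25519 -N '' -f release_key -q

printf 'release data\n' > release.tar

# Sign: produces release.tar.sig; "file" is the conventional
# namespace for file signing.
ssh-keygen -Y sign -f release_key -n file release.tar

# Verifiers need an allowed_signers list mapping an identity to the key.
printf 'releases@example.org %s\n' \
  "$(cut -d' ' -f1,2 release_key.pub)" > allowed_signers

# Verify the detached signature against the file contents.
ssh-keygen -Y verify -f allowed_signers -I releases@example.org -n file \
  -s release.tar.sig < release.tar
```

The verifier only needs the small allowed_signers file and a stock 
ssh-keygen, with no keyserver infrastructure involved.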



Providing the signer's pub keys on a (secured) web site seems to be the
best option for now.


I have been using several mechanisms, including an insecure URL link 
as is shown in my email signature text.



An important question has not been asked yet, IMHO - why are maintainers
using this relatively obscure method for hashing?


Yes, this is very obscure and it defeats the purpose, which should be 
to encourage verification.


Bob



Re: Need better release validation documentation/strategy.

2022-04-08 Thread Bob Friesenhahn

On Fri, 8 Apr 2022, Jim Meyering wrote:


On Fri, Apr 8, 2022 at 6:30 AM Bob Friesenhahn
 wrote:

Today I saw an announcement for a new version of gzip.  It provided
lots of data for how to verify the downloaded tarballs.  I recently
saw a very similar announcement for a new version of libtool. I am not
sure where the template of this announcement text is coming from, and
if anyone has validated that recipients will be able to make sense of
it.

The problem is that the advice in the announcements regarding use of
'gpg' just doesn't work (commands fail),


That was my mistake, and I fixed it last night. We updated the
generated and recommended gpg-key-fetching command to be a wget
command that fetches from savannah. I presumed that it would just
work, but that was not true. I fixed it by adding my gpg key in the
"public information" section of each project for which I'm already
listed as an authorized uploader.


For some reason key servers are not very helpful these days, with some 
of them exhibiting distorted behavior or appearing to be severely 
overloaded.  It may be that the key server used by default by Ubuntu 
Linux imposes additional limitations, such as restrictions on exposing 
email addresses.  The user might need to know how to specify a 
different server, and which ones to use.


This one failed:

% gpg --locate-external-key j...@meyering.net
gpg: error retrieving 'j...@meyering.net' via WKD: General error
gpg: error reading key: General error

and this one confusingly did not succeed:

% gpg --recv-keys 7FD9FCCB000B
gpg: key 7FD9FCCB000B: new key but contains no user ID - skipped
gpg: Total number processed: 1
gpg:   w/o user IDs: 1

% gpg --verify gzip-1.12.tar.xz.sig
gpg: assuming signed data in 'gzip-1.12.tar.xz'
gpg: Signature made Thu 07 Apr 2022 11:59:54 AM CDT
gpg: using RSA key 155D3FC500C834486D1EEA677FD9FCCB000B
gpg: Can't check signature: No public key

The next problem is this:

% sha256sum --version
sha256sum (GNU coreutils) 8.30
Copyright (C) 2018 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later 
<https://gnu.org/licenses/gpl.html>.

This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

Written by Ulrich Drepper, Scott Miller, and David Madore.

It is possible that newer versions of this utility do support the 
base64 output but this one does not appear to.  This is from Ubuntu 
20.04LTS, which is Ubuntu's current LTS offering.


For the recent libtool announcement, the gpg issues were not identical 
but it was also not possible to retrieve the key using the 
instructions provided.  The libtool maintainer tried and he was not 
able to get the instructions to work either.


It is definitely preferable to verify using gpg, so anything which 
makes this easier for users would be welcome.


I did post my mail to the Automake list since it seems that Automake 
may be able to help make some of these aspects better by providing 
helpful rules and boilerplate pertaining to signing files and 
verifying that it is possible to validate the signature.


For GraphicsMagick I added rules to Makefile.am so that if I am doing 
a "release" (a distcheck plus a few more steps) the build 
automatically signs the release file and generates sha256 sums.


Bob



Need better release validation documentation/strategy.

2022-04-08 Thread Bob Friesenhahn
Today I saw an announcement for a new version of gzip.  It provided 
lots of data for how to verify the downloaded tarballs.  I recently 
saw a very similar announcement for a new version of libtool. I am not 
sure where the template of this announcement text is coming from, and 
if anyone has validated that recipients will be able to make sense of 
it.


The problem is that the advice in the announcements regarding use of 
'gpg' just doesn't work (commands fail), and even the SHA256 checksum 
is described as "SHA256 checksum is base64 encoded", which I had 
previously only seen from the BSD-oriented OpenSSH project, which 
might be using a BSD tool that produces such checksums.


It seems like Automake and GNU in general should be trying to help 
with producing releases and release announcements which assist users 
with verifying the release tarballs rather than just leaving them 
royally confused.


If ordinary people are not able to use the data provided with the 
release announcement, then they will not be validating the tarballs 
that they run across.  Download statistics suggest that the vast 
majority of source-code tarball downloads are not being validated at 
all.


If 'gpg' commands are provided, then they should be able to work by 
default on popular OS platforms.  Likewise, if a SHA256 checksum is 
provided and something new like "SHA256 checksum is base64 encoded", 
then instructions should be provided for how to use mature GNU tools 
(and/or popular non-GNU tools) to reproduce such a checksum.


While I was able to figure out how to use a combination of openssl and 
base64 to create matching SHA256 checksums, I doubt that most people 
would be willing to spend a half hour researching and figuring out how 
to do this.  I was not able to figure out how to produce a similar 
SHA256 checksum using the GNU software provided by the OS I am using.
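A sketch of the openssl + base64 combination (the file name is 
illustrative, and this is my reconstruction rather than necessarily 
the exact commands used):

```shell
# Reproducing a base64-encoded SHA256 checksum, the form used in some
# GNU release announcements, with openssl + base64.
printf 'hello\n' > sample.tar

# Hex form, as printed by GNU sha256sum:
sha256sum sample.tar

# Base64 form: compute the raw digest bytes, then base64-encode them.
openssl dgst -sha256 -binary sample.tar | base64
```

Both forms encode the same 32 digest bytes, so decoding the base64 
string yields exactly the hex digits sha256sum prints.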


I am not sure who the target audience is for GNU releases these days, 
but if it is not normal people who are still willing to compile 
software from source code on popular systems such as GNU/Linux, then 
there is a problem.


Bob



Re: Wrong order of preprocessor and compiler flags

2022-03-27 Thread Bob Friesenhahn

On Mon, 28 Mar 2022, Jan Engelhardt wrote:


I went to the GNU make git repo to check on CPPFLAGS; it appeared first in
documentation rather than source (which seems like a history import mishap),
but even back then in '94, the documentation was inconsistent, sometimes
providing example descriptions where CPPFLAGS comes after CFLAGS/FFLAGS/etc.,
and sometimes reversed.


I think that this is because it was always assumed that the order does 
not matter.


Bob



Re: portability of xargs

2022-02-15 Thread Bob Friesenhahn

On Tue, 15 Feb 2022, Paul Smith wrote:


On Tue, 2022-02-15 at 15:37 -0600, Bob Friesenhahn wrote:

I have been told by several people (who have much more self-esteem
than me) that a build tool called 'cmake' is far more portable than
Autotools so maybe we should make support for 32 year old systems
cmake's responsibility?


That is not accurate.  Or at least, cmake uses a much different
definition of "portable" than autoconf / automake.  Certainly cmake
cannot possibly support 32-year old systems (if that's needed).


Thanks for the nice response.  I did not for a minute think that 
what I was told was accurate.


The people who tell me it is more portable are very interested in 
targeting Microsoft Windows.



cmake is a tool that, given an input definition of build dependencies,
can generate build control files for multiple different types of build
tools.  It can generate makefiles, Ninja files, Xcode project files,
and Visual Studio project files.  Maybe others, I'm not sure.


The "Makefiles" that Cmake generates are self-referential in that 
almost all rules invoke Cmake to do the work.



The main idea of autoconf is that it allows configuring the package for
systems that the author never even heard of, much less has access to
(as long as it has a POSIX interface).  cmake only tries to generate
output files for systems it has been developed for (for example, each
new version of Visual Studio often requires a new release of cmake to
support it).


The above is a perfect description of the situation.


Of course maybe autoconf is not necessary anymore: I don't know about
that.  I do know that even though GNU make itself is relatively simple,


I find that the function of Autoconf is quite useful.  When compared 
with Cmake, it is much more user-friendly since the configure script 
responds to --help and the help output is very helpful.  The configure 
script nicely tells me what it is doing and what it found.  Autotools 
is very structured and projects work in a common way whereas I find 
that Cmake projects are heavily customized with options added by the 
project developer.  Just doing a build entirely outside of the source 
tree is extremely clumsy with Cmake.


Regardless, I don't think that Autotools developers should waste time 
on assuring compatibility with systems which are no longer in active 
use.  It is much better to spend time improving areas where Autotools 
is weak.


If 'xargs' has worked consistently for 20 years, it should be ok to 
use.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: portability of xargs

2022-02-15 Thread Bob Friesenhahn
Is it really expected that Autotools should support 32 year old 
systems?


This feels counter-productive to me.

I have been told by several people (who have much more self-esteem 
than me) that a build tool called 'cmake' is far more portable than 
Autotools so maybe we should make support for 32 year old systems 
cmake's responsibility?


I am fond of systems from the early/mid '90s but it seems better to 
support POSIX compliant systems with newer software.


People who have somehow figured out how to keep old hardware running 
without disk drives, electrolytic capacitors, or fans, can install 
older GNU software first in order to bootstrap to a sufficient level 
of POSIX compliance.


The well-built systems I bought prior to 2007 are already dead or 
difficult to repair.


I do not see any reason to spend any time at all supporting an OS 
older than 2008.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: portability of xargs

2022-02-15 Thread Bob Friesenhahn

On Mon, 14 Feb 2022, Mike Frysinger wrote:


On 14 Feb 2022 19:53, Paul Eggert wrote:

On 2/14/22 19:45, Mike Frysinger wrote:

how portable is xargs ?


It can be a porting problem, unfortunately. There are several corner
cases that various implementations don't get right. I expect this is why
the GNU Coding Standards exclude xargs from the list of programs that
'configure' and Makefile rules can use.


are the corner cases known ?  if it's such that xargs doesn't always correctly
limit itself to the system limit based on other factors, i can live with that
assuming that the -n option is reliable.


This morning I read this discussion thread and did not see any actual 
problems with current (on systems that people are still using) xargs 
portability mentioned.


Microsoft Windows (e.g. POSIX environments which run under Windows 
such as Cygwin, MSYS, etc.) is surely an existing corner case which 
demands more attention than archaic systems.  It seems that the 
command line length when using Microsoft Windows may depend on the 
number of bytes in the arguments (not the number of arguments) as well 
as the number of bytes in the current environment variable data.


A problem with xargs is that without using the GNU -0 (or --null) 
option and null-terminated arguments, file names containing spaces 
won't be handled properly.  File names containing spaces are an issue 
for Autotools in general.  This is again an issue under Microsoft 
Windows, where users are typically provided with directory paths which 
contain a space and need to take additional administrative 
measures in order to provide directory paths which work with 
Autotools.
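As a minimal illustration (the demo path is hypothetical; `-print0`/`-0` are widely supported by GNU and BSD xargs but were historically not universally portable):

```shell
# A file name with spaces defeats whitespace-splitting xargs, but
# survives a NUL-terminated handoff (demo path is hypothetical).
mkdir -p /tmp/xargs-demo
touch '/tmp/xargs-demo/file with spaces.txt'

# Broken: the single name is split into three separate arguments.
find /tmp/xargs-demo -name '*.txt' | xargs ls >/dev/null 2>&1 \
  || echo "plain xargs mangled the name"

# Works: -print0/-0 pass the name through intact.
find /tmp/xargs-demo -name '*.txt' -print0 | xargs -0 rm
```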


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



bug#10828: rm -f # no more args failure?

2021-12-14 Thread Bob Friesenhahn

On Mon, 13 Dec 2021, Karl Berry wrote:


   bf> I thought that systems deriving from OpenSolaris (e.g. Illumos,
   ...
   However, testing in an empty directory on a system without the
   updated ksh93 this looks ok to me:

Bob, what you wrote before (approx. a year ago) is here:
https://debbugs.gnu.org/cgi/bugreport.cgi?bug=42529

Ending with:
 Behavior of ksh93 (which has a bug) appears to depend on the PATH
 setting so it will behave differently if /usr/xpg4/bin or
 /usr/xpg6/bin appear in the path.

Maybe that is why you didn't see the behavior just now?


Yes, that must be the case.  I updated my PATH to avoid the behavior.


In any case, if a broken "rm -f" still exists in a free system as late
as last year, it seems premature to me to force this change now.


It does still exist in many deployed systems which have not yet been 
updated.  In a year or two there will be fewer systems which lack the 
updated ksh93.



I grant Moritz's point that the ubiquitous "test -z ... || rm ..."
adds noise when trying to understand Automake recipes, but IMHO that is
not enough reason to induce this incompatibility.


The cost is more a matter of lost execution time due to the fork/exec 
than annoying noise.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt






Re: rm -f # no more args failure?

2021-12-12 Thread Bob Friesenhahn

On Sun, 12 Dec 2021, Karl Berry wrote:


Does anyone here use or know of an active system where plain
 rm -f
with no arguments fails? I mean, exits with bad status?


I thought that systems deriving from OpenSolaris (e.g. Illumos, 
OpenIndiana, etc.) had that issue until ksh93 was fixed recently. 
However, testing in an empty directory on a system without the 
updated ksh93 this looks ok to me:


$ rm -f ; echo $?
0
$ rm -f * ; echo $?
0

so the issue that Automake encountered before must not be exactly 
that.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Automake for RISC-V

2021-11-21 Thread Bob Friesenhahn

On Sun, 21 Nov 2021, Billa Surendra wrote:


With these steps I am not able to install. Error message is "aclocal-1-15
not found".


This means that Autotools-generated files (Autotools being a 
collection of packages, one of which is Automake) are being 
regenerated for some reason.  In this case, it is having a problem 
because either Automake (which provides aclocal) is not installed, or 
the installed version is not the same as the one used to produce the 
existing generated files in the source tree.


If using a release tarball as the source, then this will only happen 
if the build rules specifically request it, or if the underlying 
filesystem does not replicate file timestamps correctly.


When dealing with other existing build systems, it is not uncommon to 
find added build rules to re-autotool the source tree in order to pick 
up local modifications.


Bob



On Sun, 21 Nov, 2021, 5:38 pm Peter Johansson,  wrote:



On 21/11/21 16:32, Billa Surendra wrote:

Then how-to install automake in target image.



Download the tarball from https://ftp.gnu.org/gnu/automake/

unpack and follow the instructions in INSTALL; typically somethings like

./configure

make

sudo make install


Peter






--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Automake for RISC-V

2021-11-19 Thread Bob Friesenhahn

On Thu, 18 Nov 2021, Billa Surendra wrote:


Dear All,

I have cross-compiled Automake-1.16.2 package with RISC-V cross compiler,
but when I am executing binaries on RISC-V target OS image its gives errors
like "not found".


Is there a reason why you need to "cross compile" Automake for the 
target rather than add the few prerequisites to the target and 
natively "compile" Automake there?  The prerequisites needed to 
actually execute the "compiled" Automake on the target are similar to 
what is needed to cross-compile it, so I am not seeing a win in trying 
to cross-compile a script-based build tool.


If you had a working cross-compiler for RISC-V then you could compile 
the software for the target (e.g. using Automake) without needing to 
add build tools to it.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: `make dist` fails with current git

2021-10-13 Thread Bob Friesenhahn

On Wed, 13 Oct 2021, Zack Weinberg wrote:


Looks like some kind of problem with automatic ChangeLog generation?


To me this appears to be the result of skipping an important step in 
what should be the process.  It seems that there should be a 
requirement that 'make distcheck' succeed prior to a commit.  Then 
this issue would not have happened.


Of course 'make distcheck' takes a long time, and it is possible that 
for Automake the results depend partially on what has already been 
committed and not just on what is in the current source tree.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Project "calf-studio" fails with Warning: Linking the executable calfmakerdf against the loadable module

2021-08-27 Thread Bob Friesenhahn

On Fri, 27 Aug 2021, Yuri wrote:


The Calf Studio project fails with these messages:

|*** Warning: Linking the executable calfmakerdf against the loadable module 
*** calf.so is not portable! and subsequent "undefined reference to" symbols 
in this library. The GitHub issue


Most Unix-type systems put 'lib' in front of a library's name (e.g. 
libcalf.so), and that prefix is what compilers/linkers expect when 
linking against it.


There may be more issues than this.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



RE: Future plans for Autotools

2021-06-23 Thread Bob Friesenhahn

On Thu, 13 May 2021, FOURNIER Yvan wrote:


Hi Karl,

Regarding the possible addition of a libtool option to ignore .la 
files, I would need to take a deeper look into how libtool works (I 
have only scratched the surface and experimented with it as a "black 
box" so far, but If I do get around to it, I will definitely send a 
patch or post it here).


It would be useful to know why .la files are "incorrect".

The most common concern is that libtool links using full dependencies 
(evaluated using information from the .la files) rather than relying 
on a target's implicit dependency support.  This means that libraries 
may be "over-linked" and thus record undesirable (e.g. to the 
distribution maintainers) dependency information.


If recorded dependency information in a .la file becomes incorrect 
because an implicit dependency was re-built, or an alternate 
implementation library was substituted, then that creates a problem.


Libtool is first and foremost a portability tool.  It also helps 
support static linkage.


Solving the library dependency issue properly (while not relying on 
non-portable implicit dependency support or 3rd party software) might 
require more controls/knobs for libtool as well as an update to the 
.la file format so it is clear which dependencies are direct, and 
which are simply dependencies of the direct dependencies.


If the ability to link statically is not important, or if only 
using/supporting GNU/Linux with GCC, then libtool is not needed.


GCC itself knows how to create shared libraries on popular targets.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: parallel build issues

2021-06-21 Thread Bob Friesenhahn

On Mon, 21 Jun 2021, Werner LEMBERG wrote:


Now I want to do better :-) However, the problem persists:
`ttfautohintGUI` gets built in the `frontend` directory, and there is
a dependency on the binary in the `doc` directory.  Is there a clean
solution for that?


A non-recursive build (a single Makefile for everything) is quite 
valuable and avoids such problems.  However, it requires more care to 
construct it.


It may feel more efficient as a developer to cd to a subdirectory in 
order to execute just part of a build, but in my experience this is 
usually a false savings due to a recursive build being more wasteful. 
It is possible to build just part of the software using a 
non-recursive build but it may be confusing unless you add dummy 
target names to make things easier.


It is possible to insert additional dependency lines in Makefile.am so 
software is always built in the desired order, but this approach might 
only work if you always build using the top level Makefile.
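A hypothetical Makefile.am fragment for the ttfautohint case described above (file names assumed, not taken from the actual project), showing such an added dependency line:

```make
# Hypothetical extra dependency: the documentation target is only
# rebuilt after the GUI binary it invokes has been built.
doc/manual.html: frontend/ttfautohintGUI$(EXEEXT)
```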


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Future plans for Autotools

2021-05-06 Thread Bob Friesenhahn

On Thu, 6 May 2021, Andy Tai wrote:


a general question:  would a rewrite in Python or some other language,
to keep the same functionality as the current implementation, a viable
goal, or that would not be a productive thing to do?


There are several major aspects of Automake.  One is the use of Perl 
scripts to transform a user-provided high-level description into a 
portable Makefile.  Another is the design and syntax of the Makefile 
itself.  There are also m4 Autoconf macros to integrate Automake into 
Autoconf scripts.


Supporting a feature such as two targets from the same source file 
apparently is aided by depending on GNU Make features so the result 
would no longer be a portable Makefile.  It has yet to be decided to 
require use of GNU Make.


Perl is not necessarily more difficult to read and understand than 
Python but some popular Perl syntax may be very difficult for new 
users to understand whereas Python mostly avoids such syntax.


Regardless, the current major problem is not how difficult the 
Automake software is to maintain.  The major problem is that there is 
no one fully dedicated to maintaining it.  All work appears to be done 
on an intermittent part-time basis.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Future plans for Autotools

2021-05-06 Thread Bob Friesenhahn

On Thu, 6 May 2021, Karl Berry wrote:


(*) https://lists.gnu.org/archive/html/automake/2021-03/msg00018.html
So far the response has been nil.


I don't recall seeing that email.  I did see an email thread regarding 
Autoconf which immediately became a lot of "need to support this soon" 
and "wouldn't it be nice" discussion, totally missing the point of the 
text they were responding to (i.e. "project is on the gurney and may 
expire without help").


Regardless, "baby bird" syndrome seems to be striking a great many 
established free software projects which are mature and heavily used.


Projects operated by billion-dollar companies, with teams of 
developers paid to sit in cubicles and write free software, seem to be 
doing fine.


Unfortunately, this is not the model for most GNU projects.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Constantly changing libtool

2021-04-15 Thread Bob Friesenhahn

On Thu, 15 Apr 2021, Laurence Marks wrote:


In the same way as autoXZY sets up Makefiles in an OS independent fashion,
there should be a way to autoupdate autoXYZ files for each system without
user intervention. (I don't mean automake itself or similar, I do mean only
the local files.) In an ideal world this would be a macro in
configure.ac


When dealing with updated software there might not be any viable 
approach other than to install the various Autotools packages using a 
common installation prefix.


Sometimes executing

  autoreconf --install --force

using an existing Autotools installation is a viable solution.

Given that all of the software you are using is already released and 
sometimes very archaic, you are going to need to deal with the 
situation in front of you.  There is not going to be some new release 
which is somehow going to solve this issue to your satisfaction.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Constantly changing libtool

2021-04-15 Thread Bob Friesenhahn

On Wed, 14 Apr 2021, Laurence Marks wrote:


It is not timestamp issues, it is version issues -- e.g. libtool 2.4.2
versus 2.4.6 (for instance, those are just invented numbers). In many cases
aclocal, libtool, ltmain.sh and maybe a few others do not work if they were
created with a different autoXYZ version than is on the computer where they
are being installed. It seems to be "special" to libtool.


There has not been a libtool release since 2014 and it is 2021 
already.


The issues you are complaining about should not have existed for quite 
a long time now, and they are easily corrected by installing 
consistent Autotools versions under the same installation prefix.  If 
you don't like the archaic versions your system provides, then simply 
uninstall (or do not use) those packages and install current Autotools 
release versions.


It is intended that libtool components are installed into the software 
which uses them.  This assures consistent versions.  If the libtool 
components were not installed and distributed with the package, then 
there could be problems as you describe.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Constantly changing libtool

2021-04-14 Thread Bob Friesenhahn

On Wed, 14 Apr 2021, Laurence Marks wrote:


I have some software which has been "fine" for about 15 years, with one
exception: every 6-12 months or so when I (or students/others) compile it
we run into version problems. The typical ones are version mismatch between
libtool and/or aclocal (called from libtool). While I can futze around with
libtoolize/aclocal/autoconf/automake etc until it works, this is not viable
for users who barely understand linux.

Is there some way in configure.ac (Makefile.am) to setup so the relevant
version fixes can be done automatically? There are so many posts about
version mismatches that this should be there. Maybe it is and I just missed
it.


Most problems seem to stem from packages providing the generated files 
from Autoconf, Automake, and libtool so that end users don't need to 
have the tools available.  This would be combined with file time-stamp 
issues.  Many source control systems do not preserve file time-stamps.


If the operating system distribution (or some other source) provides 
versions which differ from what the package provided, then there will 
be a problem.


One thing that can help quite a lot is to include AM_MAINTAINER_MODE 
in configure.ac so that the user needs to add the 
--enable-maintainer-mode option in order to enable self-maintenance 
rules.  If the user does not provide the option, then the existing 
files will be used, which is usually just fine unless a file has 
actually been modified.
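A configure.ac sketch of this setup (package name and version are hypothetical; AM_MAINTAINER_MODE is the real Automake macro):

```m4
# With AM_MAINTAINER_MODE, the rules that regenerate configure,
# Makefile.in, aclocal.m4, etc. are disabled unless the user passes
# --enable-maintainer-mode to configure.
AC_INIT([demo], [1.0])
AM_INIT_AUTOMAKE([foreign])
AM_MAINTAINER_MODE
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```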


It is sometimes said that 'autoreconf --force' will solve all 
problems, but in my experience this is not always true.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: How to autotools figure what library need to be linked to?

2021-03-30 Thread Bob Friesenhahn

On Tue, 30 Mar 2021, Peng Yu wrote:


I am confused about what belongs to autoconf and what belongs to automake.

For the usually configure -> make -> make install, what belongs to
autoconf what belongs to automake?


It is reasonable to be confused about this unless one reads the 
documentation.  Most elements used to prepare the configure script are 
from Autoconf (https://www.gnu.org/software/autoconf/manual/) but some 
parts pertaining to production (and maintenance) of the 
Autoconf/Automake/Libtool parts are handled by Autoconf macros from 
Automake (https://www.gnu.org/software/automake/manual/).


It is definitely worth reading some of the documentation.  I see that 
section 2 of the Autoconf documentation specifically answers your 
question.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: How to autotools figure what library need to be linked to?

2021-03-30 Thread Bob Friesenhahn

On Mon, 29 Mar 2021, Peng Yu wrote:


Hi,

crypt(3) does not need -lcrypt on macOS, but need -lcrypt on Linux.
How does autotools determine what library is need to be linked based
on the source code? Thanks.


This is really an Autoconf (configure script) issue and not an 
Automake issue.  The common approach using Autoconf is to test various 
known permutations (starting with nothing, and then adding crypt) 
until linking succeeds.


I expect that there is already a macro from the Autoconf macro archive 
which handles this case.
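The stock Autoconf macro AC_SEARCH_LIBS already implements this try-nothing-then-try-crypt strategy; a configure.ac sketch (the error message is illustrative):

```m4
# Try linking crypt() with no extra library first, then with -lcrypt;
# on success the needed -l flag (if any) is prepended to LIBS.
AC_SEARCH_LIBS([crypt], [crypt],
  [], [AC_MSG_ERROR([could not find a working crypt()])])
```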


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: config.sub/config.guess using nonportable $(...) substitutions

2021-03-09 Thread Bob Friesenhahn

On Mon, 8 Mar 2021, Paul Eggert wrote:

Except maybe for Solaris 10, shells that don't grok $(...) are museum pieces 
now. And anybody on Solaris 10 (which occasionally includes me, as my 
department still uses Solaris 10 on some machines) can easily run a better 
shell like /bin/ksh. It's a bit of a maintenance hassle for Ben to deal with


The script itself does start with requesting /bin/sh.  The 
config.guess script is also used in contexts outside of Autotools, 
although such usage can likely specify the shell to use when executing 
the script.


At some point, failing to support $(...) is in the same ballpark as failing 
to support "#". I can see Ben's point of view that we've reached that point 
even if I would have waited another three years, so if Ben would rather use 
$(...) I'd rather not overrule him downstream.


It seems that config.guess and Autotools packages are picking winners 
and losers.  It is not clear where the bar has been set.  When such 
decisions are made, it is useful if messaging about it is very clear.


If any component of a build infrastructure is not portable to an older 
system, then all of the components (including each dependent 
application's own configure.ac script) might as well be updated to use 
the less-portable yet more convenient modern syntax and this implies 
the ability to safely use other modern syntax as well.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: config.sub/config.guess using nonportable $(...) substitutions

2021-03-08 Thread Bob Friesenhahn

On Mon, 8 Mar 2021, Nick Bowler wrote:


These scripts using $(...) are incorporated into the recently-released
Automake 1.16.3, which means they get copied into packages bootstrapped
with this version.  So now, if I create a package using the latest bits,
configuring with heirloom-sh fails:


This is bad and has immediate impact (and now already existing due to 
including in releases) to some projects.  Due to Autotools sometimes 
taking a long time between releases, some projects (including two I 
make releases for) update these scripts as part of their normal 
release process.


It is assumed that the scripts will remain portable to almost any 
system.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Automake's file locking

2021-02-03 Thread Bob Friesenhahn

On Wed, 3 Feb 2021, Zack Weinberg wrote:

Therefore I like the idea of merely relying on the atomicity of
file creation / file rename operations.

These files should reside inside the autom4te.cache directory. I would
not like to change all my scripts and Makefiles that do
  rm -rf autom4te.cache


Agreed.  The approach I'm currently considering is: with the
implementation of the new locking protocol, autom4te will create a
subdirectory of autom4te.cache named after its own version number, and
work only in that directory (thus preventing different versions of
autom4te from tripping over each other).  Each request will be somehow
reduced to a strong hash and given a directory named after the hash
value.  The existence of this directory signals that an autom4te
process is working on a request, and the presence of 'request',
'output', and 'traces' files in that directory signals that the cache
for that request is valid.  If the directory for a request exists but
the output files don't, autom4te will busy-wait for up to some
definite timeout before stealing the lock and starting to work on that
request itself.


This seems like a good approach to me.

There is substantially less danger from independent reconfs (on the 
same or different hosts) than there is from parallel jobs in the 
current build deciding that something should be done and trying to do 
it at the same time.


GNU make does have a way to declare that a target (or multiple 
targets) is not safe for parallel use.  This is done via a 
'.NOTPARALLEL: target' type declaration.
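A GNU make fragment sketching that declaration (the target and recipe are hypothetical; .NOTPARALLEL with prerequisites needs GNU make 4.4 or later, where it causes the listed target's prerequisites to be built serially):

```make
# Serialize the prerequisites of aclocal.m4 even under 'make -j'.
.NOTPARALLEL: aclocal.m4

aclocal.m4: configure.ac m4/extra.m4
	aclocal -I m4
```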


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-28 Thread Bob Friesenhahn

On Thu, 28 Jan 2021, Nick Bowler wrote:


If I understand correctly the issue at hand is multiple concurrent
rebuild rules, from a single parallel make implementation, are each
invoking autom4te concurrently and since file locking didn't work,
they clobber each other and things go wrong.


That is what would happen, but what currently happens is if the file 
locking does not work and a parallel build is used, then Autotools 
reports a hard error:


CDPATH="${ZSH_VERSION+.}:" && cd /home/bfriesen/src/graphics/GM && 
/bin/sh '/home/bfriesen/src/graphics/GM/config/missing' aclocal-1.16 
-I m4
autom4te: cannot lock autom4te.cache/requests with mode 2: Invalid 
argument

autom4te: forgo "make -j" or use a file system that supports locks
aclocal-1.16: error: autom4te failed with exit status: 1
gmake: *** [Makefile:4908: /home/bfriesen/src/graphics/GM/aclocal.m4] 
Error 1


In my case there is only one active developer so there would not be 
actual corruption.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-28 Thread Bob Friesenhahn

On Thu, 28 Jan 2021, Zack Weinberg wrote:


Do you use different versions of autoconf and/or automake on the
different clients?


No.  That would not make sense.  If a client is not suitably prepared, 
then I don't enable maintainer mode.



The lock appears to be taken speculatively since it is taken before
Autotools checks that there is something to do.

...

The most common case is that there is nothing for Autotools to do
since the user is most often doing a 'make' for some other purpose.


It looks to me like the lock is taken at exactly the point where
autom4te decides that it *does* have something to do. It might be


Perhaps this experience is a side effect of my recent experience 
(regarding AC_INIT and versioning) and not the normal case.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-28 Thread Bob Friesenhahn

On Thu, 28 Jan 2021, Zack Weinberg wrote:


The main reason I can think of, not to do this, is that it would make
the locking strategy incompatible with that used by older autom4te;
this could come up, for instance, if you’ve got your source directory
on NFS and you’re building on two different clients in two different
build directories.  On the other hand, this kind of version skew is
going to cause problems anyway when they fight over who gets to write
generated scripts to the source directory, so maybe it would be ok to
declare “don’t do that” and move on.  What do others think?


This is exactly what I do.  I keep the source files on a file server 
so that I can build on several different types of clients.  This used 
to even include Microsoft Windows clients using CIFS.


The lock appears to be taken speculatively since it is taken before 
Autotools checks that there is something to do.  It would be nicer if 
Autotools could check first if there is something to do, acquire the 
lock, check if there is still something to do, and then do the work.


The most common case is that there is nothing for Autotools to do 
since the user is most often doing a 'make' for some other purpose.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt


Re: Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-25 Thread Bob Friesenhahn

On Mon, 25 Jan 2021, Zack Weinberg wrote:


Automake "just" calls Perl's 'flock' built-in (see 'sub lock' in
Automake/XFile.pm) (this code is copied into Autoconf under the
Autom4te:: namespace).  It would be relatively straightforward to
teach it to try 'fcntl(F_SETLKW, ...)' if that fails.  Do you know
whether that would be sufficient?  If not, we may be up a creek, since
depending on CPAN modules is a non-starter.


I expect that it would be that "simple" except for of course 
everything involved with making sure that things are working properly 
for everyone.


It may be that moving forward to 'fcntl(F_SETLKW, ...)' by default and 
then falling back to legacy 'flock' would be best.  Or perhaps 
discarding use of legacy 'flock' entirely.
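As an illustration of the two primitives being discussed (this is only a
sketch, not Automake's Perl implementation; it assumes python3 is
available in order to reach both locking interfaces from a script):

```shell
#!/bin/sh
# Acquire and release the same file with both lock styles.
# flock() is what Automake's XFile::lock uses today; lockf() wraps
# POSIX fcntl(F_SETLKW) byte-range locking, which also works over NFS.
lockfile=$(mktemp)
python3 - "$lockfile" <<'EOF'
import fcntl, os, sys

fd = os.open(sys.argv[1], os.O_RDWR)
fcntl.flock(fd, fcntl.LOCK_EX)   # BSD-style whole-file flock
fcntl.flock(fd, fcntl.LOCK_UN)
fcntl.lockf(fd, fcntl.LOCK_EX)   # POSIX fcntl-based lock
fcntl.lockf(fd, fcntl.LOCK_UN)
os.close(fd)
print("both lock styles acquired")
EOF
rm -f "$lockfile"
```

On a local file system both calls succeed; the difference appears on
network file systems, where flock() may fail with an error such as the
"Invalid argument" reported earlier in this thread.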


Most likely the decision as to what to do was based on what was the 
oldest primitive supported at the time.  The GNU/Linux manual page 
says that "the flock() call first appeared in 4.2BSD".  It also says 
"Since Linux 2.6.12, NFS clients support flock() locks by emulating 
them as fcntl(2) byte-range locks on the entire file.".  There are a 
number of warnings in the manual page regarding the danger of mixing 
locking primitives.  It was never intended that flock() work over a 
network share.


It seems unlikely that Autotools development is going to be done on a 
system lacking POSIX locking since such a system would not be 
considered a usable system for most purposes.  If a project does not 
provide a 'maintainer mode' to stop maintainer rules from firing, then 
this could impact archaic targets from the early '90s.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Future plans for Autotools

2021-01-20 Thread Bob Friesenhahn

On Wed, 20 Jan 2021, Zack Weinberg wrote:


Now we've all had a while to recover from the long-awaited Autoconf
2.70 release, I'd like to start a conversation about where the
Autotools in general might be going in the future.  Clearly any future
development depends on finding people who will do the work, but before
we worry about that I think we might want to figure out what work we
_want_ to do.


Zack, your text was excellent and I agree with all of it.

Autotools is in great danger of becoming irrelevant, at least for new 
software development.  A lot of people feel hostile toward it.


It seems to me that Autoconf is too difficult to work on.  There is 
too much to become expert at in order for new volunteers to make an 
impact.  The same is true for libtool.


In my opinion, a new "language" designed specifically to meet the 
needs of Autoconf should be developed and Autoconf should be 
re-implemented using it.  There should be no more need to run external 
utilities like 'sed', or 'awk' or other things which can be 
implemented internally in a way which does not suffer from the white 
space and delimiter issues that shell script does.


It seems that the core precept that Automake should produce portable 
makefiles should be re-evaluated if most systems provide GNU make, or 
can easily be updated to have it.  There is a fork of Automake which 
was re-done to be based on GNU make.  This assumes that makefiles 
written for GNU make can noticeably improve things.


I like your idea of supporting other underlying build tools besides 
'make'.  Make's dependence on matching rules and file time stamps is 
problematic and it does not scale.  It is unfortunate that GNU 
produced a much more powerful 'make' tool (a paradigm invented in 
1976), but not a new build tool with fresh ideas to improve build 
quality and reduce build times on modern systems.


The macro definitions provided by Autoconf have been proven by the 
test of time and allow the underlying implementation to be changed. 
Only surrounding shell script might need to be changed if the 
underlying implementation changes.


The support for 'distcheck' is excellent.

Regardless, thanks for your ideas and the red alert.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Getting srcdir in script executed using m4_esyscmd in AC_INIT

2021-01-05 Thread Bob Friesenhahn

On Tue, 5 Jan 2021, Zack Weinberg wrote:



Something I found which surprised me is that Automake has added a GNU
COPYING file to my non-GNU non-GPL package.  Once the file appeared in
the source directory, it seems that it continued to be used and
included in spite of the Automake options provided.

While I see more detail sprinkled around the generated files, it seems
that this COPYING file may be the biggest issue.


Hmm.  If you remove COPYING, make sure that you have `foreign` in
either `AUTOMAKE_OPTIONS` or the options list passed to
`AM_INIT_AUTOMAKE`, and then re-run automake --add-missing, does
COPYING come back?  My understanding is that automake --add-missing is
supposed to install a copy of the GPL *only if* it's in `gnu` mode and
can't find any other statement of licensing.


The COPYING file did not come back.  I now notice that an INSTALL file 
was added as well.


As far as I am aware, at no point were the Automake options not 
expressed in some way (via configure.ac or Makefile.am) but I used 
Autotools automatic maintenance after editing files (via make) and 
perhaps they were added during a transition point.


Regardless, updating a project to use the recommended AC_INIT strategy 
is not without some danger since it is more rigid than the previous 
shell-variable based approach.  This is particularly the case for the 
project version and how the project version is applied when creating 
the tarball.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



bug#42529: configure: error: Your 'rm' program is bad, sorry.

2020-11-21 Thread Bob Friesenhahn

On Fri, 20 Nov 2020, Karl Berry wrote:


Hi Bob - (sorry for the delayed reply)

   On an OpenIndiana system without the GNU utilities
   ...
   Usage: rm [-cFdfirRuv] file ...

So, it seems that system's rm does not accept "rm -f" (with no file
operands)? (I don't have access to any such system to check.)  If so, it
seems the error message is correctly reporting the state of things,
including the possible workaround, so I don't see anything to do?
Closing this now. Feel free to reopen (or just reply) if there is
something more automake should be doing. Thanks for the report. -k


The "system's" rm in this case is ksh93's built-in rm emulation.

As we all know, the shell attempts to expand arguments before passing 
them to a command.


Behavior of ksh93 (which has a bug) appears to depend on the PATH 
setting so it will behave differently if /usr/xpg4/bin or 
/usr/xpg6/bin appear in the path.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt





Re: Warning category skew between Autoconf and Automake - workaround in autoreconf?

2020-09-10 Thread Bob Friesenhahn

On Thu, 10 Sep 2020, Zack Weinberg wrote:


Clearly ChannelDefs.pm should be brought back into sync between the
two projects, but that will only fix the problem after *both* Autoconf
and Automake do a release.  So I’m wondering whether it would make
sense to merge this distributor’s patch to avoid supplying -Wcross to
automake -- perhaps generalized to arbitrary warning categories.  What
do you think?


Has it ever been an expectation that Autoconf and Automake are of a 
similar vintage?  Even if a newer Automake is released, it is not 
required to be used.


It seems like the patch may be required.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt


bug#42529: configure: error: Your 'rm' program is bad, sorry.

2020-07-25 Thread Bob Friesenhahn
On an OpenIndiana system without the GNU utilities in the executable 
search path, configure of autoconf-2.69b fails spectacularly.  This 
failure was actually not unexpected since it is due to a ksh93 
built-in bug which was never fixed.  The problem does not occur if I 
do this (forcing use of bash, which is in the executable search path):


CONFIG_SHELL=/usr/bin/bash /home/bfriesen/src/gnu/autoconf-2.69b/configure

This is what actually happened:

weerd:~/build/autoconf-2.69b% /home/bfriesen/src/gnu/autoconf-2.69b/configure
checking for a BSD-compatible install... /usr/bin/ginstall -c
checking whether build environment is sane... yes
checking for a race-free mkdir -p... /usr/bin/gmkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
Usage: rm [-cFdfirRuv] file ...
Oops!

Your 'rm' program seems unable to run without file operands specified
on the command line, even when the '-f' option is present.  This is contrary
to the behaviour of most rm programs out there, and not conforming with
the upcoming POSIX standard: <http://austingroupbugs.net/view.php?id=542>

Please tell bug-automake@gnu.org about your system, including the value
of your $PATH and any error possibly output before this message.  This
can help us improve future automake versions.

Aborting the configuration process, to ensure you take notice of the issue.

You can download and install GNU coreutils to get an 'rm' implementation
that behaves properly: <http://www.gnu.org/software/coreutils/>.

If you want to complete the configuration process using your problematic
'rm' anyway, export the environment variable ACCEPT_INFERIOR_RM_PROGRAM
to "yes", and re-run configure.

configure: error: Your 'rm' program is bad, sorry.

--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt





Re: How to install data / lib in more than 1 place?

2019-12-10 Thread Bob Friesenhahn

On Tue, 10 Dec 2019, Georg-Johann Lay wrote:


Hi, thanks.  That was fast!


avrfoo_LIBRARIES = libfoo.a

avrbar_LIBRARIES = libbar.a


Will this also work with same file names? Like

avrfoo_LIBRARIES = libfoo.a

avrbar_LIBRARIES = libfoo.a

or would that confuse the tools?


I am not sure.  I would assume so, even though the documentation 
suggests support only for installing libraries to two standard 
locations.  The GNU coding standards (and Autotools) propose and 
support certain standards, but it is possible to go beyond those 
standards.


It should also be possible to provide different CFLAGS and CPPFLAGS 
for objects built for those libraries (e.g for different CPU targets), 
even if they start from the same source files.
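A hedged sketch of what such a Makefile.am fragment might look like
(directory names, library names, and the -mmcu flags are illustrative;
distinct library names sidestep any clash between identically named
objects, since per-target CFLAGS already give each library its own
object files):

```makefile
# Two custom install directories under $(prefix)/avr/lib
avrfoodir = $(prefix)/avr/lib/foo
avrbardir = $(prefix)/avr/lib/bar

avrfoo_LIBRARIES = libfoo.a
avrbar_LIBRARIES = libbar.a

# Same sources, different per-target flags (e.g. for different CPUs);
# Automake prefixes the objects (libfoo_a-dev.o vs libbar_a-dev.o)
libfoo_a_SOURCES = dev.c
libfoo_a_CFLAGS  = -mmcu=atmega8
libbar_a_SOURCES = dev.c
libbar_a_CFLAGS  = -mmcu=atmega328
```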


Give it a try and find out!

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: How to install data / lib in more than 1 place?

2019-12-10 Thread Bob Friesenhahn

On Tue, 10 Dec 2019, Georg-Johann Lay wrote:


Hi, I have an automake file with rules for a lib like this:

__install_dir = $(prefix)/avr/lib/$(AVR_INSTALL_DIR)

avrdir = $(__install_dir)
avrlibdir = $(__install_dir)

avr_LIBRARIES = libdev.a
nodist_libdev_a_SOURCES = ...

What would be the recommended way to install libdev.a also in several 
sub-folders like in $(avrlibdir)/dir1 $(avrlibdir)/dir2 etc.?


Currently, there are a bunch of SUBDIRS which are all generating the same 
binary, it's just that they are installed in a slightly different place.


It seems that Automake supports a 'dir' syntax which allows using the 
standard target suffixes (e.g. _DATA) in an arbitrary way.  This works 
for me:


# Pkgconfig directory
pkgconfigdir = $(libdir)/pkgconfig

# Files to install in Pkgconfig directory
pkgconfig_DATA = foo.pc

This should allow you to have

avrfoodir = $(prefix)/avr/lib/foo

avrbardir = $(prefix)/avr/lib/bar

and then possibly this might work:

avrfoo_LIBRARIES = libfoo.a

avrbar_LIBRARIES = libbar.a

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: BUILT_SOURCES called on `make dist` even if the built sources should not be included in the dist

2019-09-18 Thread Bob Friesenhahn

On Wed, 18 Sep 2019, Jerry Lundström wrote:


With v1.16 this step is executed during `make dist` and using
`EXTRA_DIST` for that whole directory would also mean that _the built
library_ would be included in the dist.


`EXTRA_DIST` only goes so far.  In my own project I use `dist-hook` to 
bundle up foreign subdirectories which I am too lazy to fully 
describe or which are too fluid to bake into Makefile.am.  For 
example:


# Non-Automake subdirectories to distribute
DISTDIRS = locale scripts www PerlMagick TclMagick
dist-hook:
	( \
	  builddir=`pwd` ; \
	  cd $(srcdir) && \
	  ( \
	    for dir in $(DISTDIRS) ; do \
	      find $$dir -depth -print | egrep -v '(~$$)|(/\.hg)|(/\.#)|(/\.deps)|(\.pyc)' \
	        | cpio -pdum $$builddir/$(distdir) 2> /dev/null ; \
	    done \
	  ) \
	)

--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt


Re: BUILT_SOURCES called on `make dist` even if the built sources should not be included in the dist

2019-09-17 Thread Bob Friesenhahn

On Tue, 17 Sep 2019, Jerry Lundström wrote:


I can't really see how this change got approved, isn't the point of
BUILT_SOURCES to be sources built when building!?  Including them into
distributions seems wrong.


I see considerable documentation regarding BUILT_SOURCES at 
"https://www.gnu.org/software/automake/manual/html_node/Sources.html"


It seems that there is a chicken and egg issue that BUILT_SOURCES is 
designed to solve.


"The BUILT_SOURCES variable is a workaround for this problem. A source 
file listed in BUILT_SOURCES is made on ‘make all’ or ‘make check’ (or 
even ‘make install’) before other targets are processed. However, such 
a source file is not compiled unless explicitly requested by 
mentioning it in some other _SOURCES variable."
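As an illustration of that workaround, a minimal hypothetical
Makefile.am might look like this (the file and target names are
invented for the example):

```makefile
# version.h is generated, and foo.c includes it.  Listing it in
# BUILT_SOURCES makes 'make all' create it before any objects are
# compiled, which avoids the chicken-and-egg problem described above.
BUILT_SOURCES = version.h
CLEANFILES = version.h

bin_PROGRAMS = foo
foo_SOURCES = foo.c        # foo.c does '#include "version.h"'

version.h:
	echo '#define FOO_VERSION "1.0"' > $@
```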


Unfortunately, I think it is the BUILT_SOURCES requirement which 
makes my non-recursive build still a bit recursive.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt


Re: Automake 1.16.1: problem with non-gnu make and gnulib

2019-08-25 Thread Bob Friesenhahn

On Sat, 24 Aug 2019, Assaf Gordon wrote:


Hello Bob,

On 2019-08-24 5:26 p.m., Bob Friesenhahn wrote:

On Sat, 24 Aug 2019, Assaf Gordon wrote:


hello_LDADD =  $(top_builddir)/lib/lib$(PACKAGE).a
datamash_LDADD =  $(top_builddir)/lib/lib$(PACKAGE).a

This seems like a bug in those two packages.  It should never be desirable 
to refer to the build directory other than by using the knowledge that the 
build is done using the current directory.


Why do you say so?
Is there a reference to somewhere authoritative with such recommendation?


I should qualify that as it is not desirable to refer to the build 
targets with this syntax.  There are other valid uses other than to 
refer to build targets (e.g. to refer to project include directories 
and other non-target files).  There is nothing authoritative, and it 
is just my own opinion.



"top_builddir" is supposed to point to the build directory,
and similarly there's "abs_top_builddir", as well as some others:
https://www.gnu.org/software/autoconf/manual/autoconf-2.63/html_node/Preset-Output-Variables.html

Why shouldn't they be used?


Adding unnecessary syntax to refer to sources and targets does not 
seem helpful.  Indeed, looking at Makefile.am files I have written, 
the source files in the source directory are listed without an added 
${srcdir} component and targets are listed without any added 
${builddir} or ${top_builddir}.  It would be unhelpful to add such 
syntax and it seems likely Automake would not accept it.


Unfortunately, the 'hello' program is supposed to be a reference example of 
the right things to do.


That, and the fact that it was working fine for many years and 
automake versions, is the reason I ask why you think it should not be 
used.


I do think that Automake should support the syntax (only because it 
worked before) and that old Makefiles should continue working. 
Supporting and promoting are two different things.  It is useful if 
there is a warning when unnecessary syntax is used.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt


Re: Automake 1.16.1: problem with non-gnu make and gnulib

2019-08-24 Thread Bob Friesenhahn

On Sat, 24 Aug 2019, Assaf Gordon wrote:

And indeed,

GNU hello and GNU datamash (which exhibit the problem)
have something like this in their Makefile.am:

hello_LDADD =  $(top_builddir)/lib/lib$(PACKAGE).a
datamash_LDADD =  $(top_builddir)/lib/lib$(PACKAGE).a

While sed and coreutils (which don't have the problem) have:

 sed_sed_LDADD = sed/libver.a lib/libsed.a
 LDADD = src/libver.a lib/libcoreutils.a

So because of the "$(top_builddir)" the targets gets a "./"
prefix, and combined with that recent automake change,
they ended up being considered separated targets by "bmake".


This seems like a bug in those two packages.  It should never be 
desirable to refer to the build directory other than by using the 
knowledge that the build is done using the current directory.



So the immediate fix/work-around is to remove any path from
the libraries listed in xxx_LDADD in the projects themselves.

However,
This change (regression?) seems to come from automake, perhaps
consider a bugfix for future versions.


It would be good if Automake can warn about unnecessary use of 
$(top_builddir) and make any necessary corrections.


Unfortunately, the 'hello' program is supposed to be a reference 
example of the right things to do.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Help with -Werror

2019-04-24 Thread Bob Friesenhahn

On Wed, 24 Apr 2019, Phillip Susi wrote:



For the most part, -Werror is a developer tool which will only cause
problems for users, so my strong recommendation is that it should
never appear in package releases, but not everybody subscribes to
that philosophy...


I am tempted to switch it off... on the other hand, the previous
maintainer probably had a good reason for putting it in, so I would
prefer to just switch it off for BUILT_SOURCES.  Any idea how to do
that?


One possibility would be to add configure support for 
--enable-maintainer-mode and then change the defaults when it is 
enabled.  Another possibility is to add a configure option to enable 
-Werror, which developers should always enable.
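A sketch of the second approach in configure.ac might look like this
(the option and variable names are invented for the example):

```m4
# Opt-in -Werror: off by default so user builds never break on
# warnings, but developers can enable it.
AC_ARG_ENABLE([werror],
  [AS_HELP_STRING([--enable-werror],
     [turn compiler warnings into errors (for developers)])],
  [], [enable_werror=no])
AS_IF([test "x$enable_werror" = "xyes"],
  [WERROR_CFLAGS=-Werror],
  [WERROR_CFLAGS=])
AC_SUBST([WERROR_CFLAGS])
```

A Makefile.am would then add $(WERROR_CFLAGS) to AM_CFLAGS or to
per-target CFLAGS, so the flag only appears when explicitly requested.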


Without -Werror baked in, a maintainer/developer should normally 
strive to eliminate any warnings which do appear.  These are not 
usually difficult to notice when the build is in non-verbose mode.


Unless you have the ability and commitment to test 100% of the 
permutations under which your software will be built, then you should 
not use -Werror in the default build since many users will not know 
what to do if the build fails.


Typically developers use recent vintages of operating systems and 
compilers and they have little exposure to somewhat older operating 
systems and compilers where a warning might appear.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Stopping unit test on failed partial ordering dependency

2019-04-24 Thread Bob Friesenhahn

On Tue, 23 Apr 2019, Kip Warner wrote:


Now suppose the TestStartDaemon.sh bails for some reason. There is no
point on doing all the queries and having them all fail. There has to
be a sane way of handling this scenario.


Having all the dependent tests fail seems ok, as long as the failure 
is expedient.  Unless the tests are reasonably ordered, it may not be 
obvious to the user why so many tests are failing.


If there are normally 1200 tests run but the user sees only two tests 
run, and one of them fails, then the user might not be aware of the 
many tests which have not been executed.  The tests which fail still 
need to be reported as failed.


One reason why I like the TAP testing framework is that a test script 
has execution context across multiple tests (avoiding repeated 
stop/start), which improves control over the test, and executes more 
efficiently.  You can start your server at the beginning of the test 
script and then execute 500 tests on it via the same script, while 
terminating the server at the end of the script.
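The shape of such a TAP script can be sketched as follows (a minimal
example where a backgrounded 'sleep' stands in for a real server
daemon; a real script would start the actual server and run real
queries):

```shell
#!/bin/sh
# Start one long-lived process, run several TAP tests against the
# same instance, and stop it once at the end via the EXIT trap.
echo "1..2"
sleep 30 &
server_pid=$!
trap 'kill $server_pid 2>/dev/null' EXIT

# Test 1: the "server" came up
if kill -0 "$server_pid" 2>/dev/null; then
  echo "ok 1 - server started"
else
  echo "not ok 1 - server started"
fi

# Test 2: another check against the same running instance
if kill -0 "$server_pid" 2>/dev/null; then
  echo "ok 2 - server still answering"
else
  echo "not ok 2 - server still answering"
fi
```

The plan line ("1..2") lets the TAP harness notice if the script dies
before producing all of its expected results.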


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: Stopping unit test on failed partial ordering dependency

2019-04-23 Thread Bob Friesenhahn

On Mon, 22 Apr 2019, Kip Warner wrote:


How can I solve this problem?


By using the TAP test framework you could organize your related tests 
into ordered scripts which assure that test construction/destruction 
is orderly even if some tests fail.  This approach may lose 
parallelism if there are not enough independent TAP tests to keep the 
system busy.  TAP tests do need to produce the expected number of test 
report messages, even if a prerequisite has failed.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: AC_ARG_ENABLE and checking for unrecognized switches

2019-03-15 Thread Bob Friesenhahn

On Fri, 15 Mar 2019, Kip Warner wrote:


https://www.gnu.org/software/autoconf/manual/autoconf-2.68/html_node/Option-Checking.html

My reading is that there *is* checking by default, but it is turned
off if you have a subdir configure, but then can be turned back on
again by the user.


Good eye, Tom. The only problem here is my configure.ac doesn't use a
AC_CONFIG_SUBDIRS because there are no child configure.ac scripts to
run, so by default that checking should have been enabled. It's
possible I could be doing something wrong though.


A project can be made subordinate to another project without the 
author of the subordinate project being aware of it.  This is a very 
useful capability.  This capability is used by projects such as GCC.


The Autotools philosophy is to provide as much freedom as possible to 
the end user while working in a consistent way.  This would include 
the case where a project is created which builds several other 
projects.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: automake 1.16

2018-12-26 Thread Bob Friesenhahn

On Wed, 26 Dec 2018, David wrote:


here I sit, after hours of finding and adding libs, thinking I am ready to 
compile automake 1.16
cd ~/tocompileautomake

error: this file was made for automake 1.16, you have 1.14.  Who is the super 
genius who decided this?  How is the software supposed to be compiled if ya 
need the software to compile itself?  Why?


Perhaps you failed to follow the provided instructions or you are the 
first person in the entire world who has encountered this problem.


How did you acquire Automake 1.16?  Was it from a release tarball or 
from a git clone?


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/
Public Key, http://www.simplesystems.org/users/bfriesen/public-key.txt



Re: pkg-conf and LD* variables

2018-10-28 Thread Bob Friesenhahn

On Sat, 27 Oct 2018, Harlan Stenn wrote:


pkg-conf offers the following ways to get "link flags":

--libs          All link flags
--libs-only-L   The -L/-R stuff
--libs-only-l   The -l stuff
--static        Output libraries suitable for static linking

and then says the union of -only-L and -only-l may be smaller than the
output of -libs because the latter might include flags like -rdynamic.

We're already having to parse the output of pkg-config to try and
separate LDADD stuff from LDFLAGS stuff, and this is before we start to
address static v. dynamic things.

This is because we really don't want -L stuff in LDADD, because we end
up with things like:

No rule to make: -L /usr/local/lib


I see the same problems.  It would have been useful if the inventors 
of pkg-config read the GNU coding standards, the GNU make manual, and 
the GNU Autoconf and Automake manuals, before they started.


As soon as header files and libraries need to be found in multiple 
directories the wheels come off since there is total loss of control 
of where the files are searched for.


A common scenario for me is that newer libraries are in one directory 
tree (e.g. under /usr/local) with older libraries in other directory 
trees (e.g. part of an OS distribution).


PKG_CONFIG_PATH is not powerful enough given that the options are 
muddled together.


GraphicsMagick also makes sure that options are put in the correct 
build variables.  I don't see a way to accomplish this other than via 
parsing pkg-config output and trying to make the best of it.
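A rough sketch of that kind of splitting, using a hard-coded flag
string in place of real `pkg-config --libs` output (the module and
flag values are invented for the example):

```shell
#!/bin/sh
# Split a pkg-config --libs style string: -l options are linker
# inputs (LIBS/LDADD material), everything else (-L, -R, -rdynamic)
# belongs with the linker flags (LDFLAGS).
pc_libs="-L/usr/local/lib -rdynamic -lfoo -lbar"
LIBS=; LDFLAGS=
for flag in $pc_libs; do
  case $flag in
    -l*) LIBS="$LIBS $flag" ;;
    *)   LDFLAGS="$LDFLAGS $flag" ;;
  esac
done
echo "LIBS:$LIBS"
echo "LDFLAGS:$LDFLAGS"
# Prints:
#   LIBS: -lfoo -lbar
#   LDFLAGS: -L/usr/local/lib -rdynamic
```

This is only a best-effort heuristic, which is exactly the complaint
above: the -L search order across multiple installation trees is still
out of the user's control.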


The GNU standard is that the user (the person controlling the build) 
should have control over the configuration.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: automatically showing test-suite.log on failure?

2018-09-30 Thread Bob Friesenhahn

On Sat, 29 Sep 2018, Karl Berry wrote:


   I might be missing something, but I get that behavior in my automatic
   builds by calling 'make check VERBOSE=1'.

Yes! Thank you!!


This does not appear to have any effect when using the TAP framework.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



RE: PTHREAD_CFLAGS provided by AX_PTHREAD where best to be attached?

2018-09-14 Thread Bob Friesenhahn

On Fri, 14 Sep 2018, Dudziak Krzysztof wrote:


Actually the question asked initially here does not seem to arise, as 
preprocessing and compilation are a single step.  Both ...CFLAGS and 
...CPPFLAGS go to that build stage.  At most the order matters, i.e. their 
placement in the gcc command line.

The question asked initially might arise if the objective is to replace the 
ax_pthread-generated variables with gcc's -pthread option.  gcc's manual 
describes it as an option of the gcc driver, which translates it and passes 
it on in preprocessor, compiler and linker flags.  But how can -pthread be 
used in Makefile.am if the makefile carries out building in two steps, 
compilation then linking (with libtool in between, which possibly is 
irrelevant here)?  I believe automake-generated makefiles work that way by 
default.


Libtool attempts to use the C compiler when linking, if at all 
possible.  In this case the C compiler is also supplied with CFLAGS 
content.  If the C compiler is not used for linking, then libtool 
needs to assure that options passed to the linker are compatible with 
it, and avoid passing options which might cause it to fail.


Libtool does remember if the -pthread option (or equivalent) was 
supplied due to storing this information in the ".la" files it writes.


It is not good to only refer to GCC since there are many other 
compilers.  For some compilers (e.g. on IBM AIX), a different compiler 
program needs to be selected in order to support pthreads.
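A hedged Makefile.am sketch of one way to apply the AX_PTHREAD results
(PTHREAD_CFLAGS, PTHREAD_LIBS, and PTHREAD_CC are the variables
AX_PTHREAD substitutes; the target name 'foo' and the -I/-D values are
invented for the example):

```makefile
# Driver options like -pthread stay in per-target CFLAGS, where they
# also reach the libtool link step (which uses the compiler driver).
# CPPFLAGS carries only -I/-D/-U options for the preprocessor.
foo_CPPFLAGS = -DUSE_THREADS -I$(top_srcdir)/include
foo_CFLAGS   = $(PTHREAD_CFLAGS)
foo_LDADD    = $(PTHREAD_LIBS)

# On platforms where a different compiler program is required for
# pthreads (e.g. AIX), configure.ac would set CC="$PTHREAD_CC".
```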


Bob



krzysiek

-----Original Message-----
From: Bob Friesenhahn [mailto:bfrie...@simple.dallas.tx.us]
Sent: Thursday, 12. July 2018 20:28
To: Dudziak Krzysztof 
Cc: automake@gnu.org
Subject: Re: PTHREAD_CFLAGS provided by AX_PTHREAD where best to be attached?

On Thu, 12 Jul 2018, Dudziak Krzysztof wrote:


Hi,
For makefiles which build piece of software in two separate steps:
compilation then linking, will it be better to attach  PTHREAD_CFLAGS to 
higher-level_CPPFLAGS than to higher-level_CFLAGS variable?
Autotools along with libtool are used here, user-level variables (CFLAGS, 
CPPFLAGS) are not manipulated through whole build chain.
I can manipulate variables only at Makefile.am -level.


CPPFLAGS is for the pre-processor while CFLAGS is applied (as well as
CPPFLAGS) while compiling.  This means that putting the options in CFLAGS does 
work, although it would be better to deduce which options are for the 
pre-processor or the compiler and put them in the correct places.


I learned that gcc expects the -pthread option, which translates among other
things to a preprocessor define, so I wonder whether it would be better to
attach PTHREAD_CFLAGS to a higher-level_CPPFLAGS rather than to a 
higher-level_CFLAGS.


Putting -pthread in CPPFLAGS should not work, since it is a compiler-driver 
option and not a pre-processor (e.g. /usr/bin/cpp) option.

Only -I, -D, and -U type options (see the manual page for 'cpp') should be put 
in CPPFLAGS.
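A minimal Makefile.am sketch of that split, assuming AX_PTHREAD has substituted PTHREAD_CFLAGS and PTHREAD_LIBS (the program name and include path are hypothetical):

```
bin_PROGRAMS = frobnicate
frobnicate_SOURCES = frobnicate.c

# -pthread is a compiler-driver option, so it belongs with the compiler
# flags and the link, not in AM_CPPFLAGS.
frobnicate_CFLAGS = $(PTHREAD_CFLAGS)
frobnicate_LDADD  = $(PTHREAD_LIBS)

# Only -I/-D/-U style options go to the pre-processor.
AM_CPPFLAGS = -I$(top_srcdir)/include -DUSE_THREADS=1
```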

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, 
http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/






--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: automatically showing test-suite.log on failure?

2018-09-12 Thread Bob Friesenhahn

On Wed, 12 Sep 2018, Karl Berry wrote:


However, this seems like it would be fairly commonly useful and easy
enough to do in the canonical test-driver script. So, any chance of
adding it as a standard feature? Any reasonable way of enabling it would
be fine, e.g., a flag that can be added to [AM_]LOG_DRIVER_FLAGS.


Take care since some test logs could be megabytes in size.


BTW, as with that post, I'm primarily interested in this because of
automated build environments where all that is (easily) seen is the log.
Secondarily, in a big build tree, and with srcdir!=builddir, it can be
annoying just to navigate to the correct test-suite.log file. Thus it
would be nice to just have it up front.


It would be good to be able to enable this via a standard configure 
script option.
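Absent a new Automake feature, one way to get this today is a small wrapper rule in Makefile.am that dumps the log when the suite fails (a sketch; the target name is made up, and test-suite.log lives in the build directory):

```
# Hypothetical convenience target: run the suite and print the log on failure.
check-show-log:
	$(MAKE) $(AM_MAKEFLAGS) check || { cat test-suite.log; exit 1; }
```

With the parallel test harness, `make check VERBOSE=1` is also documented to emit test-suite.log when the suite fails; worth verifying against the Automake version in use.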


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: PTHREAD_CFLAGS provided by AX_PTHREAD where best to be attached?

2018-07-12 Thread Bob Friesenhahn

On Thu, 12 Jul 2018, Dudziak Krzysztof wrote:


Hi,
For makefiles which build a piece of software in two separate steps, compilation 
then linking,
would it be better to attach PTHREAD_CFLAGS to a higher-level_CPPFLAGS than to 
a higher-level_CFLAGS variable?
Autotools along with libtool are used here; the user-level variables (CFLAGS, 
CPPFLAGS) are not manipulated throughout the build chain.
I can manipulate variables only at the Makefile.am level.


CPPFLAGS is for the pre-processor, while CFLAGS is applied (along with 
CPPFLAGS) while compiling.  This means that putting the options in 
CFLAGS does work, although it would be better to deduce which options 
are for the pre-processor and which for the compiler, and put them in 
the correct places.



I learned that gcc expects the -pthread option, which translates among other 
things to a preprocessor define,
so I wonder whether it would be better to attach PTHREAD_CFLAGS to a 
higher-level_CPPFLAGS rather than
to a higher-level_CFLAGS.


Putting -pthread in CPPFLAGS should not work, since it is a 
compiler-driver option and not a pre-processor (e.g. /usr/bin/cpp) 
option.


Only -I, -D, and -U type options (see the manual page for 'cpp') 
should be put in CPPFLAGS.
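For completeness, the configure.ac side usually looks something like this when using the AX_PTHREAD macro from the Autoconf Archive (a sketch based on that macro's documented usage; adjust to taste):

```
# configure.ac fragment: detect pthread support, and pick the right
# compiler driver where one is required (e.g. on AIX).
AX_PTHREAD([
  LIBS="$PTHREAD_LIBS $LIBS"
  CFLAGS="$CFLAGS $PTHREAD_CFLAGS"
  CC="$PTHREAD_CC"
], [AC_MSG_ERROR([POSIX threads support is required])])
```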


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Makefile.in, LIBTOOL and shared/static builds.

2018-06-25 Thread Bob Friesenhahn

On Mon, 25 Jun 2018, or...@fredslev.dk wrote:


I’m curious - it’s neat that slibtool exists, but if you need
functionality offered by libtool then why not just use libtool?


Frankly, libtool is a 12,000+ line shell script which is hard to
understand, maintain, and fix, and it is rather slow.


Agreed.


A while ago, with my 6-core AMD FX-6350 CPU, I tested building
libtomcrypt with both libtool and slibtool using six make jobs. This
program heavily uses libtool, but does not use autotools.


[ results stuff removed ]

The overhead attributed to libtool seems rather high.  Is there 
something about your execution environment or your libtool usage which 
causes more overhead than usual?



Here with a simple hello world I am seeing 1682 system calls with
libtool and 345 with slibtool.


Agreed.  A shell script will always use more system calls and 
overhead than a simple utility, since shell scripts are composed of 
many calls to simple (or complex) utilities.



One example I recall is where I found a configure.ac which was
incorrectly using sed on the output of some internal commands and
feeding the compiler bogus flags. Libtool silently swallowed the bug
and moved on where slibtool helpfully printed the error, allowed the
bug to be found and then fixed.


How does slibtool validate arguments?  Does it understand the specific 
set of arguments allowed by the compiler/linker being used?



Additional benefits are that installing the .la files is optional (Most
programs will not break if they are missing) and that there is only one
slibtool binary for all the supported targets and target flavors.


What are the supported targets and target flavors?

Does slibtool discard most of the portability (host OSs and toolchains) 
offered by the libtool script?



My understanding is that the problem here is that slibtool works at a lower
level than libtool and as a result is not able to rely on information in
files like Makefile.in the way libtool can. This should be relatively
easy to fix in automake (?) by making it possible to obtain this
information more easily.


Usually it is possible to substitute Makefile text to replace the 
default rules used by Automake.  This should allow changing how 
Automake invokes the tool in order to pass additional arguments.  Have 
you tried that?


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: automake compile problems

2018-05-30 Thread Bob Friesenhahn

On Wed, 30 May 2018, Eric Blake wrote:


On 05/30/2018 04:07 AM, Andy Armstrong wrote:

Hi all,


I want to compile version 1.16 of automake.


On which system?


Must be IBM, and perhaps z/OS:

https://www.ibm.com/support/knowledgecenter/SSLTBW_2.3.0/com.ibm.zos.v2r3.bpxa800/fsum9432.htm

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Python2 and Python3 checks

2018-03-22 Thread Bob Friesenhahn

On Thu, 22 Mar 2018, Matěj Týč wrote:


On 21.3.2018 22:34, Bob Friesenhahn wrote:

On Wed, 21 Mar 2018, Matěj Týč wrote:

The question stands like this: Is there a demand on automake side to fix 
this issue - to allow developers addressing multiple Python interpreters 
of different major versions? If so, I think that I can come up with some 
patches to get this working.


Is there a purpose to this macro from an Automake (creating Makefiles) 
standpoint?  Does Automake offer special Python module building support 
which depends on the Python version?


The majority of packages that use Autotools are C/C++ libraries, and they may 
want to build bindings for Python. As Python2 is the only supported Python 
in, e.g., RHEL7, but it is also possible to obtain Python3, there will be 
demand for this for years to come, since for the developer, supporting 
bindings for multiple Python major versions is a relatively cheap task.


RHEL7 is very conservative.  I have heard that some major 
distributions are now standardizing/defaulting to Python3, although 
they allow installing Python2.


The ability to specify a maximum version sounds useful but it is difficult 
to foretell the future and a package using the feature might be adding an 
artificial limitation which eventually leads to failures because the 
requested version range is no longer in use.
Again, the main target group is library developers providing bindings. 
Projects with lots of Python code will not use Autotools at all. Therefore, 
it is not so much about capping the version, but it is about distinguishing 
that there may be more than one major Python version that needs to be dealt 
with in the build and install process.


You make a good point that it is possible that a package will want to 
discover multiple Python versions and build extensions for each 
version found.  Is this something you would like to support?


The question to be answered is if updating Automake's macro is the 
best course (requiring updating Automake to the version providing the 
macro in order to use it) or if a macro in something like the Autoconf 
macro archive is the safer approach.


A stable distribution may not want to update the Automake version for 
a few years but they might also want to re-autotool packages while 
building them.  In this case, a cached .m4 file with the macro will 
still work while depending on a macro from a latest Automake won't 
work because the distribution has chosen not to be up to date.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: Python2 and Python3 checks

2018-03-21 Thread Bob Friesenhahn

On Wed, 21 Mar 2018, Matěj Týč wrote:

The question stands like this: Is there a demand on automake side to fix this 
issue - to allow developers addressing multiple Python interpreters of 
different major versions? If so, I think that I can come up with some patches 
to get this working.


Is there a purpose to this macro from an Automake (creating Makefiles) 
standpoint?  Does Automake offer special Python module building 
support which depends on the Python version?


If the issue is just to find a particular Python interpreter version 
(an Autoconf configure task), then it is likely that there are other 
macros already existing which do this better.


Autotools is in it for the long haul since parts of it have been in 
use since 1994.  The issue of Python 2 vs Python 3 will eventually be 
resolved and forgotten.  Anything added today to solve what should be 
a short-term problem will be an encumbrance or problem in the future.


The ability to specify a maximum version sounds useful but it is 
difficult to foretell the future and a package using the feature might 
be adding an artificial limitation which eventually leads to failures 
because the requested version range is no longer in use.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: manual: Why use 'maude' as the example program name?

2018-02-25 Thread Bob Friesenhahn
I think that we should have respect for the author's dog. 
Disrespecting the author's dog is not far from disrespecting the 
author.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: _SOURCES files in sub-directories lead to make distdir failure

2018-01-24 Thread Bob Friesenhahn

On Wed, 24 Jan 2018, netfab wrote:


On 24/01/18 at 14:13, Bob Friesenhahn wrote:

Have you made sure that the distribution tarball is packaging up this
header file?


How do I check this?  I do not have any tarball.
Make distcheck should create the tarball, and unpack it to run the
build in a separate directory.


Make distcheck will create a tarball (the 'make dist' part) before it 
moves on to subsequent steps.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: _SOURCES files in sub-directories lead to make distdir failure

2018-01-24 Thread Bob Friesenhahn

On Wed, 24 Jan 2018, netfab wrote:


Ok, killed SUBDIRS :


https://framagit.org/netfab/GLogiK/commit/c3a88a89b486aa45e9f6b462e11b4f2c96812709

But, still the same error when running make distcheck :


make[2]: Entering directory '/home/netfab/dev/projects/GLogiK/build'
 make[2]: *** No rule to make target 'src/lib/dbus/messages/GKDBusMessage.h ', 
needed by 'distdir'. Stop.


Have you made sure that the distribution tarball is packaging up this 
header file?


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: MinGW and "warning: implicit declaration of function _spawnv"

2017-11-14 Thread Bob Friesenhahn

On Tue, 14 Nov 2017, Jeffrey Walton wrote:


What does spawnv have to do with _XOPEN_SOURCE?  Isn't spawnv a Microsoft
Windows-specific function and _XOPEN_SOURCE is a Unix thing?


I think spawn is POSIX, and it's for systems that don't have fork.
posix_spawn and _spawn are deprecated on MS platforms. They use an ISO
C++ version called spawnl. Also see
https://docs.microsoft.com/en-us/cpp/c-runtime-library/reference/spawnl-wspawnl.


I was not aware of that.


It's been my experience that when something POSIX goes missing when
using Newlib, the first thing to try is _XOPEN_SOURCE=500 or
_XOPEN_SOURCE=600. _XOPEN_SOURCE sets _POSIX_SOURCE and a couple of
others, like some X11 defines.


The problem with _XOPEN_SOURCE and _POSIX_SOURCE is that in addition 
to enabling features, they also disable all the features which are not 
defined by that standard.  If the OS does not support the requested 
version, then none of the expected features are available.  Usually 
there is a better way.


Regardless, Windows only conformed to the 1989 version of POSIX so 
that they could say that Windows NT was compliant with the 
specification and sell it into government contracts.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: MinGW and "warning: implicit declaration of function _spawnv"

2017-11-14 Thread Bob Friesenhahn

Or maybe something else?


It looks like the extra files provided by libtool are the problem. It
looks like the libtool folks need to add _XOPEN_SOURCE=600 to their
preprocessor definitions. I'm guessing they only test with a C
compiler, and not a C++ compiler and Newlib combination.


What does spawnv have to do with _XOPEN_SOURCE?  Isn't spawnv a 
Microsoft Windows-specific function and _XOPEN_SOURCE is a Unix thing?


You are on the right track that CFLAGS options or additional 
pre-processor definitions are likely needed.


It is my understanding that the MinGW headers and library may support 
multiple CRT versions but the default is the CRT version assured to 
come with any Windows system, which leaves many functions out.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Should Automake still support Java?

2017-10-30 Thread Bob Friesenhahn
I do not see the point in supporting compiled Java in Automake.  The 
whole point of Java was that it can run in a VM.  GNU support for 
compiled Java seems to have faltered.  Although much useful work was 
done, people just did not start using compiled Java.  The free 
software world seems to have adopted OpenJDK 
(https://en.wikipedia.org/wiki/OpenJDK, http://openjdk.java.net/) and 
this is not even likely supported by Automake.


As someone who attempted to completely support Automake's Java tests, 
compiling everything from scratch (including the compiler), I can 
attest that it is a major PITA.  I did mostly succeed but there was a 
Java runtime extracted from Eclipse that I never did get working 
correctly given that there was no useful documentation for how to make 
it work.


Please note that I am not a Java developer.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: automake: tap-driver.sh: cleanup and revival.

2017-10-27 Thread Bob Friesenhahn

On Fri, 27 Oct 2017, Warren Young wrote:


On Oct 27, 2017, at 7:19 AM, Bob Friesenhahn <bfrie...@simple.dallas.tx.us> 
wrote:


On Fri, 27 Oct 2017, Warren Young wrote:


The operating system has a database mapping what my terminal can do to a common 
API.  Let the library handle it.


I should point out that the "operating system" does not necessarily have what 
you describe.


Yes, I’m aware that I’m handwaving away the question of how a shell 
script talks to ncurses or whatever.  (Too bad POSIX never nailed 
that one down.)


Yes, and the current build might be the ncurses build (possible 
chicken and egg situation).


In the classic self-hosted build situation the platform should only 
need to provide a certain level of POSIX and other classic standards 
support.  Then it should be possible to build the rest of the 
operating system and user applications starting from that point. 
This is how the GNU Project got to where it is today.


It should not be necessary to have a fully-populated system in order 
to build software.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: automake: tap-driver.sh: cleanup and revival.

2017-10-27 Thread Bob Friesenhahn

On Fri, 27 Oct 2017, Warren Young wrote:


This thread is about color.  My Link MC3+ doesn’t even *have* color, 
but it will do bold, underline, and blink.  Why should Autoconf care 
what my terminal can do?  The operating system has a database 
mapping what my terminal can do to a common API.  Let the library 
handle it.


I should point out that the "operating system" does not necessarily 
have what you describe.


Autoconf and Automake support many environments, including some 
which lack a terminal database.


Indeed, Autoconf and Automake may be configuring/building software 
which adds support for a terminal database.


This does not mean that it can't be supported on systems which do have 
terminal information, but it makes the task somewhat more challenging.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: automake: tap-driver.sh: cleanup and revival.

2017-10-25 Thread Bob Friesenhahn

On Wed, 25 Oct 2017, Mathieu Lirzin wrote:


As for the portability of ANSI terminal escape codes, it’s still best
to delegate such things to curses or libraries like it, despite the
near ubiquity of ANSI-family terminal emulators.


Colors are already automatically used when possible [1] and can be
disabled with the AM_COLOR_TESTS environment variable.


Yes, and in fact I am seeing colors in my TAP test output, and the 
colors are readable against my current terminal background.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: automake: tap-driver.sh: cleanup and revival.

2017-10-25 Thread Bob Friesenhahn

On Wed, 25 Oct 2017, Warren Young wrote:


On Oct 25, 2017, at 8:56 AM, Bob Friesenhahn <bfrie...@simple.dallas.tx.us> 
wrote:



It's also crazy that "--color-tests=y" or "--color-tests=1" won't work


While I like color in photos and nature, automatic colorization of 
output often causes problems for me since it often renders the text 
unreadable.


Why, exactly?


There are always assumptions about the terminal colors.  It is often 
assumed that the terminal background is black but I use a light color 
background.  Bright greens and cyans are virtually illegible for me.


Not all displays behave the same, so what looks good on one display 
may be illegible on another.


I ask because the default color scheme of some terminal emulators 
makes the dark blue on black text difficult for me to read, but 
that’s easily fixed by switching the color scheme.


Switching the color scheme takes valuable time.  When one visits many 
different remote systems using various client systems, it is a losing 
proposition to attempt to customize colors.


Well-behaved programs (e.g. GNU ls --color=auto, colordiff…) 
suppress color output when stdout is not a terminal.  They do that 
by making calls like isatty(3) if written in C or test(1) -t if 
written in shell.


I sometimes make a typescript using 'script' and in this case colors 
will be output.  Other times I use 'tee' to capture output while still 
seeing what is going on.
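The well-behaved default described above boils down to an isatty check; the shell-level equivalent is test's -t operator, as in this minimal sketch:

```shell
# Print in green only when stdout is a terminal; emit plain text when
# piped (e.g. through 'tee', or when captured by 'script').
if [ -t 1 ]; then
  printf '\033[32m%s\033[0m\n' 'PASS'
else
  printf '%s\n' 'PASS'
fi
```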


As for the portability of ANSI terminal escape codes, it’s still 
best to delegate such things to curses or libraries like it, despite 
the near ubiquity of ANSI-family terminal emulators.


Agreed.  Yet, a mismatch wastes time.  I would rather use something 
which works 100% of the time and looks less pretty than something 
which works 80% of the time but can be made to look immaculate.


Defaulting to being usable 100% of the time works for me. :-)

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: automake: tap-driver.sh: cleanup and revival.

2017-10-25 Thread Bob Friesenhahn

On Tue, 24 Oct 2017, Mike Mestnik wrote:


I'm reading this script closely because I'd like to make a few changes
and am surprised it hasn't received any edits since 2012.  I think the


Perhaps it has been working perfectly since 2012.  Or perhaps no one 
has volunteered to improve it.


I (using this script since October, 2012) did not realize that there 
was a problem with the script until today.



output is missing ratio of *test's failed, currently only file level
stats are presented not test level, and also the following options
from prove.

--verbose Print all test lines.
--failuresShow failed tests.

* This to me is a key feature of TAP and this script silently discards
the most useful parts of the protocol.


It would be useful if there was more test summary output.


Some things I've spotted that are driving me nuts.
http://git.savannah.gnu.org/cgit/automake.git/tree/lib/tap-driver.sh#n100
Clearly indicates that "--color-tests=yes" is the only way to enable
color, there is no auto-detecting a PTY or any support for the
documented "--color-tests=always"

It's also crazy that "--color-tests=y" or "--color-tests=1" won't work,
and likewise "--comments"/"--no-comments" seems out of place, or rather
suggests a better way to implement Boolean arguments.


While I like color in photos and nature, automatic colorization of 
output often causes problems for me since it often renders the text 
unreadable.  It also introduces terminal escape codes into the text 
output, which can cause issues if captured output is later displayed 
on something other than a ANSI terminal.


If there is color support, it should be disabled by default and 
enabled by the user.



Is there much interest in keeping this script the way it is, or can I
lobotomize it and release it under a new name?  An "ng" suffix is popular
these days: tapng-driver.sh or tap-driverng.sh.


It is normal for projects which use this script to copy the script 
into their own source tree.  In this case you are free to name it 
anything you like.


If you have implemented improvements then you should submit a patch to 
the Automake project.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Automake 1.16

2017-09-17 Thread Bob Friesenhahn

On Sun, 17 Sep 2017, Thomas Martitz wrote:


Hello,

are there any plans to release Automake 1.16?

Currently, there is still a nasty bug in 1.15 that is fixed in the 
development code (I'm talking about the "foo_SOURCES containing variables 
like $(srcdir)/foo.c" one). Also, I would like to see a release that ships my 
shorter object file name work. Both issues are raised from time to time, so a 
release would be welcome.


At last, 1.15 is rather old by now. A fresh release would be a good sign of 
life.


I just used ftp to ftp.gnu.org and used 'ls -ltr' of the 
/pub/gnu/automake directory to check upload dates and see that the 
latest Automake is 1.15.1 dated June 19, 2017.  This is not very old. 
Does it still have the bug which is causing a problem for you?


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Warning - non-POSIX variable name

2017-08-09 Thread Bob Friesenhahn

On Tue, 8 Aug 2017, Warren Young wrote:


On Aug 7, 2017, at 11:39 PM, Dharmil Shah <dysha...@gmail.com> wrote:


I followed your suggestions and hence I am able to compile my package using
the llvm compiler.


I hope you chose option 3.

On re-reading that email, I think option 2 is wrong, or at least, it isn’t what 
I meant to write:

   $ ./configure CC=../../llvm-arm-toolchain-ship/3.8/bin/clang

The main problem with option 2 is that it requires you to give that long 
command on each manual re-configure.  It’s better to code the clang-seeking 
rules into an autoconf macro, if only because you will eventually move on to 
Clang 3.9+.


Passing a relative path to CC seems error prone since it only applies 
to the current working directory and will fail as soon as any Makefile 
recurses.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: How to add my own rules?

2017-07-07 Thread Bob Friesenhahn

On Fri, 7 Jul 2017, Victor Porton wrote:


I found:

all-local: ...
...

clean-local: ...
...


You can also override any rule generated by Automake with your own 
rule with the same target name.
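To make the two mechanisms concrete: the -local hooks run in addition to the generated rules, while a rule with the same name as a generated target replaces it. A Makefile.am sketch (version.h and the VERSION file are hypothetical):

```
# Hook: runs in addition to Automake's generated 'all' rule.
all-local: version.h

# Generate a header from a plain-text VERSION file.
version.h: $(top_srcdir)/VERSION
	printf '#define PACKAGE_VERSION_STRING "%s"\n' \
	  "`cat $(top_srcdir)/VERSION`" > $@

# Hook: extend 'clean' to remove the generated header.
clean-local:
	rm -f version.h
```

By contrast, writing your own rule named, say, `install-data-am` would replace Automake's generated rule outright (Automake will warn about the override).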


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Creating a link with automake

2017-01-20 Thread Bob Friesenhahn

On Fri, 20 Jan 2017, Bernhard Seckinger wrote:


I've got a program, that contains some php-script frontend (cli not web)
(and other stuff, which is irrelevant here). I've put the php-scripts into
$pkgdatadir. Now I'd like to have a link from $bindir to the main script i.e.

ln -s ${pkgdatadir}/croco.php ${bindir}/croco

To do this I've added the following to the Makefile.am:

install-exec-local:
   mkdir -p ${bindir}
   ln -s ${pkgdatadir}/croco.php ${bindir}/croco

When using "make install" this works. But when I run "make distcheck" I get an
error telling me that I'm not allowed to create ${bindir}. I've already
tried to replace the mkdir command with

   ${srcdir}/../install-sh -d ${bindir}

which is probably architecture-independent, but I still get a similar error.

Does anyone know, how to do this?


You need to add support for the DESTDIR environment variable, which 
specifies an alternate directory path to install into.  Perhaps this 
will work:


install-exec-local:
	$(MKDIR_P) $(DESTDIR)$(bindir)
	ln -sf $(pkgdatadir)/croco.php $(DESTDIR)$(bindir)/croco

Note that DESTDIR is prepended only to the paths being created; the symlink 
target itself must point at the final installed location, otherwise the 
staging path would leak into the link.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Automatic recompile when flags change

2016-10-20 Thread Bob Friesenhahn

On Thu, 20 Oct 2016, Pauli . wrote:


In many cases a partial build with differing flags can result in hard-to-debug
runtime bugs. To avoid issues when compiler flags change, the
build system could automatically rebuild files when flags change. But
so far automake doesn't support flag-change checking.


Why do compiler flags change?  Did you re-run configure with different 
CPPFLAGS/CFLAGS or are these flags induced on the make command line or 
via environment variables?
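For reference, the usual hand-rolled workaround is a stamp file that records the flags and is updated only when they differ; this is a sketch for a plain Makefile (OBJECTS is a placeholder for the object list — wiring this into Automake's generated rules is harder):

```
# Re-record the flags whenever they differ from the last recorded set;
# objects depending on the stamp are then rebuilt.
flags.stamp: force
	@echo '$(CC) $(CPPFLAGS) $(CFLAGS)' | cmp -s - $@ 2>/dev/null \
	  || echo '$(CC) $(CPPFLAGS) $(CFLAGS)' > $@

# 'force' never exists as a file, so the stamp recipe runs on every make.
force: ;

$(OBJECTS): flags.stamp
```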


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: Automake passing CFLAGS or CXXFLAGS when linking

2016-07-01 Thread Bob Friesenhahn

On Tue, 28 Jun 2016, Grégory Pakosz wrote:


Hello,

What's the rationale behind Automake passing CFLAGS or CXXFLAGS when linking?

LINK = $(LIBTOOL) $(AM_V_lt) --tag=CC $(AM_LIBTOOLFLAGS) \
 $(LIBTOOLFLAGS) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \
 $(AM_LDFLAGS) $(LDFLAGS) -o $@


Notice that these lines all start with $(LIBTOOL).  Libtool will 
normally discard flags not needed for linking.  It is common for 
libtool to link using the C compiler when possible and so the C 
compiler can also use/discard options as needed.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/


Re: [libtool] make install fails after an incremental build after a prefix change?

2015-12-30 Thread Bob Friesenhahn

On Wed, 30 Dec 2015, Kees-Jan Dijkzeul wrote:


On Tue, Dec 29, 2015 at 9:14 PM, Gavin Smith <gavinsmith0...@gmail.com> wrote:


You want not to rebuild files that don't need to be rebuilt.


Although I tend to agree, we may differ on opinion on the importance
of this. I'd argue that it is much more important to not forget
rebuilding files that actually needed to be rebuilt. The only good
thing I can say about the current behaviour is that, at least, it is
not failing silently.


In order to successfully function, autotools must make several basic 
assumptions.  For example, it must assume that the compilation 
environment does not change since the configure script was run, and it 
must assume that the configuration is not changed while there are 
already built files present.  Automake depends on portable make 
constructs so only so much can be done.


Your concern is unlikely to be addressed so it is wise to deal with 
the current behavior and make adjustments to your build processes.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: [libtool] make install fails after an incremental build after a prefix change?

2015-12-29 Thread Bob Friesenhahn

On Tue, 29 Dec 2015, Kees-Jan Dijkzeul wrote:


Hi,

On my buildserver, I'd like to do something along the lines of

# build 1
./configure --prefix=${HOME}/opt/1 && make && make install
# build 2 (incrementally)
./configure --prefix=${HOME}/opt/2 && make && make install

The second make install fails for some libraries, with the message
libtool: install: error: cannot install `' to a directory
not ending in ${HOME}/opt/1

Apparently, libname.la needs to be rebuilt after the prefix changed,
but automake didn't generate the rules to do so.

Any tips? Is this at all possible?


Try adding 'make clean' to your build steps.

The best thing to do is to build outside of the source tree and use a 
separate build directory for each configure incantation.
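Concretely, the separate-build-directory approach looks like this (an out-of-source/VPATH build sketch; the paths are examples):

```
# One pristine build tree per prefix, run from the top of the source tree.
mkdir -p build-1 build-2
( cd build-1 && ../configure --prefix="$HOME/opt/1" && make && make install )
( cd build-2 && ../configure --prefix="$HOME/opt/2" && make && make install )
```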


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/



Re: [gnu-prog-discuss] Automake dist reproducibility

2015-12-22 Thread Bob Friesenhahn

On Tue, 22 Dec 2015, Pádraig Brady wrote:


On 22/12/15 17:00, Mike Gerwitz wrote:

There is ongoing discussion about reproducible builds within GNU.  I'm
having trouble figuring out the best approach for deterministic
distribution archives using Automake.


I've not thought much about this, but I'm
wondering about how useful deterministic tarballs are?

The main thrust of reproducible builds is to verify what's
running on the system, and there are so many variables
between the tarball and build, that I'm not sure it's
worth worrying about non determinism in the intermediate steps?

Perhaps the main focus for tarballs should just to
ensure they're properly signed.


I would agree that it is the extracted binary contents of the tarballs 
(ignoring artifacts like file timestamps and user ids) which counts. 
Attempting to get archiving tools to produce the same results at 
different times on different machines is close to impossible.
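One way to act on that observation is to compare the extracted contents rather than the archives themselves. The following is an illustrative recipe (it assumes `tar` and `sha256sum` are available, and the file names are made up):

```shell
cd "$(mktemp -d)"
mkdir a b
echo hello > a/f
sleep 1
echo hello > b/f          # same content, different mtime

tar cf a.tar -C a .
tar cf b.tar -C b .
cmp -s a.tar b.tar || echo "archives differ"   # the embedded mtimes make the tars unequal

# Compare what actually matters: the extracted file contents.
for t in a b; do
  mkdir "x$t"
  tar xf "$t.tar" -C "x$t"
  (cd "x$t" && find . -type f -exec sha256sum {} + | sort) > "$t.sums"
done
cmp -s a.sums b.sums && echo "contents identical"
```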


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/

Re: Bad directive

2015-10-12 Thread Bob Friesenhahn

On Mon, 12 Oct 2015, G wrote:


Hi,

I'm a newcomer with automake, "thanks" to the following error I have in a 
Makefile.in file:

@AMDEP_TRUE@@am__include@ 
@am__quote@progsrc/plugspecial/$(DEPDIR)/SPECIALExtract-SPECIALExtract.Po@am__quote@


The path between the quotes is correct, why this error, just marked in Eclipse 
Kepler/MinGW/Windows 7/32 bit as "Bad directive"?


The Makefile.in file produced by Automake does not use valid 'make' 
syntax.  The Makefile.in file is converted by configure/config.status 
to produce a Makefile which does use valid 'make' syntax.  Unless 
there is a problem, you should ignore the content of Makefile.in since 
it is an intermediate file.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: What is minimum set of Automake work files needed for distribution on github?

2015-09-28 Thread Bob Friesenhahn

On Mon, 28 Sep 2015, John Calcote wrote:


Thus, the usual technique is to commit the Autotools source files required
by your project, but to NOT commit a configure script. Anyone wanting to
clone your repository is expected to be "developer enough" to know how to
run "autoreconf -i" to create the configure script.


The main problem with this is that not all users wanting to build from 
sources in the repository have the correct autotools versions.  Even 
if they do have the correct autotools versions, the sources may have 
been hacked by an OS Autotools package maintainer to work differently.


Due to these problems, the configure script for my package provides 
the --enable-maintainer-mode option, and I use a method of updating 
the package version which does not require editing configure.ac 
(avoiding re-generating configure).  Whenever someone commits to the 
repository, I rebuild all of the additional generated files whose 
updating is enabled via --enable-maintainer-mode.
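One sketch of how a version can be kept out of configure.ac (this is a generic idiom, not necessarily the scheme GraphicsMagick actually uses, and the file name version.txt is made up):

```
dnl configure.ac -- read the detailed release version from a file when
dnl ./configure runs, so bumping it never edits configure.ac and never
dnl forces configure itself to be regenerated.
AC_INIT([mypackage], [0.0])
PACKAGE_RELEASE_VERSION=`cat "$srcdir/version.txt"`
AC_SUBST([PACKAGE_RELEASE_VERSION])
```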


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: LDADD doesn't work as expected

2015-09-21 Thread Bob Friesenhahn

On Mon, 21 Sep 2015, Arthur Schwarz wrote:




I'm trying to use LDADD to reference libgslip.a in another directory. After
building libgslip.a with make, I issue a make check and get the message:

*** No rule to make target '../libgslip/libgslip.a/', needed by
'SlipTest.exe'.  Stop.

The test directory is entered and the Makefile in the test directory is
executed, so the path would seem to be correct but I am missing something. I
have read Section 8.1.2 Linking the program and I think I'm following the
example given, but alas, it just doesn't work.

I have tried to use LDADD and prog_LDADD with the same effect.


Did you mean to use LIBADD?

It may be wise to construct full paths based on Automake-provided 
variables like $(top_builddir) rather than using relative paths.
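For illustration, a test/Makefile.am fragment along those lines, using the directory and target names from this thread (adjust to the real layout):

```
# Link against the static library built in libgslip/, addressed from the
# top of the build tree rather than with a relative '..' path.
check_PROGRAMS = SlipTest
SlipTest_SOURCES = SlipTest.cpp
SlipTest_LDADD = $(top_builddir)/libgslip/libgslip.a
```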


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: include verbatim or (escape %D%) possible?

2015-07-25 Thread Bob Friesenhahn

On Fri, 24 Jul 2015, Thomas Martitz wrote:


So what I'm asking is: Can I somehow raw/verbatim include into 
plugin1/Makefile.am so that %D% refers to plugin1? Or is there some way to 
escape %D% once? Or is there another way to have README refer to the 
plugin1/README file by using the convinience .mk file?


Given that everything ends up in one Makefile, I don't think that 
including any given .mk file more than once is going to work at all. 
The strategy of including common .mk files into a per-directory 
Makefile is to support recursive build schemes.


Nothing prevents you from using a script which builds specialized .mk 
files for each subdirectory.  A template file can be used with 
substitutions performed on it to localize it for the subdirectory. 
If there are a great many similar subdirectories, the same script can 
write a .mk file with include statements to also include the many 
subdirectory fragments.
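Such a generator can be as small as a sed substitution over a template. A runnable sketch with made-up names (fragment.mk.in, @SUBDIR@, plugin1/plugin2):

```shell
cd "$(mktemp -d)"

# Template with a placeholder for the subdirectory name.
cat > fragment.mk.in <<'EOF'
@SUBDIR@_README = @SUBDIR@/README
EOF

# Produce one localized .mk per subdirectory, plus a master include list.
: > plugins.mk
for d in plugin1 plugin2; do
  sed "s|@SUBDIR@|$d|g" fragment.mk.in > "$d.mk"
  echo "include $d.mk" >> plugins.mk
done
cat plugin1.mk plugins.mk
```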


When I converted my project to be non-recursive, it was before 
Automake included any special support for includes in non-recursive 
builds.  I hard-coded the sub-directory offset inside each included 
.mk file via a definition and then built all needed definitions based 
on that.  I did not even trust += syntax (due to lack of control over 
order) so everything is explicit.  This was ok since there were not 
hundreds of subdirectories.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: 回复: Re: why forbidding include a sub-makefile.am with absolute path

2015-06-30 Thread Bob Friesenhahn

On Tue, 30 Jun 2015, 远猷 wrote:


thanks for your reply!

but why forbidding “include” a sub-makefiles.am with absolute path?


Automake projects are expected to be self-contained and not refer to 
anything outside of the package.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/

Re: Integration of a Perl XS module with an Automake build system

2015-06-30 Thread Bob Friesenhahn

On Tue, 30 Jun 2015, Gavin Smith wrote:


Hello all,

I wonder if anyone could give pointers to advice on how to integrate a
Perl XS module with an Automake build system? Over at the Texinfo
project we're starting to replace parts of makeinfo with loadable
native-code modules, for speed.

The problem is that the Makefiles produced by ExtUtils::MakeMaker (the
module used by the XS tutorial, man perlxstut) doesn't follow the
standards required by Automake. (Info node (automake)Third-Party
Makefiles, and the GNU Coding Standards). For example, make clean
actually removes the Makefile.


In addition to behavior issues, a major problem is that the re-link 
capabilities offered by libtool are not available.  If one links from 
the Perl module to a shared library in the build tree, then there is 
the risk of continued reference to the build tree after installation 
(a security/reliability issue) and the module might not even run. 
The Perl test suite is likely to use the wrong shared libraries while 
testing due to GNU/Linux's strong preference for using 
already-installed libraries cached by 'ldconfig'.  Due to this, I 
changed GraphicsMagick so that it only links its Perl module at 'make 
install' time, and it uses the already-installed shared library.  This 
still does not help with DESTDIR installs.


After I made this change, some GNU/Linux distributions stopped 
building the GraphicsMagick Perl module at all due to claiming that 
the build is broken.  They expect that everything builds in tree in 
one step and that 'make check' works as one would expect.  I don't 
think that this is possible to accomplish in a portable way using 
ExtUtils::MakeMaker.


If there is a viable solution which avoids ExtUtils::MakeMaker and 
allows Automake to drive behavior, with linkage managed by libtool, 
then I am all ears.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: Fwd: installation of automake, autoconf, m4 error

2015-06-30 Thread Bob Friesenhahn

On Tue, 30 Jun 2015, Krishnan Rajiyah wrote:


However, the problem I had with m4 make was not with perl, it said that
automake-1.14 was missing from the system. This does not make sense since
that would mean that there would be a cycle in the dependency tree.


Most automake projects know how to automatically regenerate the 
autoconf/automake parts if a file has changed.  If the Makefile is 
trying to regenerate autotools parts, then a file has been edited or a 
file timestamp is incorrect.  If a network filesystem like NFS is 
involved, then make sure that the client and server agree on the time 
(such as by using NTP).  If you checked the files into a version 
control system and are using checked-out files, then it may be that 
the version control system did not preserve the time-stamps.
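In the checked-out-files case, a common workaround is to refresh timestamps in dependency order so that make sees nothing out of date. A generic recipe (adjust the file list to the project; the initial touch merely creates stand-in files for this demonstration):

```shell
cd "$(mktemp -d)"

# Stand-in files for a freshly checked-out tree; a real checkout would
# already contain them (with whatever timestamps the VCS assigned).
touch configure.ac Makefile.am aclocal.m4 configure config.h.in Makefile.in

# Refresh in dependency order -- inputs first, generated outputs last --
# so no generated file looks older than its sources:
touch configure.ac Makefile.am
touch aclocal.m4
touch configure config.h.in
touch Makefile.in
```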



Can you please tell me what I am doing wrong? Is your software even
compatible with a NetBSD-4.0.1-x68k system?


I doubt that NetBSD is excluded.

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: Integration of a Perl XS module with an Automake build system

2015-06-30 Thread Bob Friesenhahn

On Tue, 30 Jun 2015, Gavin Smith wrote:


as well as a heap of other variable definitions copied from the
MakeMaker Makefile. Now I have to see how ExtUtils:MakeMaker knew what
flags to use to see if I can replicate it.


These definitely come from how Perl was originally compiled and built. 
If a different compiler was used, the provided options may be 
completely different and not work with the compiler you are currently 
using (even if the generated code would interoperate).


It would be good to find a way to tease these parameters out of the 
Perl installation.


Something else which changes is the identity of the target, which 
determines where build products go in the build tree and when 
installed.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



RE: What is a 'program' in Section 15.3.3.1?

2015-05-27 Thread Bob Friesenhahn

On Wed, 27 May 2015, Arthur Schwarz wrote:


TAP is supposed to be a custom test driver. Is the interface and manner of
calling different from other custom drivers, and the API and other comments
describing custom drivers in the manual?


I think that the method of calling is similar to other types of tests 
except that the list of tests 'TESTS' is a list of scripts which 
produce TAP output strings.



It depends on subordinate
scripts printing


I'm very confused here. It looks to me like tap-driver.sh is a standalone
script and doesn't need any help scripts. The input data is processed in awk
and all the needed functions are defined in this context.


If the test program were a C binary that printed 'ok' and 'not ok' 
outputs, and it were able to find any input files without 
assistance, then no extra shim script should be required.


In my case I was replacing perhaps 1000 individual classic Automake 
test scripts and wanted to replace them with far fewer TAP scripts.



I am confused. Using Automake the Developer can generate a reference to a
class of test cases defined in the TESTS variable. Each one of the scripts
is required to output the results of one of more tests that they run in
quasi-TAP format. The TAP script, tap-driver.sh, takes the output and
generates a trs file. Included with the tap-driver.sh there is a means to
generate XFAIL and XPASS, however this seems to be global to all subtests
executed in a TESTS defined test. Each ok or not ok returned will be
translated (as required) to XPASS or XFAIL depending on
--expect-failure=yes.


A TAP test program/script may contain many tests and it may output 
multiple test lines.  The script indicates how many tests it plans to 
run by first printing a line like


1..532

to indicate that it plans to run 532 tests.  Each test gets its own 
PASS, FAIL, XFAIL indication.
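A minimal TAP producer, then, is just a script that prints its plan followed by one numbered result line per test. A generic, self-contained sketch (the test contents are made up):

```shell
cd "$(mktemp -d)"

# A self-contained TAP test script: plan first, then one result per test.
cat > demo.tap <<'EOF'
#!/bin/sh
echo "1..3"

[ "$(echo hello)" = "hello" ] \
  && echo "ok 1 - echo produces its argument" \
  || echo "not ok 1 - echo produces its argument"

[ -n "" ] \
  && echo "ok 2 - empty string is non-empty" \
  || echo "not ok 2 - empty string is non-empty"

# A '# SKIP' directive tells the harness not to count this as a failure.
echo "ok 3 # SKIP requires foo support"
EOF
chmod +x demo.tap

./demo.tap | tee tap.out
```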




As an example:
TEST_EXTENSIONS =.abc
ABC_LOG_DRIVER = tap-driver.sh
TESTS = test1.abc test2.abc

Means that tap-driver.sh is called twice, once for test1.abc, once for
test2.abc. Each of these tests can return one or more of ok, not ok, skip,
bail, or a YAML-like text string defined by --diagnostic-string and
--comments.


Yes, but tap-driver.sh interprets the output while the test 
program/script runs, and it can handle an arbitrary number of 
tests/results in the same test script.



Anyhow, thanks for the heads up, I will be looking at your files. And, as
always, I am confused.


I am not surprised. :-)

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: What is a 'program' in Section 15.3.3.1?

2015-05-27 Thread Bob Friesenhahn

On Wed, 27 May 2015, Arthur Schwarz wrote:


In looking at tap-driver.sh there doesn't appear to be a place where a
'program' is accepted on the input command line. It appears that after all
options are read if the input command line '$#' is not zero then an error is
declared. So, is the TAP interface different from other Custom Test Driver
interfaces?


The TAP interface is very different.  It depends on subordinate 
scripts printing 'ok' or 'not ok' (plus some optional extra diagnostic 
info) rather than return codes.  The subordinate scripts take full 
responsibility for the set of tests to be performed, what programs 
are invoked, and the options supplied to those programs.


For GraphicsMagick, I created a script which is sourced by actual test 
scripts (.tap scripts), which provides generic functions used by 
those scripts.  The functions support pass/fail detection based on 
program exit status, but also support 'xfail' by being passed a set of 
feature id strings which are matched against the actual available 
features.  If the test fails, but a necessary feature is missing, then 
an extra diagnosis is added which Automake's TAP script will turn into 
an informative XFAIL message:


   not ok # SKIP requires foo support

See the implementation at

http://hg.code.sf.net/p/graphicsmagick/code/file/b84df92f710a/scripts/tap-functions.shi

and

http://hg.code.sf.net/p/graphicsmagick/code/file/b84df92f710a/tests/common.shi

and

http://hg.code.sf.net/p/graphicsmagick/code/file/b84df92f710a/tests/rwfile.tap
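A simplified, self-contained sketch in the spirit of that sourced helper (this is NOT the actual tap-functions.shi; the feature names and function are made up for illustration):

```shell
cd "$(mktemp -d)"

# Pretend feature list; a real script would detect these from the build.
features="png jpeg"

have_feature() {
  case " $features " in *" $1 "*) return 0 ;; *) return 1 ;; esac
}

test_count=0
# check_result <description> <required-feature> <command> [args...]
# Reports pass/fail from the command's exit status, and downgrades a
# failure to SKIP when the required feature is absent.
check_result() {
  desc=$1; need=$2; shift 2
  test_count=$((test_count + 1))
  if "$@"; then
    echo "ok $test_count - $desc"
  elif ! have_feature "$need"; then
    echo "not ok $test_count # SKIP requires $need support"
  else
    echo "not ok $test_count - $desc"
  fi
}

{
  echo "1..2"
  check_result "true always succeeds" png true
  check_result "tiff feature is missing" tiff false
} | tee results.out
```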

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: how should package config files be installed

2015-03-31 Thread Bob Friesenhahn

On Tue, 31 Mar 2015, Andy Falanga (afalanga) wrote:

I want to distribute the correct *.pc files with the libraries I'm 
building.  While it's a simple matter to find the format of this 
file (by examining existing ones or online), I'd like to know if 
there's a preferred procedure for generating these files.  What 
should be done in order to make this work?


I am not sure what the correct procedure is, but the packages I maintain 
(and many others) use autoconf to configure a templated foo.pc.in 
and then install it using Automake.
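The usual shape of that arrangement (generic names, not from any specific package): a foo.pc.in template is listed in AC_CONFIG_FILES so that configure fills in the directories, and Makefile.am installs the result.

```
# foo.pc.in
prefix=@prefix@
exec_prefix=@exec_prefix@
libdir=@libdir@
includedir=@includedir@

Name: foo
Description: Example library
Version: @PACKAGE_VERSION@
Libs: -L${libdir} -lfoo
Cflags: -I${includedir}
```

In configure.ac, add AC_CONFIG_FILES([foo.pc]); in Makefile.am, install it with pkgconfigdir = $(libdir)/pkgconfig and pkgconfig_DATA = foo.pc.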


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



RE: Automake (test) editing

2015-03-31 Thread Bob Friesenhahn

On Tue, 31 Mar 2015, Arthur Schwarz wrote:


Editorial comments on TAP


I successfully used the TAP documentation to add TAP test support to 
my own project.


Did you intend to not include a patch to fix the issues you noticed?

I think that .trs files must be a project-specific test script 
extension.  All the TAP script needs to do is to print the expected 
number of 'ok' and 'not ok' messages.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/



Re: converting to subdir-objects

2015-03-08 Thread Bob Friesenhahn

On Sun, 8 Mar 2015, Bert Wesarg wrote:


So if A/Makefile.am contains:

foo_SRCS = ... $(srcdir)/../foo/bar.c



As far as I am aware, it is wise/required that source paths be subordinate
to the Makefile.am location (no .. bits).


If that would really be a requirement now or with subdir-objects, I
would say its a regression.


I checked the current Automake manual and am not able to find any text 
which says that a subdirectory needs to be a subdirectory of where the 
Makefile resides.  What is a subdirectory anyway?  The generated 
Makefile would include per-source-file target specifications outside 
of the directory where the Makefile resides.


Perhaps the ability to use this successfully depends on the 
implementation of the VPATH and target resolution support in the make 
program (and possibly make program version) that your project is 
specifically targeting.


I can not imagine 'make dist' or 'make distcheck' working properly if 
source files are not subordinate to the directory containing Makefile. 
How is 'tar' supposed to work?


Harlan's posting here was due to Automake not working as he expected 
for files which are not in a subdirectory, so maybe it does not work.


Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,http://www.GraphicsMagick.org/


