Re: automake variable prefix 'check_'

2021-02-02 Thread John Calcote
On Tue, Feb 2, 2021 at 11:46 AM Zack Weinberg  wrote:

> On Tue, Feb 2, 2021 at 1:24 PM DUDZIAK Krzysztof
>  wrote:
> > As one can't find string "distcheck" in GCS
>
> By GCS do you mean the GNU Coding Standards?
>
> > it looks like it wasn't GCS
> > which constitutes support and usage of `distcheck' target.
> > Maybe it is POSIX, or UNIX.
>
> As far as I know, the distcheck target was invented by automake. It
> probably should be added to the GNU Coding Standards, if it's not
> there already, but I don't know if anyone is working on that document
> anymore.
>
> > I wonder if recipe to build `distcheck' target
> > as compiled by automake gets same form as John Calcote
> > describes it in chapter 3 his Practitioner's Guide (2nd ed.).
>
> I have not read this book. Since it was published ten years ago,
> anything it describes may well be out of date.  I don't think
> distcheck has changed much in that time, but I could be wrong.
>
>
The second edition was published in 2019, so more like a year and a half
ago. It's pretty up to date even with respect to 2.70, as I tried to account
for changes that had been made in the repository but had yet to be published.

John


Re: PLV, PSV in manual

2021-02-02 Thread John Calcote
Zack,

These are terms I made up when I wrote the No Starch Press Autotools book -
the manual had no existing name for these concepts, so I just contrived
names for them.
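
For reference, a minimal sketch of what the book means by the two terms
(the constructs are standard Automake; only the names are contrived):

   bin_PROGRAMS = myprog      # a "product list variable" (PLV)
   myprog_SOURCES = main.c    # a "product source variable" (PSV)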

John

On Tue, Feb 2, 2021 at 11:39 AM Zack Weinberg  wrote:

> On Tue, Feb 2, 2021 at 1:19 PM DUDZIAK Krzysztof
>  wrote:
> > Isn't it that strange if for manual* following searches for a string
> result in no matches?
> > search for "PLV"
> > search for "PSV"
> > search for "Product List Variable"
> > search for "Product Source Variable"
>
> I've never heard these terms before and don't know what they mean. Can
> you please explain what these terms mean and what you expected the
> automake manual to have to say about them?
>
> zw
>
>


Re: Future plans for Autotools

2021-01-21 Thread John Calcote
Zack,

On Thu, Jan 21, 2021 at 9:12 AM Zack Weinberg  wrote:

> On Wed, Jan 20, 2021 at 5:15 PM Zack Weinberg  wrote:
> > Now we've all had a while to recover from the long-awaited Autoconf
> > 2.70 release, I'd like to start a conversation about where the
> > Autotools in general might be going in the future.
>
>
> Thanks for all the great thoughtful responses that came in overnight!
> I see many people expressing similar sentiments, so rather than
> respond to each of you individually I'm going to summarize and reply
> to each strand in this one message.
>
>
I like the way you think. But, as your response so clearly indicates,
there's actually very little we can change in a radical way:

1. We shouldn't change the underlying language(s) and tools because one of
the key strengths of the Autotools is (their products') ability to run
everywhere without off-box dependencies. In my humble opinion, nothing the
GNU project has ever done comes close to the value they created by
designing the Autotools to work as seamlessly (for the end-user) as they
do. Download a tarball, unpack, ./configure && make. How much simpler could
it be? No additional tools to install - everything required is already
there - for the end user. Of course, developers may need to install a few
additional tools to build these tarballs from the base source repository,
but developers are not end-users and those are the "customers" GNU was
trying to appease. If you build from source repositories, you're
effectively a developer, not an end-user - not always the case, but 98% of
the time this is true.

Additionally, making such changes would require long man-hours of unpaid
effort - something the world is not willing to give. Gone are the days of
university grad students who know enough of what they're talking about to
effectively take on this task for the sake of their educational goals (this
is the way these tools started, after all). In fact, as much as they'd hate
to admit it, the GNU project has become much more corporate than
educational today. But let's not get bent out of shape by it - the truth of
the matter is that no one does anything for free (you yourself solicited
funding for the 2.70 update - which did NOT ultimately come from
philanthropic individuals).

2. The Autotools are actually more transparent than any other build tools
out there. All those other tools (cmake, maven, etc.) purport to be so much
simpler because they insulate the user from the underlying details of the
build process - but their primary failure is that this very insulation keeps
users from being able to make the changes they need to accomplish their
unique project-specific build goals.

Anyone who has nothing but good things to say about this aspect of cmake,
maven, gradle, or whatever, has simply not worked on a project that
requires them to move far enough away from the defaults. I've used them all
and I've spent hours in frustration trying to determine how to work around
the shortcomings of some "do-all" (except what I want) tool function. This
is simply not an issue with the Autotools. As someone mentioned earlier in
this thread, you can drop shell script into a configure.ac file, and make
script into a Makefile.am file. That is the very definition of
transparency. No other tool in existence allows this level of flexibility.
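
A trivial sketch of what that means in practice (hypothetical names):

   # configure.ac - plain shell runs as-is at configure time:
   if test -z "$JAVA_HOME"; then
     AC_MSG_WARN([JAVA_HOME is not set])
   fi

   # Makefile.am - a rule automake doesn't recognize is copied
   # verbatim into the generated Makefile.in:
   version.h: Makefile
           echo '#define VERSION "$(VERSION)"' > $@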

The most interesting feedback I read in the responses was that the
Autotools had an appalling lack of documentation which - ironically - is
actually not true. At the risk of sounding self-serving, I'll say this: in
the research I did for my book (Autotools, No Starch Press, 2019), I
primarily (95%) used the GNU Autotools documentation as reference sources.
The information is all there and actually very concise - too concise
sometimes. The problem with it is twofold.

First, the documentation for GNU projects is most often written by the
author of the code - that's the open source way, after all. This brings
with it all sorts of issues that most corporate documentation
(inadvertently) bypasses through standard corporate siloing - that is,
companies relegate doc writing to doc writers, who are often not experts
in the code, so they have to spend the time researching the product from a
user's perspective. That is exactly the perfect storm of events that
(usually) generates pretty good documentation.

Second, the documentation is written in a manner that is not conducive to
fast learning. There are two types of instructive material. The first is
reference material that, by its very nature, requires the user to read and
absorb significant amounts of information until they hit a critical mass of
understanding - an exponential learning curve that starts out long and slow
- at which point they avalanche into the steeper portion of the curve. 

Re: Stopping unit test on failed partial ordering dependency

2019-04-22 Thread John Calcote
On Mon, Apr 22, 2019 at 10:12 PM Kip Warner  wrote:

> How can I solve this problem?


Try using a stamp file only created on success. To do this you’ll need to
wrap your test calls in a script that creates the stamp file on success.

John

>


Re: Makefile.in, LIBTOOL and shared/static builds.

2018-06-23 Thread John Calcote
On Sat, Jun 23, 2018 at 1:00 AM or...@fredslev.dk  wrote:

> Hi,
>
> I am using the GNU libtool alternative slibtool which has some benefits
> such as a smaller code base, actively maintained and a huge performance
> boost.


I’m curious - it’s neat that slibtool exists, but if you need functionality
offered by libtool then why not just use libtool?

John

>


Re: [GSoC] Proposal for "Parse Makefile.am using an AST"

2018-03-04 Thread John Calcote
Hi Matthias,

If you have any suggestions on documents I can read or software I can check
> to
> prepare for this project I'll be glad to check them. I know texinfo is
> written
> in Perl and generates an AST so I'll check that.
>

A Makefile.am file is really just a Makefile with embellishments. It seems
like your AST would have to incorporate most of make's syntax to work
correctly.

The reason Perl was chosen to begin with is because of its great text
processing capabilities as, ultimately, all automake really does is copy
the file directly to the output Makefile.in file, filtering out automake
stuff along the way and injecting make snippets generated from the automake
constructs.

This may not appear obvious at first because many simpler Makefile.am files
contain only automake stuff. But anything found in the Makefile.am file
that automake doesn’t recognize is assumed to be proper make script and
copied directly to the output file.
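
A tiny sketch of that mix (hypothetical names):

   bin_PROGRAMS = frob          # automake construct - expanded into rules
   frob_SOURCES = frob.c

   tags: $(frob_SOURCES)        # plain make - passed through verbatim
           ctags $(frob_SOURCES)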

I suggest making your AST handle non-automake chunks as a specific token
type designed to be passed through without modification.

Just a few thoughts for you to consider.

Kind regards,

John Calcote


Re: What is minimum set of Automake work files needed for distribution on github?

2015-09-28 Thread John Calcote
Hi Robert,

The Autotools were created to meet a specific need - that of the open
source distribution model supported by many open source projects where,
occasionally - or perhaps nightly - the project maintainers would release a
source tarball containing a configure script and Makefile.in files. As a
regular user, you'd want to just download a tarball, extract, and run
./configure && make.

However, as a potential contributor, you'd want the source repository so
you could create patches against the tip of a particular branch. So you'd
clone the source repository and use the Autotools to create a configure
script for yourself in your repository work area.

Thus, the usual technique is to commit the Autotools source files required
by your project, but to NOT commit a configure script. Anyone wanting to
clone your repository is expected to be "developer enough" to know how to
run "autoreconf -i" to create the configure script.

While you CAN commit a configure script, it generally causes more problems
than it solves, as you find yourself committing an updated configure script
for lots of little project changes. Regardless, many projects do this -
especially lately on projects hosted by github, mainly (I believe) because
github has defined a new trend in the open source world where "regular
users" tend to get the source from the repository rather than from a
tarball, more often than not these days.

Here's a resource you might find helpful:
http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool


John


On Mon, Sep 28, 2015 at 4:20 AM, Robert Parker  wrote:

> I need to meet the requirements of 2 sets of users, the ordinary user who
> is only interested in `./configure; make; make install` and the power users
> who want to start with `autoreconf`.
>
> So far google search on the topic has only increased my confusion.
>
> --
> The Bundys, Cliven, Ted and Al. Great guys to look up to.
>


RE: The right way to use standard variable in configure.ac

2015-04-02 Thread John Calcote
Did you try:

CPPFLAGS="$CPPFLAGS -DMACR..."

?
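
That is, append in configure.ac rather than assign, so whatever the user
passes in survives - or better still, leave CPPFLAGS to the user entirely
and put project-wide flags in Makefile.am. A sketch:

   # configure.ac - preserve the user's value:
   CPPFLAGS="$CPPFLAGS -DMACRO1 -DMACRO2"

   # or Makefile.am - automake keeps AM_CPPFLAGS separate from CPPFLAGS:
   AM_CPPFLAGS = -DMACRO1 -DMACRO2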

John


Sent via the Samsung GALAXY S® 5, an AT&T 4G LTE smartphone


 Original message 
From: Andy Falanga (afalanga) afala...@micron.com 
Date: 04/02/2015 5:04 PM (GMT-07:00)
To: automake@gnu.org 
Subject: The right way to use standard variable in configure.ac 

Hi,

I placed the following in my configure.ac file:

CPPFLAGS="-DMACRO1 -DMACRO2"

because I found it as an example on a webpage someplace.  I reviewed so many
while learning about the autotools that I don't recall which one now.  I did this
because there were some preprocessor flags that I wanted to have common to all 
sub-makefiles.

I ran into a problem today, however, where I needed to add a -I directive to my 
CPPFLAGS in order to find the necessary headers.  Then, the problem stepped in. 
 Because I placed this line in configure.ac, it was transcribed, verbatim (as 
it should be), into configure.  The net result: whatever I placed on the
command line was ignored.  For example, the following:

CPPFLAGS=-I/path/to/the/alternate/location ./configure --build=x86_64-linux 
--host=arm-poky-linux-gnueabi

The additional path wasn't being appended, it was being stomped.  Did I miss a 
macro?  How should this be done because, obviously, I've gotten it incorrect.

Thanks,
Andy


Re: Wrong link option given to g++

2015-03-12 Thread John Calcote

Hi Arthur,

Look carefully at your configure.ac file or any scripts calling your 
makefile. See if you find anyone setting CFLAGS, CPPFLAGS, etc. on the
way in. They may be adding -c (erroneously) to these variables.


To answer your question, the TESTS macro should be completely independent
of the check programs being built - it merely specifies which
scripts/programs to run when tests are actually executed.
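
A sketch of that relationship (hypothetical names):

   check_PROGRAMS = slip_test    # built by "make check"...
   slip_test_SOURCES = main.cpp
   TESTS = slip_test             # ...and this merely says "run it"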


Regards,
John

On 3/12/2015 4:23 PM, Arthur Schwarz wrote:

Win7-64 bit
Cygwin 64-bit
g++ 4.9.2

I'm trying to link test program. The linker command option given to g++
during 'make check' says don't link. Any way around this?


check_PROGRAMS = test
test_INCLUDE   = -I$(top_srcdir)/src
test_SOURCES   = $(testCPP) $(testHead)
test_LDADD = libslip.a

'make check' contains:
g++ -std=gnu++11 -Wall -Wno-reorder -Wno-unused-value -Wno-address
-Wno-sequence-point -Wmaybe-uninitialized -c -g  -o test.exe Test.o
TestErrors.o TestGlobal.o TestHeader.o TestIO.o TestMisc.o TestOperators.o
TestReader.o TestReplace.o TestSequencer.o TestUtilities.o  libslip.a

which works when the -c option is removed.

g++ --help says:  -c  Compile and assemble, but do not link


Is the linker command supposed to be given in a test script in the
Makefile.am file (TEST=script)?


The failure of the past is the challenge of the present and the success of
the future.








Re: problem with subdir-objects and not found .Plo files when migrating to 1.14

2013-09-01 Thread John Calcote
Sergey,

I'm curious as to why it's important to you that build products not land in
the source tree, especially in light of the fact that you're clearly aware
of automake's support for out-of-tree builds. Out-of-tree builds exist to
solve the very problem you're trying so hard to fix.

Be aware that you're kicking against the pricks (as the old saying goes).
Sooner or later you'll run into other issues with new versions of automake
that may not have such simple resolutions.

Regards,
John
On Sep 1, 2013 11:53 AM, Sergey 'Jin' Bostandzhyan j...@mediatomb.cc
wrote:

 Hi,

 OK, never mind, problem solved. It seems that $(top_srcdir) simply did not
 expand anymore in _SOURCES. Keeping my structure with the build/Makefile.am
 but replacing $(top_srcdir) with '..' did the trick, it works
 like a charm now, including in and out of tree builds.

 No more warnings, no more not found .Po files, and I get my binaries and
 libraries nicely in the build directory without polluting the source tree.

 Kind regards,
 Jin

 On Sun, Sep 01, 2013 at 06:45:32PM +0200, Sergey 'Jin' Bostandzhyan wrote:
  Hi,
 
  thanks for your reply, some more questions though:
 
  On Sun, Sep 01, 2013 at 03:08:37PM +0100, Diego Elio Pettenò wrote:
   Is it possible to keep the logic with the in-tree build directory
 with
   automake 1.14? I did try to move all the logic from
 build/Makefile.am into
   the top level Makefile.am and removing build/Makefile.am
 completely, but
   it does not help - .Plo files are not found.
  
  
   I'd say it's a very bad idea to use that build/Makefile.am.
 
  Could you please elaborate? I'd be interested in the technical details
 on why
  it is a bad idea?
 
   Move the includes on the top-level Makefile.am, and get rid of
 $(top_srcdir) on
   all the _SOURCES declaration and it should work fine.
 
  It does compile now, and it does dump all the .o and .lo and what not
  in the same directory as the sources - very ugly. This is exactly what I
 was
  avoiding with the separate build directory and it worked just perfectly
  until automake 1.14 came along.
 
  Is there any way to tell 1.14 to place the object files into some
 dedicated
  directory without doing an actual out of tree build, or in other words,
  can I achieve the same setup that I had on 1.14 somehow?
 
  Kind regards,
  Jin
 




Re: problem with subdir-objects and not found .Plo files when migrating to 1.14

2013-09-01 Thread John Calcote
Don't get me wrong - I have nothing against your approach. And the automake
maintainers are certainly not purposely trying to make things more
difficult for you. I merely suggest that you may run into more issues down
the road simply because supporting your setup is not a current goal of the
tool.

Indeed, I've found these guys to be quite amenable to adding new build
paradigms to automake's repertoire.

I'm glad you found a solution that works.
On Sep 1, 2013 3:36 PM, Sergey 'Jin' Bostandzhyan j...@mediatomb.cc
wrote:

 John,

 On Sun, Sep 01, 2013 at 03:11:11PM -0600, John Calcote wrote:
  I'm curious as to why it's important to you that build products not land
 in the
  source tree, especially in light of the fact that you're clearly aware of
  automake's support for out-of-tree builds. Out-of-tree builds exist to
 solve
  the very problem you're trying so hard to fix.

 well, consider the following: your project has several source
 subdirectories,
 some of them with two levels. Even with out-of-tree builds you end up
 having the produced libraries and executables in each of those
 subdirectories
 respectively, or in other words: all over the place. Sure, you can do a
 make install to get all things together, but that's not always practical
 during development.

 My setup dumps all the compiled stuff into one directory, which makes it
 really easy to find, it's just more convenient.

 Honestly, if you have a choice, do you really prefer having the binaries
 all
 in different places in your tree?

 Also, I don't have to go out of the project dir when I want to make
 which I would have to do if I configured out of tree.

 What's wrong with that approach? People who use my setup seem to like it,
 as I said, it's convenience, no matter if used with in or out of tree
 builds.

  Be aware that you're kicking against the pricks (as the old saying goes).
  Sooner or later you'll run into other issues with new versions of
 automake that
  may not have such simple resolutions.

 I wonder why the authors of automake would try to restrict different and
 actually valid usage scenarios? I've been using this setup for over 5 years
 in different projects, I'd be really disappointed if I had to switch to a
 setup that is much more inconvenient for me.

 Please don't become another Gnome 3 by enforcing weird restrictions upon
 your
 users ;) Or is there really a hard technical limitation that would make
 setups as above impossible? I can't believe that... so I hope I will have
 the freedom of choice, also with newer versions of automake.

 Kind regards,
 Jin


  On Sep 1, 2013 11:53 AM, Sergey 'Jin' Bostandzhyan j...@mediatomb.cc
 wrote:
 
  Hi,
 
  OK, never mind, problem solved. It seems that $(top_srcdir) simply
 did not
  expand anymore in _SOURCES. Keeping my structure with the
 build/Makefile.am
  but replacing $(top_srcdir) with '..' did the trick, it works
  like a charm now, including in and out of tree builds.
 
  No more warnings, no more not found .Po files, and I get my binaries
 and
  libraries nicely in the build directory without polluting the source
 tree.
 
  Kind regards,
  Jin
 
  On Sun, Sep 01, 2013 at 06:45:32PM +0200, Sergey 'Jin' Bostandzhyan
 wrote:
   Hi,
  
   thanks for your reply, some more questions though:
  
    On Sun, Sep 01, 2013 at 03:08:37PM +0100, Diego Elio Pettenò wrote:
Is it possible to keep the logic with the in-tree build
 directory
  with
automake 1.14? I did try to move all the logic from build/
  Makefile.am into
the top level Makefile.am and removing build/Makefile.am
  completely, but
it does not help - .Plo files are not found.
   
   
I'd say it's a very bad idea to use that build/Makefile.am.
  
   Could you please elaborate? I'd be interested in the technical
 details on
  why
   it is a bad idea?
  
Move the includes on the top-level Makefile.am, and get rid of $
  (top_srcdir) on
all the _SOURCES declaration and it should work fine.
  
   It does compile now, and it does dump all the .o and .lo and what
 not
   in the same directory as the sources - very ugly. This is exactly
 what I
  was
   avoiding with the separate build directory and it worked just
 perfectly
   until automake 1.14 came along.
  
   Is there any way to tell 1.14 to place the object files into some
  dedicated
   directory without doing an actual out of tree build, or in other
 words,
   can I achieve the same setup that I had on 1.14 somehow?
  
   Kind regards,
   Jin
  
 
 



RE: Using convenience libraries with non-recursive make

2012-08-15 Thread John Calcote
Hi Del,

First, if you're building a pure convenience library (you don't want shared
objects), then save yourself a layer of complexity by removing libtool from
the equation - just use LIBRARIES instead of LTLIBRARIES in your Makefile.am
file. Second, make sure all your relative paths are correct - that's often
the problem with errors like this.

Regards,
John

 -Original Message-
 From: automake-bounces+john.calcote=gmail@gnu.org
 [mailto:automake-bounces+john.calcote=gmail@gnu.org] On Behalf Of
 Del Merritt
 Sent: Wednesday, August 15, 2012 9:27 AM
 To: automake@gnu.org
 Subject: Using convenience libraries with non-recursive make
 
 I'm using automake 1.11.1 and autoconf 2.68.  I am switching from a set of
 hand-written Makefiles to autoconf/automake.  I'm switching to autotools
 since the support for cross-compilation is already there; my hand-written
 Makefiles are getting hard to manage, and they don't support VPATH builds
 cleanly.
 
 I have a lot of source files (4K+) and a lot of libraries (40+).  My goal
is to
 generate a single (typically/initially static) library and an executable
that
 demos/drives it.  I am hoping to avoid a recursive make (SUBDIRS=...),
since I
 am holding to the "Recursive makefiles considered harmful" mantra.
 
 A representative Makefile.am for my project is:
 
 # Automake rules to build application.
 AM_CXXFLAGS = -I${includedir}
 ACLOCAL_AMFLAGS = -I m4
 bin_PROGRAMS = myprog
 myprog_SOURCES = b/c/d/myprog__main.cpp
 myprog_LDADD = libmyprog.la
 lib_LTLIBRARIES = libmyprog.la
 nodist_EXTRA_libmyprog_la_SOURCES = dummy.cxx
 
 lib_LTLIBRARIES += liba_s1.la libb_s2.la libb_c_s3.la
libb_c_d_myprog.la
 
 libmyprog_la_LIBADD =  liba_s1.la libb_s2.la libb_c_s3.la
 libb_c_d_myprog.la
 liba_s1_la_SOURCES = a/s1.cpp
 libb_s2_la_SOURCES = b/s2.cpp
 libb_c_s3_la_SOURCES = b/c/s3.cpp
 libb_c_d_myprog_la_SOURCES = b/c/d/myprog.cpp
 
 And it's similarly-simple configure.ac:
 
 AC_PREREQ([2.59])
 AC_INIT([libmyprog], [1.0], [d...@alum.mit.edu])
 AM_INIT_AUTOMAKE([foreign subdir-objects])
 AC_CONFIG_SRCDIR([a/s1.cpp])
 AC_CONFIG_HEADERS([libmyprogconfig.h])
 AC_CONFIG_MACRO_DIR([m4])
 AC_PROG_MKDIR_P
 AC_PROG_CXX
 AM_PROG_LIBTOOL
 AC_CONFIG_FILES([Makefile])
 AC_OUTPUT
 
 The directory structure is similar to:
 
 ./a/s1.cpp
 ./b/s2.cpp
 ./b/c/d/myprog.cpp
 ./b/c/d/myprog__main.cpp
 ./b/c/s3.cpp
 
 and in my real project there's lots more source in each subdirectory, and
lots
 more nested subdirectories.  Yes, the source containing main() is down
deep
 in the structure; myprog.cpp instances some top-level classes, and
 myprog_main.cpp is just the main() that gets things rolling.
 
 When ./configure and make, I get:
 
 del@oyster ~/am $ make
 make  all-am
 make[1]: Entering directory `/home/del/am'
 make[1]: *** No rule to make target `libmyprog.lo', needed by
 `libmyprog.la'.  Stop.
 make[1]: Leaving directory `/home/del/am'
 make: *** [all] Error 2
 
 With the legacy hand-written makefile my project builds just fine.  I'm
 looking for suggestions as to what I'm missing in my Makefile.am.  Note
that I
 can explicitly say:
 
 $ make libb_c_s3.la
 
 and lo, that library's source(s) compile and link.  So part of the
generated
 Makefile is cool.
 
 Thanks,
 -Del





RE: Could automake-generated Makefiles required GNU make? (was: Re: [gnu-prog-discuss] portability)

2011-11-25 Thread John Calcote
  Rather, one GNU package could drop support for ordinary Make, and
see
  how users react.  If the level of complaint is not too high, then
 
 GCC dropped support for non-GNU make in version 3.4 (April 2004).
 
 We could see how users reacted to that.

Forgive my ignorance (because I'm *sure* I'm missing something crucial
here), but I don't believe this is a fair comparison, since GCC's use of
make doesn't appear (to me) to be of the same order as Automake's use of
make.

Regards,
John




RE: bug#9088: Java support

2011-07-18 Thread John Calcote
Jack,

-Original Message-
From: automake-bounces+john.calcote=gmail@gnu.org
[mailto:automake-bounces+john.calcote=gmail@gnu.org] On Behalf Of Jack
Kelly
Sent: Monday, July 18, 2011 1:34 AM
To: Ralf Wildenhues
Cc: 9...@debbugs.gnu.org; automake@gnu.org
Subject: Re: bug#9088: Java support

On Mon, Jul 18, 2011 at 4:17 PM, Ralf Wildenhues ralf.wildenh...@gmx.de
wrote:
 * Jack Kelly wrote on Sat, Jul 16, 2011 at 06:13:58AM CEST:
 On Sat, Jul 16, 2011 at 9:55 AM, tsuna wrote:
  On Fri, Jul 15, 2011 at 1:58 AM, Stefano Lattarini wrote:
  As my java foo is pretty weak, I'm not sure how to handle jar 
  manifests, jar entry points, or other jar/javac subtleties and
advanced features.
  Suggestions welcome.
 
  You can create the manifest manually fairly easily.  Here's an 
  example in the project I'm in the process of autotoolizing:
  https://github.com/stumbleupon/opentsdb/blob/6059488f38fc8a51d426d6
  972eee6fdd1033d851/Makefile#L207

 Perhaps there should be support for a foo_jar_JARADD, that by analogy 
 to _LDADD, that specifies additional files to be included in the jar?

 Why would it have to be a new primary, instead of just reusing _LDADD?

Because, IMO, it's conceptually different. The output's being assembled with
`jar', not `ld'.

Actually... conceptually, a jar is identical to a library: a library is an
archive of objects, and a jar is an archive of objects. Jars happen to be
compressed as well, but that's irrelevant. Conceptually, they're the same.

I would argue in favor of different names for political reasons. :) There's
still a fairly large rift between C/C++ and Java developers. 

--john




Re: GSoC project idea: non-recursive automake project

2011-03-20 Thread John Calcote
On 03/19/2011 01:45 PM, Harlan Stenn wrote:
 Pippijn wrote:

 On Fri, Mar 18, 2011 at 05:26:58PM -0700, Harlan Stenn wrote:
 If there was a student interested in showing how easy it was to use
 automake to do non-recursive Makefiles for a project, I'd be willing to
 co-mentor and work with them to convert NTP to that sort of operation.
 It's mostly trivial. How hard are GSoC projects supposed to be?
 I'll assume you have seen my reply to Ralf.

 From my POV, I have heard folks saying for a long time how easy it is
 to use automake to produce non-recursive Makefiles.  But I haven't seen
 this in practice, and on the (few) attempts I have made to figure it out
 myself and look for examples, I have not yet been able to find a really
 useful solution.

 What I think we'd want is a reasonably well-documented description of
 how to use automake to produce a source tree where one can:

 - run make from the top-level of the tree and all of the normal things
   happen (and all of the normal targets work)
 - run make from a subdir, which would handle all of the normal targets
   for that subdir, and would also automatically handle *all* of the
   dependencies needed for the specified targets in that subdir (like
   prerequisite libraries).

I'd be *very* interested to see how this second item is done. One of the
inherent benefits of recursive make is that there's a self-contained
Makefile in each directory. Thus, you can run make from that directory.
I'm wondering how you do that with only one top-level Makefile.
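
One sketch of an answer (untested here, and it assumes GNU make): drop a
tiny hand-written forwarding Makefile in each subdirectory that hands
every goal to the top-level build:

   # sub/Makefile - not generated by automake
   all:
           $(MAKE) -C .. all
   Makefile: ;          # keep the %: rule from trying to remake this file
   %:
           $(MAKE) -C .. $@

That only helps, of course, if the top-level Makefile knows each goal by
the same name.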

--john



Re: dynamic executables for check_PROGRAMS?

2011-02-20 Thread John Calcote
Hi Jeff,

On 02/17/2011 04:06 PM, Daily, Jeff A wrote:
 I wrote a profiling layer for my library utilizing weak symbols.  I thought 
 for starters it would be nice to profile some of my test programs, to make 
 sure things are working okay.  I'm using autoconf, automake, and libtool, so 
 I configured using --enable-shared --disable-static, however, my test 
 programs are not created as dynamic executables.  If I change my 
 check_PROGRAMS to bin_PROGRAMS, they are dynamic executables.  But, I can't 
 do this in production for obvious reasons.

 So, is there any way to create dynamic executables for my check_PROGRAMS?

 Jeff

The --enable/disable-static/shared flags apply only to building shared
or static libraries within your project. These flags don't have anything
to do with how executables are built except to limit or expand the
options available to programs built against internal libraries. To test
this assertion, create a simple project without libtool (don't call
LT_INIT in configure.ac). When you run ./configure --help, you'll see
that these options are missing entirely from the help display.

I assume when you say dynamic executables you're referring to test
programs built to use shared libraries created from within the same
project. If you're noticing that your test binaries are getting linked
against your static libraries, then something strange is happening in
your project: If you're using --disable-static and it's working properly
but your check_PROGRAMS are inclined to link with the (now non-existent)
static libraries then I have to ask: How are these static libraries
getting built? - After all, you disabled them.

Assuming you've got Makefile.am code like this:

   check_PROGRAMS = test1
   test1_SOURCES = test1.c
   test1_LDADD = ../mylib/mylib.la

Make sure test1_LDADD is referring to the correct relative path to your
internally built .la file. If you're specifying mylib.a (or even
mylib.la) without a relative path, you may be picking up a static
version of your library from your environment (/usr/lib or /usr/local/lib).

My assumptions may all be wrong - especially in light of the fact that
bin_PROGRAMS seems to work the way you want it to...

John



Re: reword documentation about symbol stripping

2010-11-21 Thread John Calcote
You need to remember the original target audience of GNU software was a
group of people that wanted to share free software. Most of them were
students or researchers that generally built software distributed in
source form. Only in the last 10 years has Linux become generally
popular. Before that time, it and all the software that ran on it were
pretty much relegated to programmers. That being the case, users were
programmers, and programmers are indeed helpless without debug symbols
during a crash - that is, unless you're one of those rare types that
loves to dig into a good assembly debug session.

In any case, it makes complete sense why the GNU standards were written
this way when you understand the history.

John

On 11/21/2010 12:25 PM, MK wrote:
 On Sun, 21 Nov 2010 17:44:10 +0100
 Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 Oh well.  This thread has been so noisy and unproductive, maybe we
 should seize the opportunity to take a bit of good away from it.

 Karl, what do you think about this rewording (against the gnulib copy
 of make-stds.texi) that makes the text slightly less subjective and
 slightly less tongue-in-cheek?
 Wow-wee is that refreshing gang, thanks.  I do recognize that I could
 have done more of my own homework here, but: as a neophyte programmer,
 that is endlessly true (of an endless array of topics -- I think
 otherwise known as an infinite regress), and it is always nice to find
 something spelled out in a clear, concise manner. Then I can move on
 quickly to the next conundrum, rather than having to investigate some
 vague insinuation at every step, potentially wasting other people's
 time in the process.

 May we have a real name please to credit in the ChangeLog entry?
 I would be Mark T. Eriksen.





Re: Makefile to Makefile.am

2010-08-16 Thread John Calcote
 On 8/16/2010 9:06 AM, Bob Friesenhahn wrote:
 On Sun, 15 Aug 2010, John Calcote wrote:

 The warning you're seeing is harmless enough on platforms that support
 GNU make. The purpose of the warning is to let you know that your users
 will not be able to build your project on systems that support the
 Autotools, but do not support GNU make (not many these days).

 While GNU make may be 'supported' on a wide variety of systems, that
 does not mean that it is the default 'make' program on a system, or
 available on the system by default.  The user may need to do something
 special in order to install and invoke GNU make.

 If depending on GNU make was considered ok, then Automake would have
 been developed quite differently than it is.  Given current Automake
 objectives, it is wise that individual projects also try to avoid GNU
 make syntax in Makefile.am.

Excellent point Bob.

John




Re: Makefile to Makefile.am

2010-08-15 Thread John Calcote
 On 8/14/2010 7:09 PM, samson.pierre wrote:

 Yes it works :-)

 But I see a little warning when I call autoreconf :
 `%'-style pattern rules are a GNU make extension

 I think it is because I use this character ‘%’ in my rules. But this ‘%’ is 
 very useful for defining implicit rules.

 Is there an equivalent or anything else which can help me to write this rule 
 avoiding this warning message?

The use of % in rules is indeed a GNU make extension. This is called a
pattern rule. Pattern rules in GNU make can be used in place of the older
portable suffix rules. The difference is that a pattern rule can define a
general file-matching pattern via its % stem (e.g., a%de.o), whereas a
suffix rule can only match files by their extensions (e.g., .c.o).
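
For comparison, a sketch of the two forms side by side:

   # GNU make pattern rule (the extension automake warns about):
   %.o: %.c
           $(CC) $(CFLAGS) -c -o $@ $<

   # portable suffix rule saying the same thing:
   .c.o:
           $(CC) $(CFLAGS) -c -o $@ $<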

The warning you're seeing is harmless enough on platforms that support
GNU make. The purpose of the warning is to let you know that your users
will not be able to build your project on systems that support the
Autotools, but do not support GNU make (not many these days).

Regards,
John



Re: conditionals in Makefile.am

2010-06-30 Thread John Calcote
On 6/30/2010 3:41 AM, Wesley Smith wrote:
 From the automake manual:
 
 You may only test a single variable in an if statement, possibly
 negated using ‘!’. The else statement may be omitted. Conditionals may
 be nested to any depth. You may specify an argument to else in which
 case it must be the negation of the condition used for the current if.
 Similarly you may specify the condition that is closed on the endif
 line:

  if DEBUG
  DBG = debug
  else !DEBUG
  DBG =
  endif !DEBUG


 What's the purpose of specifying the condition that is closed?  I've
 never seen this kind of construct before.  Is it a substitute for
 elseif?
   

Documentation. There may be several dozen lines of code between the if
and the else. A reader may be wondering... else what?

John



Re: performing pre-build shell commands with automake

2010-06-20 Thread John Calcote
Wes,

On 6/20/2010 2:14 PM, Wesley Smith wrote:
 How does one do this kind of thing when the source file is specified
 in a subfolder?
 INCLUDES = -I/usr/include/lua5.1 -I/usr/include/cairo
 -I/usr/include/directfb -I/usr/include/freetype2

 lib_LTLIBRARIES = cairo.la
 cairo_la_LDFLAGS = -module -avoid-version
 cairo_la_LIBADD = -llua5.1 -L/usr/lib -lcairo -L/usr/lib -ldirectfb
 -L/usr/lib -lfreetype -L/usr/lib
 cairo_la_SOURCES = src/lcairo.c

 resource.qt:
   touch TESTING

 lcairo.o: resource.qt



 lcairo.o never gets triggered here.  If I explicitly do make lcairo.o
 then it will get triggered, but I'm not sure from scanning the
 Makefile and Makefile.in how it would get implicitly triggered.
   

Build once without your qt dependencies in place and carefully note the
name and relative location of the object file generated from
src/lcairo.c. The qt dependency rule will have to look like this:

exact-name-and-relative-location-of-object : resource.qt

This is the short answer. You might also want to use the $(OBJEXT) macro
for the extension on the object file for portability's sake:

.../lcairo.$(OBJEXT) : resource.qt

This is why most people just use BUILT_SOURCES - it's cheating but it
works for most cases. (See section 9.5 of the Automake manual for more
info on BUILT_SOURCES).
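
That is, a sketch of the cheat for this case:

   BUILT_SOURCES = resource.qt   # built before "make all" or "make check"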

John




Re: performing pre-build shell commands with automake

2010-06-20 Thread John Calcote
On 6/20/2010 2:34 PM, Wesley Smith wrote:
 I also tried using lcairo.lo, which triggered the preliminary shell
 commands but barfed because the commands didn't generate lcairo.lo.
 hmm..  Is there a way to print out what rules were invoked during a
 make invocation?
   

You do want the actual object (.o on most platforms) file - the .lo file
is a text file containing libtool meta-information about the object.
There are some make debugging aids involving dumping the rule
database, but you'll have better luck by carefully analyzing the output
lines during a build - they'll tell you what object files are being
generated from your sources and where they're being put.

John



Re: performing pre-build shell commands with automake

2010-06-20 Thread John Calcote
On 6/20/2010 3:05 PM, Wesley Smith wrote:
 libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../..
 -I/usr/include/lua5.1 -I/usr/include/cairo -I/usr/include/directfb
 -I/usr/include/freetype2 -g -O2 -MT lcairo.lo -MD -MP -MF
 .deps/lcairo.Tpo -c src/lcairo.c  -fPIC -DPIC -o .libs/lcairo.o

 libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I../..
 -I/usr/include/lua5.1 -I/usr/include/cairo -I/usr/include/directfb
 -I/usr/include/freetype2 -g -O2 -MT lcairo.lo -MD -MP -MF
  .deps/lcairo.Tpo -c src/lcairo.c -o lcairo.o >/dev/null 2>&1
   
 There are 2 issues I see:
 1) Why would the Makefile.am below produce 2 commands compiling the
 same source file into 2 different directories?
   

The two libtool commands are not exactly alike. The first command
generates position-independent code (PIC) to be used in your libtool
shared library. That's why the object file is placed in the .libs
directory. The second line doesn't use the -fPIC and -DPIC options. This
version of the object file is used in the traditional archive (static
library). The double compile comes from the use of Libtool. However, if
you don't use Libtool, you'll only get static libraries.

 2) One of those build commands actually puts lcairo.o in the same
 directory as the Makefile, so I would assume that the rule lcairo.o
 would correspond to this step.
   

Automake is a little too smart for us here. It won't generate a rule to
build a libtool object if you've already written one - even if it's only
a dependency rule (no commands). To fix this, place the dependency on
the source file, not on the object:

src/lcairo.c : resource.qt

I tested this and it works - here's my test code:

lib_LTLIBRARIES = libfoo.la

libfoo_la_SOURCES = src/foo.c

resource.qt:
        touch TESTING

src/foo.c: resource.qt

And here's the result of running make in the Makefile.am directory:

$ make
touch TESTING
/bin/sh ../libtool  --tag=CC   --mode=compile gcc -DPACKAGE_NAME=\"foo\"
-DPACKAGE_TARNAME=\"foo\" -DPACKAGE_VERSION=\"1.0\"
-DPACKAGE_STRING=\"foo\ 1.0\" -DPACKAGE_BUGREPORT=\"\"
-DPACKAGE_URL=\"\" -DPACKAGE=\"foo\" -DVERSION=\"1.0\" -DSTDC_HEADERS=1
-DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1
-DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1
-DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1
-DLT_OBJDIR=\".libs/\" -I. -g -O2 -MT foo.lo -MD -MP -MF
.deps/foo.Tpo -c -o foo.lo `test -f 'src/foo.c' || echo './'`src/foo.c
libtool: compile:  gcc -DPACKAGE_NAME=\"foo\" -DPACKAGE_TARNAME=\"foo\"
-DPACKAGE_VERSION=\"1.0\" -DPACKAGE_STRING=\"foo 1.0\"
-DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"foo\"
-DVERSION=\"1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1
-DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1
-DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1
-DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1
-DLT_OBJDIR=\".libs/\" -I. -g -O2 -MT foo.lo -MD -MP -MF .deps/foo.Tpo
-c src/foo.c  -fPIC -DPIC -o .libs/foo.o
libtool: compile:  gcc -DPACKAGE_NAME=\"foo\" -DPACKAGE_TARNAME=\"foo\"
-DPACKAGE_VERSION=\"1.0\" -DPACKAGE_STRING=\"foo 1.0\"
-DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DPACKAGE=\"foo\"
-DVERSION=\"1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1
-DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1
-DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1
-DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1
-DLT_OBJDIR=\".libs/\" -I. -g -O2 -MT foo.lo -MD -MP -MF .deps/foo.Tpo
-c src/foo.c -o foo.o >/dev/null 2>&1
mv -f .deps/foo.Tpo .deps/foo.Plo
/bin/sh ../libtool --tag=CC   --mode=link gcc  -g -O2   -o libfoo.la
-rpath /usr/local/lib foo.lo
libtool: link: gcc -shared  .libs/foo.o  -Wl,-soname -Wl,libfoo.so.0
-o .libs/libfoo.so.0.0.0
libtool: link: (cd .libs && rm -f libfoo.so.0 && ln -s
libfoo.so.0.0.0 libfoo.so.0)
libtool: link: (cd .libs && rm -f libfoo.so && ln -s
libfoo.so.0.0.0 libfoo.so)
libtool: link: ar cru .libs/libfoo.a  foo.o
libtool: link: ranlib .libs/libfoo.a
libtool: link: ( cd .libs && rm -f libfoo.la && ln -s ../libfoo.la
libfoo.la )
$

I don't know how QT resources are used by source code, but this may be
the more correct way to place the dependency anyway. If QT resources are
somehow included in the c sources, then this dependency is more
accurate. If they're just linked into the final library, then the
dependency should be placed between the library and the resource.

John



Re: performing pre-build shell commands with automake

2010-06-20 Thread John Calcote
On 6/20/2010 4:48 PM, Wesley Smith wrote:

 src/lcairo.c : resource.qt

 I tested this and it works - here's my test code:
 
 It definitely does work!  Thanks so much.  The QT resources in my case
 or code generated files from the actual source files, so it makes
 sense to trigger the rules off of the sources themselves.  I really
 appreciate the help.
   

Don't forget to write proper clean rules to cleanup the generated
sources. And also don't forget to add your generated sources (header
files? - I'm not a QT expert so I'm not sure of the exact nature of your
generated sources) to a nodist_product_SOURCES variable so that make
dist doesn't distribute them.
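
A sketch of both pieces (hypothetical file names, since I don't know
what your generator emits):

   nodist_cairo_la_SOURCES = gen/resources.h   # built, never distributed
   CLEANFILES = gen/resources.h TESTING        # removed by "make clean"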

To test the clean rules to ensure you've cleaned up everything you
generated, run make distcheck. This will build and test a distribution
package, along with the cleanup. If the temporary directory contains
anything other than the dist package's sources after make clean is run
by the test, you'll get a dist error.

John



Re: nodist_BUILT_SOURCES?

2010-06-10 Thread John Calcote
Hi Monty,

On 6/10/2010 11:42 AM, Monty Taylor wrote:
> Hey all,
>
> Potentially odd question...
>
> How would I accomplish something like what's in the subject? I have a
> source file that wants to be built before other files - so including it
> in BUILT_SOURCES does the right thing, but I do _not_ want to have it
> included in the dist tarball.

Files listed in BUILT_SOURCES are not taken into account wrt
distribution. The distribution list is built from other primary-like
constructs, such as *_SOURCES. The example in the automake manual shows
how you would do what you want:

...
nodist_foo_SOURCES = bindir.h
BUILT_SOURCES = bindir.h
...

In this example, BUILT_SOURCES is used only to get bindir.h built up
front. The actual SOURCES variable used to list its use by a particular
product carries the nodist prefix, which keeps it from being distributed.
> I tried removing it from BUILT_SOURCES and adding in a rule that looks like:
>
>  %.cc: drizzled/configmake.h
>
> But that didn't work.
>
> Any thoughts?
>
> While we're at it - this whole thing is to get expanded values of
> autoconf directories into a header file where I can consume them...
> which because they contain nested variables
> (localstatedir=${prefix}/var) I seemingly have to do at make time. The
> dist problem above could be solved if anybody knows a decent trick to
> fully expand those variables at configure time... I've tried many
> combinations of eval and quoting - but nothing seems to do what I'm
> trying to do.

You can't (or rather shouldn't) fully expand these variables at
configure time because they may be modified at make time by the user on
the make command line (e.g., make prefix=xxx). There are two
widely-practiced options:

1. Replace such variables on the compiler command line for all or some
sources:

mylib_la_CPPFLAGS = \
    -DSYSCONFDIR=\"$(sysconfdir)\" \
    -DSYSLOGDIR=\"$(syslogdir)\" ...

This works well for situations where you only have a few variables to
replace.

2. Add some custom rules to your Makefile.am scripts that build your
source files using variable replacement techniques like those used by
Autoconf:

EXTRA_DIST = myprog.cfg.in

edit = sed \
    -e 's|@sysconfdir[@]|$(sysconfdir)|g' \
    -e 's|@sysrundir[@]|$(sysrundir)|g' \
    -e 's|@syslogdir[@]|$(syslogdir)|g' \
    -e 's|@libexecdir[@]|$(libexecdir)|g' \
    -e 's|@sbindir[@]|$(sbindir)|g' \
    -e 's|@prefix[@]|$(prefix)|g'

all: myprog.cfg

# Build executable scripts
myprog.cfg : Makefile
        $(edit) '$(srcdir)/$@.in' > $@

Then just format your input templates just like autoconf input templates
with @variable@ wherever you want variable replacement to occur at
make time.

Regards,
John




Regarding the JAVA primary

2010-04-19 Thread John Calcote

Hi Ralf,

I've been thinking a lot about the JAVA primary lately. It turns out 
that Automake's handling of Java sources is pretty efficient. 
Experiments indicate that building ~500 Java source files in a single 
command takes about 15 seconds on a 1.8 GHz CPU with 512 MB RAM. That 
same set of 500 sources, built individually can take upwards of 500 
seconds - a 40 times increase in compile time. (GNU Make, O'Reilly)


In other words, it's simply better to not manage individual 
source/object dependencies in Java. You win the battle, so to speak -- 
but you lose the war.


Since the current implementation of the JAVA primary is not managing 
individual source/object dependencies (something that's difficult to do 
anyway because of inner and anonymous class definitions), would it not 
be prudent to remove the restriction regarding needing to specify all 
source files individually in Makefile.am -- at least for the JAVA primary?


Builds in the Java world generally specify source files found within a 
subtree using a globbing mechanism, with optionally specified inclusions 
and exclusions. And the layout of that subtree defines the packages to 
which classes belong. Would it not be fair to say that all files found 
matching the source specification within a specified subtree are 
distributed within the directory layout to which they belong? In other 
words, distribution (for Java sources only) would include the same set 
of files specified in the Java source globbing pattern. Here's an example:


sources = src/**/*.java
exclude = src/examples/*.java

Distributed Java sources would include everything defined by `sources'. 
Excluded files would be excluded from the build, but not from distribution.


I'm not stealing this concept entirely from ant. Most of the Java IDEs 
(NetBeans, IntelliJ, Eclipse, etc) also use almost the same mechanism 
for specifying sources belonging to a particular build.


I recognize that this is a significant deviation from existing Autotools 
methodology, but I'm not sure we can make any real forward progress in 
Autotools Java builds without making a few such concessions.


A problem I foresee is providing the globbing functionality to makefile 
commands. We'd almost need a new auxiliary script (like install-sh) to 
generate lists of files from such glob specs. Not sure yet from where 
the primary functionality would come -- perhaps a java utility, so that 
the same level of portability would be available to java builds as the 
source that's being built. That is, if someone uses the JAVA primary, 
he/she can expect to be required to have additional build functionality 
available, in the form of a JVM and javac compiler. Just a thought.


One thing we can do at this point is to define JAR and WAR primaries 
that build and install (in appropriate locations), .jar and .war files. 
I've got a few ideas I'll try to codify and send out shortly.


John




Re: Regarding the JAVA primary

2010-04-19 Thread John Calcote

Hi Steffen,

On 4/19/2010 1:22 PM, Steffen Dettmer wrote:

On Mon, Apr 19, 2010 at 8:25 PM, John Calcote john.calc...@gmail.com wrote:
  [...]
   

Builds in the Java world generally specify source files found
within a subtree using a globbing mechanism, with optionally
specified inclusions and exclusions.
 

Yes, they do. BTW, does anyone know why?

With some sarcasm someone could tell that it is done in this way
because with Java you need to make heaps of files (e.g. one for
every public exception), but maybe it has a good reason?

We use some very old and surely bad custom automake rules to
compile java sources. Ages ago we also had some wildcard (`find')
rules (inspired by ant assuming `this would be the good way to
go'). Those rules collected the files, but we changed them to use
a list of file names, which seemed much cleaner and solved some
issues (which of course could had been solved in other ways,
too).
   


Actually, the only thing bad about the current JAVA primary make rules 
is that the command line length may easily be exceeded with very large 
file sets (in fact, with hundreds of files, it surely will be exceeded 
on some platforms). But this problem can easily be fixed by dumping all 
of the source files into a temporary text file (filename), and then by 
using @filename on the javac command line.
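
A sketch of that fix inside a make rule (javac has accepted @argfile
argument files for a long time; the variable names here are hypothetical):

   classes.stamp: $(java_sources)
           echo $(java_sources) > sources.list
           $(JAVAC) -d classes @sources.list
           touch $@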



One motivation for this change was that our make rules that
generated the exception source code files (many differing more or
less just in the name and some `String name ...') and people
forgot to add the files, so if existing for them they had been
built but not for others where the files were not updated (and
the new gensrc files were missing). This resulted in an
incomplete jar file and unfortunately when using this jar a
compilation error was flagged out in the wrong package... unit
tests did not helped because of course also missing in the
incomplete jars (and the make check scheme we used also used a
wildcard approach to collect the unit tests to be executed).
Strange things happened when switching between branches (where
typically the number and kind of exceptions changed etc).
   


Yes, Java does promote the use of many source files - not necessarily a 
bad thing, as this could be seen to promote modular design, if used 
correctly and wisely. I tend to use static inner classes to keep 
relevant things together and to reduce the number of source files. The 
nice thing about this approach is that it's generally fairly easy to see 
where a class should be defined at the outer scope (and thus in its own 
source file), or within another class.



I disliked that when by some problem almost all sources would get
lost in a sandbox (e.g. if switching to a bad / incomplete tag),
make (and even make check!) succeeded.

On `tree conflicts' (one changed a file, another moved it) it
could even happen to have two times the same functionality in a
jar...

To build a list of files why not open Makefile.am in $EDITOR,
like vim or emacs, and insert the file list here (in vim, you may
start a line `java_source = \' and on the next blank line use
`!!find . -name *.java -exec echo {} \\ ;'
which works except for the last file, which ends with a `\'.). No
need to make this at make run time, or is there any?

By this, cvs diff (or whichever SCM tool is used) easily shows
the included files which is easier to review I think.

Using wildcards IMHO means to logically `include' the directory
contents to the Makefile. I think you cannot even use it as
dependency (and thus I'd guess the jars would be resigned on each
make run, even if no file changed?).

What is the advantage of using find magic and make time?

How do you handle your java.in files?
How are you safe to get old generated files out the jar (if
removed from gensrc, they still are in builddir and the make
clean rule may not even take it any longer - how to notice?).

I'm afraid the find/wildcard approach only works for simple
builds - but those could also be done by ant I think...
   


All very good points - and all issues that Automake has been designed to 
overcome. Many folks don't like the requirement that all source files 
must be specified statically in Makefile.am. I'm not one of these 
people. I tend to agree with you. For Java, I think it's a matter of 
developer convenience that sources in a tree are specified with a 
globbing pattern. It's not just that there are a lot of files, but also 
that they tend to be spread around the source tree because they belong 
to various packages that are defined by their location within the source 
tree.


I can certainly see how we may want to stick with the Automake static 
source file specification rules for the reasons you point out. In this 
case, it becomes more of an evangelistic documentation issue. :) That 
is, we might be wise to add a chapter to the Automake manual describing 
the value that comes from Automake's position in this matter. Heaven 
knows, we've answered the question on the 

Re: Keeping source directory structure

2010-03-24 Thread John Calcote

On 3/24/2010 11:22 AM, Peter Johansson wrote:

Hi Brendon,

Brendon Costa wrote:

So I tried replacing the variable with:
XXX_SOURCES= \
   ../common/map.cpp \
   ../linux/map.cpp

This will kind of build, though the location it is putting the object
files in is really bad and conflicts with those from other makefiles.
I.e. they are going into:
build-dir/common/map.o
build-dir/linux/map.o
It is not clear what the conflict is here. Are you creating several 
map.o files from the same map.cc? why?


He's using a non-recursive build system that pulls source files of the 
same name from two different sub-directories. The resulting object files 
are named the same, and stored in the same directory, so there's a 
conflict. He's tried using the automake option to generate objects in 
the source directory, but that didn't work for reasons he outlined (but 
I wasn't clear on).


John




Re: Building prog first

2010-03-22 Thread John Calcote

On 3/22/2010 4:34 PM, Reuben Thomas wrote:

What about using an info browser to search through the manual?

I often do that. The trouble is that often what I want to know has to
be deduced from the manual, which is natural enough, because the
manual tends to be structured according to the structure of the
program it documents, rather than of the problems the user is trying
to solve. By using web searches I can often find people asking and
answering precisely the problem I'm trying to solve.
   


Reuben, you've just hit upon one of the two most significant problems 
with Javadoc and the like (including doxygen, man pages, and info pages):


1. You have to already know the API to know where to look for help on 
the API because the documentation is structured according to the API, 
rather than according to the top 100 use cases.


2. Most people don't add more than method header comments to their 
source code, which means there's often no concept documentation, just 
method documentation, which is useless to people trying to learn the 
API. This isn't always true. Some projects try hard to add concept docs 
too, but just very few by comparison.


Just a comment.

John




Re: Building prog first

2010-03-21 Thread John Calcote

Hi Russell,

On 3/21/2010 6:14 AM, Russell Shaw wrote:

I was limping along for years learning autoconf/make in bits until this
tutorial came out

  Autotools: a practitioner's guide to Autoconf, Automake and Libtool

http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool 



I realized a lot of useful things after that. The main thing that makes
it easy is that a real project is stepped through with lots of side 
discussions,
and high-level overviews put things in to perspective. I'd really like 
to have

a hard-copy book of that tutorial.


Thanks very much for the positive feedback. A much enhanced (and 
somewhat corrected) version of the book is scheduled to be published in 
May 2010 by No Starch Press:


   
http://www.amazon.com/Autotools-Practioners-Autoconf-Automake-Libtool/dp/1593272065


Best regards,
John



After that, i could understand the autoconf manual. I was on dos/windows
up to nearly yr2000 or so, so i had to learn unix programming, shell
programming, make-file programming, m4, how unix processes work etc,
to be able to look in generated Makefiles and configure and see from
that what errors i was making in configure.ac and automake.am.
Learning too many things simultaneously, but i know now.






Re: Baked-in paths

2010-03-15 Thread John Calcote

Hi Reuben,

On 3/14/2010 4:29 PM, Reuben Thomas wrote:

I imagine this question has been asked before, but searching the
archives, the net and the manual didn't help me with the following:

I have a C program which loads some scripts at runtime. These are
stored in datadir (e.g. /usr/local/share/prog). But I also want to be
able to run the program from its build directory.

At the moment I bake in the relevant path; I imagine that for make
install I have to rebuild the binary, baking in the installation
path, having baked the build directory path in a normal make.

Is there a recommended way to deal with this situation?

   


The software packages I've worked on have all been complex enough to 
warrant a configuration file in the system config directory (/etc, 
whatever), which is locked down by virtue of its location. Since only 
the administrator can change the contents of a file in $(sysconfdir), 
it's not a security issue.


But what this does allow me to do is generate an installed configuration 
file with reasonable default paths derived from $(prefix) variables. At 
the same time, I bake default paths into the application, which are also 
derived from $(prefix) variables and passed on the compiler command 
line. This allows me to run without a config file, as long as the app is 
properly installed. Finally, the key benefit is that I can run my 
program in test mode by supplying an appropriate config file on the 
command line (./program -c testconfig.ini).
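
For the curious, the baking itself is just preprocessor definitions 
derived from the prefix variables in Makefile.am; a minimal sketch (the 
macro and file names here are only illustrative):

AM_CPPFLAGS = -DDEFAULT_CFGFILE='"$(sysconfdir)/program.ini"' \
              -DDEFAULT_DATADIR='"$(datadir)/program"'

The program then falls back to DEFAULT_CFGFILE whenever no -c option is 
supplied on the command line.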


I love the flexibility this system provides, and I don't see any 
security issues with it, unless your program must be run as root in 
order to do what it needs to do. In that case, it's not safe to execute 
it during make check anyway. But it can be executed during make 
installcheck.


John





Re: Public header files

2010-03-03 Thread John Calcote

Hi Jef,

On 3/3/2010 11:53 AM, Ben Pfaff wrote:

Jef Driesenjefdrie...@hotmail.com  writes:

   

It works fine for every system I have access too, but long long is not
standard and thus not guaranteed to be present. So I just want to make
sure it will work on other systems too.
 

long long has been standard for 11 years now.  It is irritating
that some vendors have apparently not updated their C compilers
in that long.
   


While I agree that standards should be followed, I find this one 
distasteful. I mean, long long? Is that supposed to be someone's idea 
of a scalable solution? What happens when we have 128-bit systems? Dare 
I venture: long long long? And please don't say we'll never have 
128-bit systems. We've been down that road before; we know where it leads.


Personally, I like the idea of using int64_t and uint64_t. The exact 
same standard already defines such types for the more commonly used 
sizes, and it is scalable.


John





Re: Public header files

2010-03-03 Thread John Calcote
Sorry - I addressed this note to Jef. It should have gone to Ben. My 
apologies.









Re: Public header files

2010-03-03 Thread John Calcote

On 3/3/2010 12:53 PM, Russ Allbery wrote:

John Calcotejohn.calc...@gmail.com  writes:

   

While I agree that standards should be followed, I find this one
distasteful. I mean, long long? Is that supposed to be someone's idea
of a scalable solution? What happens when we have 128-bit systems? Dare
I venture: long long long? And please don't say we'll never have
128-bit systems. We've been down that road before; we know where it
leads.
 

Usually by the time one gets to the point of standardizing something, it's
both too late to fix the aesthetics and aesthetics are the least of
anyone's concerns.  A lot of things that make it into standards are
widespread existing practice before then, and it's too much work to change
them.

I suspect this is part of why, as you point out, the standard also
introduces the intN_t types at the same time, but long long is more widely
supported, probably because it's older than the standard.
   


Of course you're right Russ.

John





Re: cross-compiling on 64 to 32-bit Linux

2010-03-02 Thread John Calcote

Hi Gregory,

On 3/2/2010 4:14 PM, Grégory Pakosz wrote:

  ./configure --host=i686-pc-linux-gnu \
   --prefix=/arch/x86-linux/gnu \
   CC="gcc -m32 -march=i586" \
   CXX="g++ -m32 -march=i586" \
   LDFLAGS="-m32"
 

I'm curious about why setting --host=i686-pc-linux-gnu is not enough
to achieve cross compiling and why in that case it's not up to
autoconf to add -m32 to CC.
   


You don't need to specify -m32 if you have a tool set prefixed with the 
cross tag. The reason for using -m32 is because the user wants to use 
his 64-bit gcc to compile 32-bit code, so he has to tell the compiler to 
switch to 32-bit mode also. (Incidentally, if you're running on Linux, 
might also be a good idea to tell the compiler you're running in a 
32-bit environment by executing gcc with linux32).


Another way to use your 64-bit gcc without special compiler flags is to 
create scripts, named with the cross prefix, in your bin directory that 
execute the compiler in 32-bit mode (perhaps also running it via 
linux32). Then these tools will be preferred by Autoconf when you use 
--host=.
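
Such a wrapper is tiny; a sketch, assuming the script is installed as 
i686-pc-linux-gnu-gcc somewhere on PATH and marked executable:

#!/bin/sh
# run the native 64-bit compiler in 32-bit mode, under a 32-bit personality
exec linux32 gcc -m32 -march=i586 "$@"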


Regards,
John





Re: split check target into check and test targets

2010-02-24 Thread John Calcote

On 2/24/2010 1:50 AM, Baurzhan Ismagulov wrote:

On Tue, Feb 23, 2010 at 04:05:47PM -0800, Daily, Jeff A wrote:
   

I attempted to split the make check target into make check (build
check_PROGRAMS) and make test (run check_PROGRAMS). However, I get
warnings that check-am was overridden. How might I split the building
and running of check_PROGRAMS and still use the generated
parallel-tests using TESTS? Thanks.
 

There are also these ways:
http://www.opensubscriber.com/message/automake@gnu.org/2136673.html
   


Additionally, if I want to build a particular check program (perhaps as 
I'm working out the compiler errors, but before I'm ready to actually 
run the tests), I just type `make check-program-name' from that 
directory. Alexander's solution is great, though. I'm going to use that 
one myself.


Regards,
John




Re: Creating a partial library

2010-02-06 Thread John Calcote

Hi Ralf,

On 2/6/2010 9:32 AM, Ralf Wildenhues wrote:

Hello,

to round up a couple of minor bits here:

* John Calcote wrote on Wed, Feb 03, 2010 at 05:57:49PM CET:
   

The trouble with LIBRARIES is that it only builds non-PIC static
libraries, which can't be linked into a libtool shared library. My
example has a couple of minor flaws that I realized last night after
sending it, including the missing uninstall-local rule:

install-exec-local:
        libtool --mode=install ./install-sh -c libhello.a $(DESTDIR)$(libdir)/libhello.a

uninstall-local:
        rm -f $(DESTDIR)$(libdir)/libhello.a
 

If you are using Automake (and on this list, I guess that's pretty much
given), then please use the variables to pick up the in-tree libtool
script as well as the usual flag variables for the makefile.am author
and the user:
   


Thanks for cleaning it up for me. And the tip about adding custom rules 
as dependencies of the standard rule is priceless.


:)

John




Re: Creating a partial library

2010-02-03 Thread John Calcote

Steffan,

On 2/3/2010 5:50 AM, Steffen Dettmer wrote:

On Wed, Feb 3, 2010 at 8:33 AM, John Calcotejohn.calc...@gmail.com  wrote:
   

(PIC-based static only) library is to use the noinst prefix. But libtool
can be used to manually install a convenience library, so you could use
libtool to do this in an install-exec-local rule in the Makefile.am file
that builds (for instance) libhello.a (untested):

install-exec-local:
libtool --mode=install ./install-sh -c libhello.a
$(DESTDIR)$(lib)/libhello.a

This example came from the libtool manual (modified slightly for Automake
context).
 

ohh this is interesting. Isn't this breaking `make uninstall' and
thus `make distcheck'?  Would it be possible (better/suited/correct)
to have some lib_LIBRARIES=libother.a with a custom build rule
that simply copies the file? Then make install/uninstall could
work, but maybe this breaks other things?
   


The trouble with LIBRARIES is that it only builds non-PIC static 
libraries, which can't be linked into a libtool shared library. My 
example has a couple of minor flaws that I realized last night after 
sending it, including the missing uninstall-local rule:


install-exec-local:
        libtool --mode=install ./install-sh -c libhello.a $(DESTDIR)$(libdir)/libhello.a

uninstall-local:
        rm -f $(DESTDIR)$(libdir)/libhello.a

John




Re: Creating a partial library

2010-02-02 Thread John Calcote

Hi Justin,

On 2/2/2010 6:39 PM, Justin Seyster wrote:


I'm pretty sure that making the framework a convenience library is my
ideal solution: the plug-in author will be able to distribute a single
shared object without any non-standard dependencies.  However, I read
that Automake does not allow installing a convenience library.  I
verified that a regular static library (not specified with
noinst_LTLIBRARIES) definitely does not work: the resulting .a file is
not position independent and won't link into a plug-in.  I don't want
to use noinst_LTLIBRARIES, though, for the simple reason that I want
to be able to install the library!
   


A convenience library is so named because it provides the convenient 
effect of encapsulating a set of code that multiple products can link to 
within a package. That's its only purpose: to bundle up code that can 
then be linked into multiple products within your package. Hence, it's 
not intended to be consumed outside of your package.


You're correct in understanding that a non-libtool (*_LIBRARY) product 
is a non-PIC static library, and you are also correct in understanding 
that they can't be linked into a shared library for this reason.


You can build installed LTLIBRARIES as shared libraries, static 
libraries, or both, and you can configure the default in your 
configure.ac file, but unfortunately, I haven't found a way to make this 
default apply only to a subset of the installed LTLIBRARIES built within 
a package. It appears to be all or nothing. Furthermore, it's just a 
default. The user can always override your defaults with command line 
options to configure. Personally, I think it would be a nice enhancement 
to Automake to allow you to specify that you want specifically to build 
an installed static (only) LTLIBRARY that can then be linked into a 
shared library in another package. The Libtool manual states that 
there's no good reason for doing this, but you and I have both 
encountered situations when we want to do just this.


For the reasons outlined above, automake doesn't allow you to build a 
convenience library and install it. The only way to build a convenience 
(PIC-based static only) library is to use the noinst prefix. But 
libtool can be used to manually install a convenience library, so you 
could use libtool to do this in an install-exec-local rule in the 
Makefile.am file that builds (for instance) libhello.a (untested):


install-exec-local:
        libtool --mode=install ./install-sh -c libhello.a $(DESTDIR)$(lib)/libhello.a


This example came from the libtool manual (modified slightly for 
Automake context).


Regards,
John




Re: silent installs

2010-01-29 Thread John Calcote

On 1/29/2010 10:17 AM, Steffen Dettmer wrote:

On Fri, Jan 29, 2010 at 5:21 PM, Bob Friesenhahn
bfrie...@simple.dallas.tx.us  wrote:
   

Regarding silent installs: Why do passenger trains have windows?
 

Why do passenger train windows have curtains?
SCNR :)
   


Okay - I can't help it! I bet the engineer's windows don't have curtains.

John




Re: Asking for a tutorial to solve two different cases

2009-10-15 Thread John Calcote

Hi Glus,

On 10/15/2009 9:41 AM, Glus wrote:

I'm developing an application. The question is that once installed I'd like
to find it hanged to my Gnome general applications menu. For this, I'm
searching the info about how should I configure the autotools files project.

I'd like to take the opportunity to ask you also the same but in the case of
a server. How should I do to update /etc/services, /etc/rc.Xd/, to transfer
permissions...   ???
   


Neither of these situations is handled directly by either Autoconf or 
Automake. These are really very much OS- and desktop-specific issues. 
You'll have to research various other forums to find out more about them.


I suggest looking at the Gnome desktop forums at 
http://gnomesupport.org/forums to ask about what sorts of files need to 
be installed and where in order to add menu items and links.


While it's true that installing daemons and server software is more a 
general Unix topic, the way these types of services are installed and 
manipulated is very different from one Unix to another. Linux has only 
recently standardized some aspects of this activity. For installing 
Linux services, you might check with linuxforums.org, or forums 
associated with your particular flavor of Linux (Ubuntu, Suse, Redhat, 
Gentoo, Slackware, Debian, etc. All of these have developer forums of 
their own).


Once you have the specific information on what files to install where, 
then you can write hand-coded rules in Automake Makefile.am files to put 
these files in the right locations. Be aware, however, that the more of 
this activity you do in your Automake makefiles, the less portable 
they'll likely be.


Regards,
John




Re: Difficulty cross-compiling

2009-10-12 Thread John Calcote

Hi William,

On 10/12/2009 12:26 PM, William Tracy (wtracy) wrote:

I'm trying to cross-compile a library that uses GNU Autotools (Google
Coredumper, to be specific) for PPC using the MontaVista tool chain. The
sequence of commands I'm following is:

$ ./configure --host=ppc CC=/path/to/gcc CXX=/path/to/g++

$ make

$ [next step would normally be make install]

For a normal (not cross-compile) configuration, issuing make causes a
.libs/ directory to be generating containing .a, .o, and .so files (and
variants thereof). When I cross-compile for PPC, the directory is
created and populated with a .a file, but no .so files. I can see .o and
.lo files being generated, so the code *is* getting compiled, but the
linking stage gets skipped. When I review the make output, there are no
error messages or warnings-the commands related to the .so files are
simply missing.
   


You don't state your target OS, only the CPU architecture, and I'm not 
familiar with MontaVista, so I'm not sure I'm helping here. But if your 
target platform is AIX, or anything like it, you may be experiencing 
AIX-library-naming-difference syndrome - a sense of disorientation 
associated with not being able to find your AIX shared libraries after 
building. ;-)


The default library extension for some versions of AIX is .a. These .a 
files contain the equivalent of standard Unix static /and dynamic/ 
libraries. Thus, on AIX/PPC, .a files are dynamically loaded just like 
.so files on Solaris or Linux. The .a files also contain the static 
objects linked into a binary when static linking is requested.


Regards,
John





Re: make dist and make distcheck trouble

2009-09-28 Thread John Calcote

Bruce,

On 9/28/2009 7:03 PM, David Bruce wrote:

Hello Ralf,

I found it!  It was an utterly perversely subtle typographical error:

In my src/Makefile.am:

tuxmathserver_SOURCES = servermain.c\
server.c \
mathcards.c \
throttle.c  \
.   options.c

(the arrow in the original message pointed at the stray '.' at the start of the last line)


A stray '.' got inserted in the file list, which led to the entire
   


Dang! I'm sorry. I noticed that period, but a period is so small that I 
almost thought it was dust on my monitor! Then I looked a second time 
and realized it was a period, but figured it must have been 
inadvertently inserted during cut and paste into your email. If I'd 
responded when I thought to, I might have saved you some time.


Regards,
John




Re: make dist and make distcheck trouble

2009-09-28 Thread John Calcote

On 9/28/2009 7:09 PM, John Calcote wrote:

Bruce,

On 9/28/2009 7:03 PM, David Bruce wrote:


Sorry David, then I went and got your first and last names mixed up. 
Perhaps I'd better just be quiet now. ;)





Re: Dependency issues after adding new sources.

2009-09-11 Thread John Calcote

Hi Dave,

On 9/11/2009 9:24 AM, Dave Steenburgh wrote:

Please excuse my ignorance, but my search fu is weak, and I think the
authors of tfm are conspiring to bewilder me.  I have read several tutorials
and discussions on how to use the autotools, but to be honest my
understanding of them is extremely limited at best.
I have this problem with several of my programs, but it's most frustrating
with my current program, which at the moment has a flat directory structure.
  The program in question is developed little by little, so from time to time
I need to add new source files to the program's _SOURCES in Makefile.am.  I
was under the impression that after doing so, running make from the build
directory would magically figure everything out and build the program
correctly.  What happens instead is the Makefile appears to be regenerated,
but my new sources are not included in it.  I have tried multiple methods to
fix this, most of them to no avail.  Currently, the new sources are built
and are linked into the executable, but most of the old sources aren't being
rebuilt when a common header is changed.  The only thing that fixes all the
issues is to start with an empty build directory and re-run the configure
script.  I doubt that it's really necessary to create a new build directory
every time I add a new class.  So what could I be doing wrong?  I will
gladly share any information about my build environment that may help a
diagnosis, but I'd prefer to keep the code private.
   


Please share at least one of your Makefile.am files with us - preferably 
the one containing the _SOURCES directive that you modified.


Thanks,
John





Re: installing glade3 interface file with autotools

2009-08-17 Thread John Calcote

Hi Mick,

Your Automake syntax is correct, if you're trying to install a 
pre-existing data file called nuchimp.xml into the /usr/local/share 
(default $(datadir)) directory. The error you're getting indicates that 
make can't find the file nuchimp.xml. Are you sure it exists in the same 
directory as the Makefile.am file you've shown? That's where it should 
be without a relative path prefix.


Regards,
John

On 8/17/2009 1:22 AM, Mick wrote:

I'm trying to rebuild my application using the current(ish) glade and
autoconf/automake and have been having a nightmare trying to get the
XML file created from glade by:
gtk-builder-convert nuchimp.glade nuchimp.xml

After reading various documents I have the following Makefile.am:
bin_PROGRAMS = nuchimp

nuchimp_SOURCES = \
callback.c callback.h \
chimp.c chimp.h \
main.c main.h \
parsecfg.c parsecfg.h

xmldir = $(datadir)
xml_DATA = nuchimp.xml

AM_CPPFLAGS = $(GTK_CFLAGS)
AM_LDFLAGS = $(GTK_LIBS) -export-dynamic

the clearest doc stated the two lines beginning xml would do exactly
what I need, but when I run make, I get:
make[2]: *** No rule to make target `nuchimp.xml', needed by `all-am'.
Stop.

this is doing my head in so PLEASE, someone wiser than me, point me to a
clear explanation of the process.



   






Re: simple 'install of png file' question

2009-08-11 Thread John Calcote

Hi David,

On 8/11/2009 7:28 AM, David Liebman wrote:

Hello,

This is a newbie question.

I have a simple project that I'm using automake and autoconf on. It
involves a simple c program, but uses a png image. The png image is in a
directory called 'pics' and I want it copied to a certain directory on
the system when the user calls the 'install' target.

I suspect the proper thing to do is to make a Makefile in the 'pics'
directory and have that makefile install the png. How do I go about
doing that? Does anyone have a link to a good example of how this is
done? If I am creating a Makefile in the pics directory I would like the
Makefile to be auto-generated.
   


You may create a new Makefile.am file in the pics directory if you wish, 
or you don't have to. Here's how to do this if you don't create a new 
Makefile.am. Add this code to the parent directory's Makefile.am:


# assuming you want the png installed in /usr/local/share/pics
picdir = $(datadir)/pics
pic_DATA = mypicture.png

That's it! Regarding the location where you want to install - try to use 
standard places if you can, but if you can't then try to build on 
standard places defined in the automake provided environment variables.


Regards,
John




Re: library dependencies SUBDIR problem automake

2009-08-04 Thread John Calcote

Hi Michiel,

On 8/4/2009 10:01 AM, Michiel Soede wrote:

Hi,
I have a problem with dependencies to libraries in my build structure.the 
directory structure in my project is as follows (roughly):

configure.acmakefile.amapps/ makefile.am app1/   
main.cc   makefile.am   comps/ makefile.am 
comp1/   comp1.cc   makefile.am
the component makefile generates a library:noinst_LIBRARIES = 
libcomp1.alibcomp1a_a_SOURCES = comp1.cc
I recurse in all subdirectories, using SUBDIRS:SUBDIRS = apps comps
in the comps:SUBDIRS = comp1
same for the apps directory.
the app1 uses LDADD to link the app with main.cc
bin_PROGRAMS = app1app1_SOURCES = main.ccapp1_LDADD = 
$(top_builddir)/comps/comp1/libcomp1.a
Now when I call make at the root, everything is build correctly.
But when I cd into apps/app1/ and call make, I have problems with:
- if comp1 was not made before (e.g. from the root), make will fail  (no rule 
to make libcomp1.a)
- if I did make the library at the root, it is not recompiled automatically 
when  I modify comp1.cc
.. any ideas on these problems?
   
The lack of proper text wrapping on your message made it a bit difficult 
to see your directory structure, but I think I've sorted it out, based 
on your other comments:


configure.ac
makefile.am
apps/
  makefile.am
  app1/
main.cc
makefile.am
comps/
  makefile.am
  comp1/
comp1.cc
makefile.am

According to your makefiles, app1 is dependent on comp1 (app1_LDADD = 
.../libcomp1.a), but comp1 is not built as a sub-component 
(sub-directory) of app1. Thus, (I believe you are saying) when you build 
from the project root, everything is fine, but when you build from the 
app1 directory, comp1 doesn't get built, and thus the app1 build fails. 
Is this correct?


A recursive build system must be designed to build component 
dependencies first before building the components. Thus, one limitation 
of a recursive build system is that it rather defines (or at least 
constrains) the directory structure that you must use. To get comps to 
be built before apps from within the app1 directory, you must build the 
comps directory structure from within the app1 directory.


I recognize that applications 2-n may also use components 1-n, so you 
have a problem here, and the only way around it is to use a 
non-recursive build system. That is, you can place all of your build 
logic in the top-level Makefile.am file using relative paths, and then 
the makefile dependencies will be properly structured for you by Automake:


Makefile.am:
= = = = = = = = = = =
bin_PROGRAMS = app1 app2 app3 ...
app1_SOURCES = apps/app1/main.cc
app2_SOURCES = apps/app2/...
...

noinst_LIBRARIES = libcomp1.a libcomp2.a libcomp3.a ...
libcomp1_a_SOURCES = comps/comp1/comp1.cc
libcomp2_a_SOURCES = comps/comp2/...
...

app1_LDADD = libcomp1.a
= = = = = = = = = = =

I also noted that you had

SUBDIRS = apps comps

in your top-level Makefile.am file. This is wrong - the comps hierarchy 
should be built before the apps hierarchy, or else there will still be 
no libraries in the comps hierarchy for the apps to link against, unless 
you've manually built the comps directory first, and then attempted to 
build from root.
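
In other words, the top-level Makefile.am should read:

SUBDIRS = comps apps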


Regards,
John





Re: EXTRA_DIST respects Automake conditionals?

2009-07-30 Thread John Calcote

Hi Ben,

The reason this works is that AM conditionals are implemented textually: 
if a conditional is true, the contained statements take effect; 
otherwise, all contained statements are commented out in the 
resulting makefile.
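
Given that mechanism, a file that must always be distributed just needs 
to be listed outside any conditional; a sketch based on your example:

EXTRA_DIST = EXTRA

if COND
bin_PROGRAMS = foo
foo_SOURCES = foo.c
endif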


Regards,
John

On 7/29/2009 8:57 PM, Ben Pfaff wrote:

I was surprised today to discover that EXTRA_DIST respects
Automake conditionals.

In other words, if I have the following Makefile.am:

 AUTOMAKE_OPTIONS = foreign

 EXTRA_DIST =

 if COND
 bin_PROGRAMS = foo
 foo_SOURCES = foo.c
 EXTRA_DIST += EXTRA
 endif

and configure.ac:

 AC_INIT([mumble], [1.0])
 AC_CONFIG_SRCDIR([foo.c])
 AC_CONFIG_FILES([Makefile])
 AM_INIT_AUTOMAKE
 AC_PROG_CC
 AM_CONDITIONAL([COND], [false])
 AC_OUTPUT

then make dist will not put EXTRA into the generated tarball.
It will put foo.c into the tarball, though.

Is there an appropriate target to put files that should always be
distributed, regardless of conditionals?  noinst_HEADERS works,
but to me it feels like abuse to use it for this purpose.

For what it's worth, in the actual project where I encountered
this, the usage is more like this:

 if ENABLE_USERSPACE
 ...
 include lib/automake.mk
 include ofproto/automake.mk
 include utilities/automake.mk
 include tests/automake.mk
 include include/automake.mk
 include third-party/automake.mk
 include debian/automake.mk
 include vswitchd/automake.mk
 include xenserver/automake.mk
 if HAVE_CURSES
 if HAVE_PCRE
 include extras/ezio/automake.mk
 endif
 endif
 endif

In other words, I'm using a conditional to disable a great many
features, and it's convenient not to push that conditional down
into all the included files.

Here's the Makefile.am in question:
http://openvswitch.org/cgi-bin/gitweb.cgi?p=openvswitch;a=blob;f=Makefile.am;h=dccb8cfdf92a3dd4dc9f3276e7533f68769587f8;hb=c2b070214097fa40dc78252882d96babe7fab4b4

Thanks,

Ben.
   






Re: Re: how to install library in a specific directory?

2009-07-24 Thread John Calcote

On 7/23/2009 7:45 PM, A117 wrote:

Sorry I forgot to mention the files in EXTRA_DIST are to be packed into release 
package. All the cpp files mentioned here already exists and are to be compiled 
and released.
If I put all cpp files in _SOURCES, the EXTRA_DIST files are not released. The 
only way I've found is to put only one cpp file in _SOURCES, while others to 
EXTRA_DIST and add .o to _LIBADD. My goal is to build all the .cpp and put all 
.cpp, .txt, .rc into release package.
   


Okay, I understand the problem now. But there's not enough information 
in the snippet you posted to determine the cause. Can you post the 
entire Makefile.am file? Or at least a smaller example that reproduces 
the problem. There's nothing that I can see in the portion of your file 
that you posted that would be the cause of such a problem.


John


-
And can I ask another question? I want to build some source code files into 
this library and need to distribute some other files, too. But EXTRA_DIST in 
Makefile.am does not work as below,
lib_LTLIBRARIES = libezcommon.la
...
libezcommon_la_SOURCES = ezcommon.cpp tinystr.cpp ...
EXTRA_DIST = tinyxml.txt ezcommon.rc ...

If I write only one file in libezcommon_la_SOURCES, while adding others to 
EXTRA_DIST and others' .o to libezcommon_la_LIBADD, it works. I don't know why.

   






Re: how to install library in a specific directory?

2009-07-24 Thread John Calcote

On 7/24/2009 12:21 AM, A117 wrote:

lib_LTLIBRARIES = libezcommon.la
myincludedir = $(includedir)/ezproject
myinclude_HEADERS= ezcommon.h tinystr.h tinyxml.h
libezcommon_la_SOURCES = ezcommon.cpp \
tinystr.cpp tinyxml.cpp tinyxmlerror.cpp tinyxmlparser.cpp
#libezcommon_la_LIBADD = tinystr.o tinyxml.o tinyxmlerror.o tinyxmlparser.o
libezcommon_la_LDFLAGS = -version-info 2:0:0
#EXTRA_DIST = tinystr.cpp tinyxml.cpp tinyxmlerror.cpp tinyxmlparser.cpp \
EXTRA_DIST = ezcommon.rc dllmain.cpp ezcommon.aps ezcommon.vcproj icon.ico \
tinyxml.txt resource.h stdafx.h targetver.h
   


Try replacing the TAB characters at the beginning of your wrapped lines 
with spaces. And also make sure you don't have any white-space following 
any of the back-slashes at the end of wrapped lines.


John






Re: how to install library in a specific directory?

2009-07-23 Thread John Calcote

On 7/22/2009 9:15 PM, A117 wrote:

Thank you. I've decided to put the library in /usr/local/lib, while its header 
files in /usr/local/include/ezproject.
It's strange though /usr/local/lib is in /etc/ld.so.conf (actually in another 
file it includes), and I can build other programs acting much as mine, I have 
difficulty with mine only. I run ldconfig manually and then it works. Now I'm 
releasing my software.
   


ldconfig updates the library cache. The /etc/ld.so.conf file is the file 
used by ldconfig to determine which directories to scan for libraries. 
When you add a new library to one of the directories in /etc/ld.so.conf, 
then you need to run ldconfig to ensure that the cache is aware of that 
new library.


John





Re: how to install library in a specific directory?

2009-07-23 Thread John Calcote

On 7/23/2009 4:28 AM, A117 wrote:

Why don't I need to run ldconfig manually after installing other official software, like 
osip2? I tried ldconfig -p and saw that the library was known, i.e., listed, before running 
ldconfig. But linking could not find the library then.
   


Linux distro installers execute ldconfig for you. If you look carefully, 
you'll see that the last stage of installing software (either new or 
updated) is to run ldconfig. On my Opensuse 11.1 Linux installation, 
when I install new software packages from within YaST, I can see 
ldconfig's output just before the installation process completes. If you 
use RPM manually, you may have to run ldconfig yourself after installing 
packages in order to pick up the changes immediately.



And can I ask another question? I want to build some source code files into 
this library and need to distribute some other files, too. But EXTRA_DIST in 
Makefile.am does not work as below,
lib_LTLIBRARIES = libezcommon.la
...
libezcommon_la_SOURCES = ezcommon.cpp tinystr.cpp ...
EXTRA_DIST = tinyxml.txt ezcommon.rc ...

If I write only one file in libezcommon_la_SOURCES, while adding others to 
EXTRA_DIST and others' .o to libezcommon_la_LIBADD, it works. I don't know why.
   


I'm sorry, I don't understand the question. Are you trying to generate 
sources, or optionally build some sources? Perhaps a bit more context on 
the problem would help...



Thanks for your patience.





Re: how to install library in a specific directory?

2009-07-22 Thread John Calcote

On 7/22/2009 2:12 AM, bonami wrote:

   I have two projects. One generates a shared library and the other uses it.
The library is to be installed in /usr/local/lib/ezproject, while the
project's name is ezcommon. I have problems in both the projects.
   ezcommon's configure.ac,
…
AC_INIT(ezcommon, 3.0)
AC_DISABLE_STATIC
…
   ezcommon's Makefile.am,
lib_LTLIBRARIES = libezcommon.la
libdir = $(exec_prefix)/lib/ezproject
includedir = $(prefix)/include/ezproject
   Problem is, if user configure --libdir=..., the user's definition will be
ignored. How can I set libdir only when user does not assign it? (Since this
dir name is not same as project name, I cannot use pkglib_.)
   


mylibdir = $(libdir)/ezproject
mylib_LTLIBRARIES = libezcommon.la

myincludedir = $(includedir)/ezproject
myinclude_HEADERS = ...


   The other question is how to check for this library in the other project,
named ezcmd.
   ezcmd's configure.ac,
...
AC_CHECK_LIB([ezcommon], main,,AC_MSG_ERROR(...))
...
   This check will fail, since ezcommon is installed in /usr/local/lib/
ezproject by default. Should I add LDFLAGS=-Lezproject or sth.? And how? Or
should I add this directory to system's link-search directory?
   


If you put your libraries in a non-standard location, then you'll have 
to add that location to the library search path in one way or another. 
Either of the options you mention will work. One other option is to 
generate and install a pkgconfig description file, then use pkgconfig to 
locate the library.
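
For instance, a minimal ezcommon.pc sketch (values illustrative), 
installed into $(libdir)/pkgconfig:

prefix=/usr/local
libdir=${prefix}/lib/ezproject
includedir=${prefix}/include/ezproject

Name: ezcommon
Description: ezproject common library
Version: 3.0
Libs: -L${libdir} -lezcommon
Cflags: -I${includedir}

The consuming project can then call PKG_CHECK_MODULES([EZCOMMON], 
[ezcommon]) from its configure.ac instead of hunting for the library by 
hand.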


Regards,
John




docdir with packages directory

2009-07-10 Thread John Calcote

Hi all,

Just a quick question for those who might know.

I understand that Automake generates ${docdir} as 
${datarootdir}/doc/${PACKAGE}, as per the Automake manual, section 2.2.3.


While messing around building RPM files recently, I happened to notice 
that most Linux distros like to put documentation files into 
${datarootdir}/doc/*packages*/${PACKAGE}.


Now, I realize that you can modify the ${docdir} on both the configure 
and make command lines so that it contains the correct values for 
building a proper RPM. But I have a couple of questions:


1) Why the controversy?

2) Is there any movement within the Automake community to move toward 
the LSB directory structure as the default?


Thanks in advance,
John


Re: docdir with packages directory

2009-07-10 Thread John Calcote

Hi Russ,

On 7/10/2009 12:32 PM, Russ Allbery wrote:

While messing around building RPM files recently, I happened to notice
that most Linux distros like to put documentation files into
${datarootdir}/doc/*packages*/${PACKAGE}.
 


Debian doesn't.
   


My mistake here - sometimes I leap before I look. It appears that some 
RPM-based distros use /usr/share/doc/${PACKAGE} and some use 
/usr/share/doc/${PACKAGE}-${VERSION}. Opensuse uses 
/usr/share/doc/packages/${PACKAGE}, but since I've started researching, 
I've not found any references to other Linux distros that use this.


Sorry for the confusion...

John


Re: docdir with packages directory

2009-07-10 Thread John Calcote

On 7/10/2009 5:42 PM, Andrew W. Nosenko wrote:

On Fri, Jul 10, 2009 at 20:52, John Calcotejohn.calc...@gmail.com  wrote:
   

2) Is there any movement within the Automake community to move toward the
LSB directory structure as the default?
 


Excuse me, but why should automake prefer one of the many OSes and,
therefore, one of the many FS layouts over others (except automake's
own POV)?

Please understand, LSB-based FS layout is non-common (the same as LSB
itself) even in the Linux world.  Why do you expect that LSB will be
followed somewhere outside?  (Please, remember that Automake works in
more environments than just Linux, even in all Linux flavors and
distros).
   

Hi Andrew,

Yes, you are correct. I have already answered previous responses to my 
original message. As it turns out, LSB doesn't even specify a particular 
docdir layout. The proper specification comes from the File System 
Hierarchy standard. FSH states that package directories go directly into 
/usr/share/doc, which is exactly as Automake does it.


My particular issue was specific to SuSE Linux - for some reason, they 
chose to place package directories within a packages directory beneath 
the proper docdir. I just didn't realize that this was specific to SuSE 
Linux. Live and learn.


Thanks for the feedback.

Regards,
John



Re: subdirs that should not be configured

2009-07-09 Thread John Calcote

Hi Nicolas,

On 7/9/2009 11:13 AM, Nicolas Bock wrote:

Hello list,

I have the following problem: I want to add an external package to our
own package. The external package is already configured in the sense
that its maintainer ran make dist and I simply untared the tar file
into some directory underneath our source tree. I add that directory to
AC_CONFIG_SUBDIRS in our main configure.ac and to SUBDIRS in our main
Makefile.am so that when I run our configure script I will also run the
configure script of the source package. The problem I have now is that
when I run autoreconf it descends into the external package and tries to
remake the configure script in there. How can I avoid that from
happening?
   


You can use the autoreconf --no-recursive option. This will autoreconf 
only the current directory. If you have sub-projects that need to be 
recursed into, but you need to skip this one directory, then you can 
create a simple shell script that contains simply:


autoreconf --no-recursive . subdir1 subdir2...

Then, run this shell script to autoreconf only the subdirectories you want.

The real problem here is a mixture of paradigms. You want to treat some 
directories (your own) as maintainer code, and other directories (your 
third-party directories) as user code.


Regards,
John




Re: -I. in DEFAULT_INCLUDES

2009-07-06 Thread John Calcote

Hi Bob,

On 7/6/2009 5:24 AM, Bob Ham wrote:

Hi there,

I have a problem due to conflicts between local and system header
filenames.  This problem comes about because of the addition of -I. to
the CXXFLAGS of any objects.  I've traced this to a variable called
DEFAULT_INCLUDES in every Makefile.in:

   DEFAULT_INCLUDES = -I.@am__isrc@ -I$(top_builddir)


Why does this -I. exist?  How can I remove it?
   


DEFAULT_INCLUDES actually resolves to:

DEFAULT_INCLUDES = -I. -I$(srcdir) -I$(top_builddir)

That is, the current directory, the source directory (if building 
outside the source tree), and the top build directory (in order to pick 
up config.h or other project-global headers).


It is assumed that there would be no header files in the current or 
source directory that are not *more* important (and should thus be 
picked up first) than any other header files outside the project.


Just curious - under what conditions do you have a header file in the 
local directory that you need to have overridden by a globally installed 
header file?


Regards,
John




Re: problems with recursive make target

2009-06-29 Thread John Calcote

Hi John,

On 6/29/2009 1:44 PM, Ralf Wildenhues wrote:

Hello John,

* johnwohlb...@gmail.com wrote on Mon, Jun 29, 2009 at 09:36:09PM CEST:
   

in top/lib/Makefile.am
SUBDIRS = pika_comm pika_utilities
# provide a separate recursive target for making tests
tests : all
        echo `pwd`;
        for dir in $(SUBDIRS); do \
          cd $$dir && $(MAKE) $(AM_MAKEFLAGS) $@ || exit 1; \
        done
        echo `pwd`;
.PHONY : tests
 


You don't ever 'cd' back out of the first subdirectory, so you can't
find the second:
   
Ralf is correct, of course. In my online catalog of solutions, I'd 
copied and modified this code from an Automake-generated Makefile. But I 
inadvertently left the parentheses off the cd command line, which would 
have invoked the entire line in a sub-shell:


for dir in $(SUBDIRS); do \
  (cd $$dir && $(MAKE) $(AM_MAKEFLAGS) $@) || exit 1; \
done


Sorry for the confusion.

John



Re: problems with recursive make target

2009-06-29 Thread John Calcote

John,

On 6/29/2009 2:00 PM, johnwohlb...@gmail.com wrote:

On Jun 29, 2009 1:44pm, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
Hello John, Thanks Ralf. I feel pretty dumb. You know I suspected 
that was the problem, and was trying to cd ../. But now I realize I 
was putting the cd ../ in the wrong place. After my wrongly placed cd 
../ didn't work (which I thought was rightly placed) I thought maybe 
that the example code at 
http://www.freesoftwaremagazine.com/books/agaal/catalog_of_reusable_solutions 
was correct and make would handle the cd'ing for me. Maybe I should 
file a documentation bug report with John Calcote!

Fixed! Thanks!

John




dvi bug in distcheck target?

2009-06-24 Thread John Calcote

Hi Automake maintainers,

I think there's a bug in the distcheck target related to the TEXINFO 
primary. (You may already know about it. I did a google search, but 
didn't find any references to it.)


Here's part of a sample Makefile.am from page 24 of the the Automake 
manual (1.10.2):


bin_PROGRAMS = zardoz
zardoz_SOURCES = main.c
info_TEXINFOS = zardoz.texi

Combined with a simple configure.ac file, when I run make distcheck, I 
get the following error:


...
ERROR: files left in build directory after distclean:
./zardoz.dvi
make[1]: *** [distcleancheck] Error 1
make[1]: Leaving directory 
`/home/jcalcote/dev/prj/ti-test/zardoz-1.0/_build'

make: *** [distcheck] Error 2
$

I have to add this line to the Makefile.am file to get the distcheck 
target to work cleanly:


CLEANFILES = zardoz.dvi

It appears that make clean is leaving the dvi file in place. In fact, 
when I manually execute make clean, after make dvi, I get the following 
output:


test -z zardoz || rm -f zardoz
rm -rf zardoz.aux zardoz.cp zardoz.cps zardoz.fn zardoz.fns zardoz.ky \
  zardoz.kys zardoz.log zardoz.pg zardoz.pgs zardoz.tmp \
  zardoz.toc zardoz.tp zardoz.tps zardoz.vr zardoz.vrs \
  sample.dvi sample.pdf sample.ps sample.html
rm -f *.o

It looks like the last line should contain:

  zardoz.dvi zardoz.pdf zardoz.ps zardoz.html

Regards,
John





Re: dvi bug in distcheck target?

2009-06-24 Thread John Calcote

Hi Ralf,

On 6/24/2009 12:59 PM, Ralf Wildenhues wrote:

It looks like the last line should contain:

   zardoz.dvi zardoz.pdf zardoz.ps zardoz.html
 


It would if you had
   @setfilename zardoz.info

in your zardoz.texi file.  Hmm, this is probably a bug in Automake,
but from 'info texi2dvi', I cannot even infer whether it is intentional
that @setfilename not decide the name of DVI or PDF output, and while I
think it implies to do so for HTML, I'm not fully sure either.

Wow. Sure enough. I set the texi setfilename field to zardoz.info and 
all is well again. It never occurred to me that Automake would look 
inside the texi file to determine the name of the output file, but it 
makes sense. I copied this sample file from the texinfo manual as a 
quick input file, but didn't check the contents that closely.


Thanks for the tip.

John


cross-compiling on 64 to 32-bit Linux

2009-05-23 Thread John Calcote

Hi everyone,

I was wondering what the procedure is for cross-compiling 32-bit apps 
on a 64-bit Linux system. Do you need special libraries? What 
command-line options are used? That sort of thing. I'm happy to read up 
on it, if there are references that you can point me to.


Thanks in advance,
John




Re: My project can't use `silent-rules'

2009-05-18 Thread John Calcote

Hi Bob,

On 5/17/2009 11:05 PM, Bob Friesenhahn wrote:

On Mon, 18 May 2009, Ralf Wildenhues wrote:


You can use
 AM_SILENT_RULES


Worked! Thanks!

Unfortunately, it does not silence the warnings from my code.


Forgive me, but do you really want it to do this?

Of course, if you want to permanently silence warnings from your code, 
you should probably just use appropriate pragmas or command-line options 
to disable those warnings.


One of the primary reasons (IMHO) for Automake silent rules is so that I 
CAN see the warnings in my code (without resorting to redirecting make's 
stdout to /dev/null, that is). At least, that's one reason why, for 
several years now, I've advocated an Autotools option for silent builds.
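
For reference, the default can be set in configure.ac and overridden per 
build, assuming Automake 1.11 or later:

AM_SILENT_RULES([yes])

and then, when you do want to see the full command lines:

make V=1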


Regards,
John





Re: Setting shared lib version not functioning

2009-05-06 Thread John Calcote

On 5/6/2009 3:15 AM, Andreas Schwab wrote:

John Calcotejohn.calc...@gmail.com  writes:

   

One thing that bothers me a little is that we never really did solve
Gerald's original problem. He said his library was created just fine when
he was passing 2:0:0, but when he switched to 2:0:1, it created a library
with a version number of 1:1:0. Now, why would Libtool do this? Granted,
he didn't really want 2:0:1, but 2:0:1 isn't a bogus triplet, either. So
why did Libtool convert it to 1:1:0?
 


For the linux way of library versioning the library suffix is computed
as (current-age).age.revision, thus 2:0:1 maps to 1.1.0.  A libtool
version of 1:1:0 would map to 1.0.1.
   
Thanks Andreas. This is excellent information, but I'd like to 
understand why this is so. Can you point me to a reference that 
describes this transformation, and perhaps its rationale?


Thanks in advance,
John


Re: Setting shared lib version not functioning

2009-05-05 Thread John Calcote

Hi Ralf,

On 5/5/2009 2:46 PM, Ralf Wildenhues wrote:

Hello,

I think most issues were already cleared up in this thread.

* John Calcote wrote on Sun, May 03, 2009 at 06:58:09PM CEST:
   

It appears that Libtool is smart enough to detect ridiculous cases, but
it should probably throw an error of some sort, rather than simply
generate code with a different version number.
 


I agree with Jan that it is not safely possible for libtool to detect
such errors.  It detects bogus triplets such as 3:0:5, but even if it
were to look at the prior uninstalled or installed version of the
library it is about to (re)create, there is nothing that reveals whether
the triplet in the prior version was wrong, rather than the one to be
used.  So, while we could output a warning such as
   libtool: warning: library version `...' not compatible with previous `...'

I'm not sure how much good it would do.
   
When I said `ridiculous cases' I really meant bogus triplets. I didn't 
think there was much you could do about valid triplets that are simply 
incorrect. I should think that Libtool might fail a build if a bogus 
triplet is passed, however.


One thing that bothers me a little is that we never really did solve 
Gerald's original problem. He said his library was created just fine 
when he was passing 2:0:0, but when he switched to 2:0:1, it created a 
library with a version number of 1:1:0. Now, why would Libtool do this? 
Granted, he didn't really want 2:0:1, but 2:0:1 isn't a bogus triplet, 
either. So why did Libtool convert it to 1:1:0?


John


Re: Setting shared lib version not functioning

2009-05-03 Thread John Calcote

Hi Gerald,

On 5/3/2009 9:51 AM, Jan Engelhardt wrote:

On Sunday 2009-05-03 17:41, Gerald I. Evenden wrote:

   

libproject_la_LDFLAGS = -version-info 2:0:1

which worked fine when with previous loading of a library with 2:0:0
versioning code.

But now, when I go through the autoreconf, configure, compile and install I
get:

libproject.so.1.1.0
 


Which is absolutely correct. Either you wanted 2:1:0, or 3:0:1 (just
a guess though, I can't read minds).
Have a look at `info libtool`, section Versioning::.
   

Hmmm. Jan is correct in his analysis of your versioning strategy.

current : revision : age

You really have no reason to increment only the age value of a library 
version. What you're implying by this new version of 2.0.1 is that this 
particular instance of your library is identical in every way to the 
previous version of 2.0.0, except that this one is now backward 
compatible with a prior version (1.x.y).


If you made changes to the library, but did not in any way modify the 
interface, then you probably want to use version 2.1.0. If you modified 
the interface in a manner that's 100% backward compatible with the 
previous version, then you probably want 3.0.1. If you modified the 
interface in a manner that is NOT backward compatible (i.e., you removed 
an API function), then you probably want 3.0.0.
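
The update algorithm from the libtool manual, paraphrased for reference:

# given -version-info current:revision:age, at each public release:
# 1. any source change at all:                 revision++
# 2. any interface added, removed, or changed: current++, revision = 0
# 3. interfaces only added:                    age++
# 4. any interface removed or changed:         age = 0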


It appears that Libtool is smart enough to detect ridiculous cases, but 
it should probably throw an error of some sort, rather than simply 
generate code with a different version number.


Adding the libtool list.

Regards,
John


Re: Setting shared lib version not functioning

2009-05-03 Thread John Calcote

Gerald,

On 5/3/2009 12:40 PM, Gerald I. Evenden wrote:

I want to thank you all for the assistance, however I still find the libtool
manual not very illuminating.  In particular, I used section 7.3 in making my
release number and, in particular, item 5 related to adding an interface
since last release as causing an addition to age.

The big problem here is that the three numbers do not seem independent, which
compounds the problem.  Perhaps item 3 was what should have changes but item
5 clause was also true.
   
The numbers aren't independent at all. Don't try to make the library 
interface version components into a software revision number. The scheme 
is the way it is because the OS library loader uses the values in these 
fields to determine if a particular library will support a particular 
interface request. When a program attempts to load a library versioned 
1.0.0, but the OS loader can only find 2.0.1, then the loader will go 
ahead and load 2.0.1 because the '1' in the age field indicates that 
this library supports 1.x interfaces as well as 2.x.


Jan answers the rest of your question in his response.

Regards,
John




Re: noinst_TEXINFOS

2009-04-29 Thread John Calcote

On 4/29/2009 5:27 PM, Ben Pfaff wrote:

Stefan Bienertbien...@zbh.uni-hamburg.de  writes:

   

Could it be that a primary

noinst_TEXINFOS

does not work with automake 1.10.2?
 


This seems likely.  I reported the same problem some time ago:
  http://permalink.gmane.org/gmane.comp.sysutils.automake.bugs/4046
My report did not receive any replies.
   
I believe this should work; there's no reason I can think of for it not 
to. But I'm wondering: why would you create texinfos and then not 
install them? This is probably how the bug got there - it's not a very 
likely use case, is it?


John


Re: Create a custom target

2009-04-24 Thread John Calcote

See my online Autotools book at freesoftwaremagazine:

  
http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool


Chapter 4 is all about automake. This book is due to be published by No 
Starch Press in October 09.


John

On 4/24/2009 12:16 AM, automake wrote:

Hi John

Thanks a lot, it worked and it made my day :-)

I would really appreciate if you could pass me some useful links regarding
automake.

I wonder where do we get those big target list automatically appended to
all. I would like to know more about automake.




Sure, just add it to the all-local target as a dependency, like this:

all-local: extra

--john





   






Re: Create a custom target

2009-04-23 Thread John Calcote

On 4/22/2009 8:54 PM, automake wrote:

Hi
  I have a similar problem with giving a customized target. I have included
target into Makefile.am

as

extra:
   ...nm
   ...link

But the default target for makefiles from configure.ac is all.  Is there a
way to add this target to all sub-targets?
   

Sure, just add it to the all-local target as a dependency, like this:

all-local: extra

--john




Re: Example on JNI compilation

2009-04-20 Thread John Calcote

On 4/18/2009 3:08 PM, LCID Fire wrote:
I'm currently stuck with compiling a JNI library, which java does not 
recognize. I'm not too sure about what options I have to provide to 
automake and which are already builtin. Does anybody know an example 
of how a jni lib is built using automake?

There are basically two steps to building JNI libraries:

1. Use the javah utility to generate JNI prototypes in C-language header 
files from your Java source code.
2. Compile the C-language JNI sources (including the headers generated 
in step 1) into a library.


Step 1 above is pure Java-speak, and Automake has little built-in 
functionality for it. Step 2, however, is pure gcc, and Automake has no 
trouble with it. For an example of how to integrate javah operations 
into Makefile.am so you can do it all from Automake, see Chapter 6 of 
this online book at freesoftwaremagazine.com:


  http://www.freesoftwaremagazine.com/books/agaal/autotools_example

Search for the text, Building the JNI C++ sources.
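
In outline, the Makefile.am glue amounts to something like this (a rough 
sketch; the class, header, and variable names are made up, and the JNI 
include flags must be discovered by configure):

# step 1: generate the JNI header from the compiled class
BUILT_SOURCES = com_example_Native.h
com_example_Native.h: com/example/Native.class
        javah -o $@ -classpath . com.example.Native

# step 2: build the C sources as a dynamically loadable module
lib_LTLIBRARIES = libnative.la
libnative_la_SOURCES = native.c
libnative_la_CPPFLAGS = $(JNI_CPPFLAGS)
libnative_la_LDFLAGS = -module -avoid-version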

Regards,
John




Re: DESTDIR vs `make install exec_prefix='

2009-04-18 Thread John Calcote

On 4/18/2009 2:32 PM, Jan Engelhardt wrote:

On Saturday 2009-04-18 22:06, Russ Allbery wrote:
   

Russ Allberyr...@stanford.edu  writes:
 

Ralf Wildenhuesralf.wildenh...@gmx.de  writes:
   

[1] I'm asking because Automake 1.11 will reliably not install files if
their respective installation directory is empty.  This is not yet
functional in Automake 1.10.2.  The test for emptiness in 1.11 will not
consider $(DESTDIR) of course, only $(bindir) etc.
 

I must have misunderstood this, since it sounds like it would potentially
break any system where each binary package is installed into a separate
tree.  For example, stow packages are routinely installed with:

 ./configure --prefix=/usr/local
 make
 make install prefix=/usr/local/stow/package-version

DESTDIR cannot easily be used here because the stow layout should have
simple bin, etc, lib, etc. directories directly under that directory, not
with an extra /usr/local in the way.
   

Oh, you mean if the value of the *variable* is empty (following the
thread), not the directory itself.  D'oh, sorry, that should have been
obvious to me from context.  Never mind.  :)
 


No, I also thought of empty directories. Quote from above: respective
installation directory is empty.. But the puzzling thing was that
when using DESTDIR=$foo, $foo would normally be empty anyways
(in automated build systems like lbuild/koji/etc.)
   
I'm sure he meant "...respective installation directory *variable* is 
empty." Doing anything based on whether the directory itself was empty 
is meaningless, because the maintainer may have chosen to use the pkglib 
directory, for example, in which case, it will almost always be empty 
before you install - even to standard places - unless you're overwriting 
a previous installation, that is.


John


Re: Doxygen and Autotools

2009-04-13 Thread John Calcote

Hi Lorenzo,

Please see my on-line Autotools book at freesoftwaremagazine.com. It 
covers extensively the use of doxygen as an add-on to Autoconf and 
Automake.


http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool

Chapters 6 and 7 deal more specifically with the topic you're interested 
in (doxygen).


A much-updated version of this book will be published later this year by 
No Starch Press.


Regards,
John

On 4/12/2009 4:17 AM, Lorenzo Bettini wrote:

Hi

I've just started using doxygen for documenting a C++ library which 
I'm developing with autotools.


I found this macro 
http://autoconf-archive.cryp.to/ax_prog_doxygen.html and this example 
http://www.bioinf.uni-freiburg.de/~mmann/HowTo/automake.html


however, from what I understand, the macro and the example do not deal 
with installation of doxygen generated documentation, do they?


any suggestion for using doxygen with autotools?

thanks in advance
Lorenzo







Re: aclocal problems

2009-04-10 Thread John Calcote


By the way, you may be interested in seeing how I was able to use
m4_define and still get aclocal to use my file for gnulib's AC_DEFUN_ONCE
replacement (coupled with an AC_REQUIRE([gl_00GNULIB]) in gnulib-common.m4):
http://git.savannah.gnu.org/cgit/gnulib.git/tree/m4/00gnulib.m4
http://git.savannah.gnu.org/cgit/gnulib.git/commit/?id=fcf62c3d
   
Yes, I thought this might have been the trick you used: AC_DEFUN a macro 
with no expansion in the same file, just to get aclocal to include it. 
Way to work around the rules! :-)
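
For anyone else reading along, the pattern looks something like this
(macro names invented for the sketch):

# my_macros.m4 - MY_HELPER is a plain m4_define, but the empty AC_DEFUN
# below is enough to convince aclocal to copy this file into aclocal.m4.
m4_define([MY_HELPER], [AC_MSG_NOTICE([helper expanded with $1])])
AC_DEFUN([MY_HELPER_SENTINEL], [])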


Regards,
John



Re: Finding library procedures in /usr/local/lib/

2009-04-03 Thread John Calcote

On 4/3/2009 8:49 AM, Gerald I. Evenden wrote:

On Thursday 02 April 2009 5:56:52 pm Peter Johansson wrote:
   

Hello Gerald,

Gerald I. Evenden wrote:
 

After trying so many options related to libraries I am exhausted.

I have a simple program that needs to link with a shared library
installed in /usr/local/lib.

When using my own simple Makefile and simply adding -lproject -lm
everything works fine (libproject is the shared library).
   

LDADD = -lm -lproject

in your `Makefile.am' should do it.

Cheers,
Peter
 


Of the suggestions offered, this one worked in the following literal entry
into src/Makefile.am:

geodesic_LDADD = -lproject -lm
   
No offense intended to Peter, but this solution works because it simply 
assumes the library exists on the end-user's system. On systems where it 
doesn't exist in the default library paths, the build will fail with a 
linker error. The entire purpose of Autoconf checks is to ensure that 
the environment is actually able to build the project. If this solution 
is acceptable to you, then why even bother with configure? Why not 
simply write a makefile to build your project?
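
For the record, a proper check only costs a few lines. A minimal
configure.ac sketch - the function name here is just a placeholder for
any symbol actually exported by libproject:

AC_SEARCH_LIBS([pj_init], [project],
  [],
  [AC_MSG_ERROR([libproject is required but was not found])],
  [-lm])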


Regards,
John


Re: problem to create a noinst_LTLIBRARIES shared libraray

2009-04-03 Thread John Calcote

Andreas,

On 4/3/2009 3:26 AM, Andreas Otto wrote:

  I currently writing a java JNI extension used only for local check and
  this library should *not* be installed.
   

...

Question: what can I do to get a shared LTLIBRARIES using the noinst
prefix ?
   


Use check_LTLIBRARIES instead of noinst_LTLIBRARIES.
Check libraries and programs are not installed either.
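
Something along these lines should work (the names are invented, and
note that libtool normally builds only a static convenience archive for
a non-installed library, so coaxing a shared object out of it usually
takes an explicit -rpath - an assumption about your setup, not something
check_LTLIBRARIES does by itself):

check_LTLIBRARIES = libjnitest.la
libjnitest_la_SOURCES = jnitest.c
# -module/-avoid-version suit a dlopen()'d JNI library; the -rpath is
# only there to force a shared build.
libjnitest_la_LDFLAGS = -module -avoid-version -rpath $(abs_builddir)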

Regards,
John




Re: Finding library procedures in /usr/local/lib/

2009-04-03 Thread John Calcote

On 4/3/2009 12:29 PM, Ralf Wildenhues wrote:

Hello Gerald,

* Gerald I. Evenden wrote on Fri, Apr 03, 2009 at 08:11:22PM CEST:
   

One added note, that bothers me a little.

If the system checks for an entry being present in a particular iibrary by
compiling/linking a test program using the function *and* linking to the
specified library,  what if the library under test heavily references
another library such as -lm??  IF -lm is not in the test run would the test
not fail???  Thus the entry under test fails also.
 


I haven't read the thread in full, but this is probably the issue
bothering you:

AC_CHECK_LIB and AC_SEARCH_LIBS both have an optional 5th argument where
one can supply additional needed libraries.  So if libfoo needs libm,
then a check for libfoo could look like

   AC_SEARCH_LIBS([function_from_libfoo], [foo], [], [], [-lm])
   
I sure don't know what's happening to my email messages lately. This is 
the third time this month that some response of mine has apparently been 
completely lost by Google mail. I sent this response to this thread last 
night (via Mozilla Thunderbird client), which goes hand-in-hand with 
Ralf's response above, but provides more details:


- - - - - -

I presume that the reason you link with both libproject and the math 
library is because libproject requires libm. This could explain why your 
configure.ac tests are failing to find libproject. Try adding this 
test to configure.ac:


AC_SEARCH_LIBS([any_visible_function_in_libproject],[project],[AC_DEFINE([HAVE_LIBPROJECT])],,[-lm]) 



Tests in configure.ac look for libraries by actually building programs 
that attempt to link the specified symbol from the specified list of 
libraries. If the program links, then the test succeeds. The default 
action in successful cases for AC_SEARCH_LIBS is to add -lproject to 
the LIBS variable, which is automatically added to your compiler command 
line (at least by Automake-generated makefiles). If the library that 
AC_SEARCH_LIBS attempts to link to requires other non-default libraries 
(like libm, for instance), then you have to add this list of linker 
commands to the other-libraries argument, or the test will fail, even 
if the function is found in the desired library.


The documentation for AC_SEARCH_LIBS indicates that, on successfully 
testing for the desired library, this macro prepends -lproject to LIBS 
and then executes the shell code in the action-if-found parameter; 
thus, you don't need to add -lproject to LIBS yourself, because the 
macro does this before any additional shell code you specify is executed.


You can also use the following macro, which generates shell code that is 
a little less complex. But it's a bit harder to use correctly, as you 
have to write the entire action-if-found functionality yourself. The 
carriage returns are fairly important here:


AC_CHECK_LIB([project],[any_visible_function_in_libproject],
[LIBS="-lproject $LIBS"
AC_DEFINE([HAVE_LIBPROJECT])],,[-lm])

AC_CHECK_LIB has no success functionality that executes even if you 
supply the action-if-found argument. All of its success functionality 
is given by the default value of that argument. Thus, if you supply the 
argument, you have to supply all of the appropriate functionality for a 
successful check yourself. In this case, the macro is supposed to prepend 
-lproject to LIBS, and then define HAVE_LIBPROJECT.


You might also check the config.log file for failed tests that you 
believe should work. Each failed test is listed in full in config.log, 
along with the output of the compiler and linker, so you can probably 
easily see why the test failed.


Regards,
John


aclocal problems

2009-04-03 Thread John Calcote

Automake maintainers,

On page 158, paragraph 3 of the 2.63 Autoconf manual, it states:

If a macro doesn’t use AC_REQUIRE, is expected to never be the object 
of an AC_REQUIRE directive, and macros required by other macros inside 
arguments do not need to be expanded before this macro, then use m4_define.


So the Autoconf manual is encouraging users to use m4_define, however, 
when I define a macro using m4_define in a .m4 file in the m4 directory 
of my project, aclocal ignores it by not m4_including it in the 
generated aclocal.m4 file. It appears to require the use of AC_DEFUN, 
rather than m4_define in stand-alone .m4 files.


Is this a bug in aclocal?

I'm using the latest beta version of Automake - 1.10b.

Thanks in advance,
John




Re: aclocal problems

2009-04-03 Thread John Calcote

On 4/3/2009 5:31 PM, Ralf Wildenhues wrote:

Hi John,

* John Calcote wrote on Fri, Apr 03, 2009 at 09:33:40PM CEST:
   

On page 158, paragraph 3 of the 2.63 Autoconf manual, it states:

If a macro doesn’t use AC_REQUIRE, is expected to never be the object
of an AC_REQUIRE directive, and macros required by other macros inside
arguments do not need to be expanded before this macro, then use
m4_define.
 

...

Is this a bug in aclocal?
 


I don't think so.  Do you think the quote is an encouragement not to use
AC_REQUIRE?  For public macros, I'd even venture to say that they should
be written so they can be AC_REQUIREd (if they don't take arguments), or
at least, that other macros which are expanded inside their contents or
their arguments, may themselves AC_REQUIRE yet other macros which are
then expanded outside of all this mess.
   
Hmmm. No, I don't think it's an encouragement not to use AC_REQUIRE. It 
simply states that if you don't use the prerequisite framework, there's 
no reason to use AC_DEFUN. I suppose that, from a certain point of view 
(a rather sarcastic one), it could be saying something like, "if ever 
you could conceive of a situation in which you wouldn't need to use 
AC_REQUIRE, then go ahead and use m4_define."


I agree completely with your assessment, but I think the manual should 
make it clear that the only proper way to write a macro is with 
AC_DEFUN, don't you? I mean, if the only way I can write a macro outside 
of adding it directly to configure.ac (which is pointless in all but the 
strangest cases) is to use AC_DEFUN, then *when* would I ever be able to 
successfully use m4_define? I suppose it might work in acsite.m4, as 
that's not included by aclocal.m4.


My only point is that the manual is a bit unclear on this point - almost 
misleading, in fact.


I'd call it a documentation bug at this point. (Eric - comments?)

Regards,
John


Re: Doxygen + Autotools integration

2008-10-03 Thread John Calcote
Stefano D'Angelo wrote:
 2008/10/3 Stefano D'Angelo [EMAIL PROTECTED]:
   
 2008/10/2 John Calcote [EMAIL PROTECTED]:
 
 You may wish to check out my book on Autotools, hosted by Free Software
 Magazine. In there, I make several references to doxygen, and use code
 snippets to implement doxygen targets within the sample projects and
 examples provided in the book.

 http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool
   
 Well... at first sight the whole thing seems a bit simplistic to
 me, maybe also because I'm generating html, ps, pdf, dvi and man
 output with Doxygen. I'll give it a better look tomorrow maybe.
 

 Ok, read it. I have to congratulate you on the manual, which
 looks way better than the autobook (and less outdated of course),
 but as said, the proposed solution wouldn't satisfy my needs.

 Other than that, I think you could have used AC_CHECK_PROG instead of
 AC_CHECK_PROGS to look for the doxygen binary, since it already does
 the -z test on the variable used to store its path, thus you wouldn't
 need the AC_PROG_TRY_DOXYGEN macro.
Stefano,

I apologize for my lack of communication here. I didn't mean to ever
imply that my solution was the best one out there. I was simply giving
you some more resources to look at for building your generic doxygen
solution. I only hope it was helpful.

Cheers,
John




Re: Doxygen + Autotools integration

2008-10-02 Thread John Calcote
You may wish to check out my book on Autotools, hosted by Free Software
Magazine. In there, I make several references to doxygen, and use code
snippets to implement doxygen targets within the sample projects and
examples provided in the book.

http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool

I throw these references out into the list sometimes because people come
and go, and every few months, I notice an increasing number of
questions regarding topics that are mentioned in detail in the book.

Regards,
John

Peter Johansson wrote:
 Hi Stefano,

 Stefano D'Angelo wrote:
 2008/10/2 Peter Johansson [EMAIL PROTECTED]:
  
 Hi Stefano,
 

 Hi Peter,

  
 Have you checked the macro written by Oren Ben-Kiki that is
 available from
 the Autoconf Macro Archive:
 

 Yes (I also named that in my previous e-mail -
 http://www.ben-kiki.org/oren/doxample)
 Sorry, I missed that.

 , and I'm not the only one to
 think that it sucks on many fronts.
   
 Well, I have to admit that when I looked at it a couple of years ago,
 I chose to not use it.

 Looks like what I instead created is similar to your example. Will
 have a closer look at your Makefile.am to see if I can pick up any
 improvements. Thanks.


 Cheers,
 Peter









Re: Problems with conditional sources

2008-08-26 Thread John Calcote

Andreas Schwab wrote:
 John Calcote [EMAIL PROTECTED] writes:
 
 Andreas Schwab wrote:
 John Calcote [EMAIL PROTECTED] writes:

 Make is a two-pass utility. The first pass completely assimilates all
 macro data specified in the Makefile. THEN, the second pass generates
 the rule dependency tree.
 This is not true.  Variable references in target and dependency lists are
 expanded when they are read.  Any later redefinition will not change
 them.

 Andreas.

 This is only true if you use ':=' assignment.
 
 You are mistaken.  RTFM.
 
 Andreas.
 
Andreas,

My mistake. I apologize. I actually have RTFM, but it's been a while. :)

John




Re: Problems with conditional sources

2008-08-25 Thread John Calcote

David Sveningsson wrote:
 Hi, I am having some problems with conditional sources. This is what I
 have in Makefile.am:
 
 lib_LTLIBRARIES = libfoo.la
 libfoo_la_SOURCES = foo.cpp
 if WANT_BAR
 libfoo_la_SOURCES += a.cpp
 else
 libfoo_la_SOURCES += b.cpp
 endif
 
 AM_CPPFLAGS = -I${top_srcdir}/include
 libfoo_la_LDFLAGS = -version-info 0:0:0
 
 I have been reading both autoconf and automake manuals and as far as I
 can see the above should work. However the files (a.cpp or b.cpp) is
 always added at the bottom of the generated Makefile and are therefore
 not used in the compilation. No matter what I try I cannot get even the
 above code to generate a correct makefile but obviously I am doing
 something wrong.

David,

Make is a two-pass utility. The first pass completely assimilates all
macro data specified in the Makefile. THEN, the second pass generates
the rule dependency tree. AFTER the second pass, the tree is evaluated
to see what needs to be rebuilt.

Thus, it doesn't matter where macros are assigned, as value assignments
to macros are managed in the first stage, before any macros are ever
expanded when the dependency tree is evaluated.

Yes, as you've noticed Makefile.am conditional (and non-conditional)
macro assignments are added to the bottom of the generated Makefile.in
(and consequently to the bottom of the Makefile itself), but the point
is that it doesn't matter where they're added. The value of an expanded
macro always takes into account the final assignment in the Makefile.
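
A tiny illustration of the point (GNU make semantics):

# Prints "a.c b.c", even though the += appears after the rule, because
# $(SOURCES) isn't expanded until the recipe actually runs.
SOURCES = a.c
all:
	@echo $(SOURCES)
SOURCES += b.c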

Regards,
John




Re: Problems with conditional sources

2008-08-25 Thread John Calcote

 While that would make sense but my problem originates from the fact that
 the conditional files isn't compiled.
 
 After some hacking into the generated Makefile I noticed that the files
 wouldn't be compiled even if I inserted them into libfoo_la_SOURCES
 manually. However, I found that I had to inserting them into
 am_libfoo_la_OBJECTS with the lo extension to make it work. Shouldn't
 automake manage this?
 
 I am just learning autotools so it might be a very simple mistake.

Yes, I'm so sorry. I got so interested in addressing the issue of macro
location in the Makefile, I forgot to answer your original question.

Frankly, from what you've presented, I can't see any reason why you're
experiencing that behavior. You're absolutely right - automake should
(and usually does) handle conditionals without a hitch.

How does WANT_BAR get assigned? Do you have an AM_CONDITIONAL statement
for WANT_BAR in your configure.ac file?
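
Just as a sketch, I'd expect something like this in configure.ac (the
--enable-bar switch is a guess at how you drive it):

AC_ARG_ENABLE([bar],
  [AS_HELP_STRING([--enable-bar], [build with a.cpp instead of b.cpp])])
AM_CONDITIONAL([WANT_BAR], [test "x$enable_bar" = xyes])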

Perhaps you could send along relevant portions of the generated
Makefile. I know they're big, but just clip out the stuff you think is
important.

John




Re: Problems with conditional sources

2008-08-25 Thread John Calcote

Andreas Schwab wrote:
 David Sveningsson [EMAIL PROTECTED] writes:
 
 Hi, I am having some problems with conditional sources. This is what I
 have in Makefile.am:

 lib_LTLIBRARIES = libfoo.la
 libfoo_la_SOURCES = foo.cpp
 if WANT_BAR
  libfoo_la_SOURCES += a.cpp
 else
  libfoo_la_SOURCES += b.cpp
 endif

 AM_CPPFLAGS = -I${top_srcdir}/include
 libfoo_la_LDFLAGS = -version-info 0:0:0

 I have been reading both autoconf and automake manuals and as far as I can
 see the above should work. However the files (a.cpp or b.cpp) is always
 added at the bottom of the generated Makefile and are therefore not used
 in the compilation. No matter what I try I cannot get even the above code
 to generate a correct makefile but obviously I am doing something wrong.
 
 Remove the indentation.

Duh. Of course. But the actual answer is to not indent with TAB characters.

John




Re: Problems with conditional sources

2008-08-25 Thread John Calcote

Andreas Schwab wrote:
 John Calcote [EMAIL PROTECTED] writes:
 
 Make is a two-pass utility. The first pass completely assimilates all
 macro data specified in the Makefile. THEN, the second pass generates
 the rule dependency tree.
 
 This is not true.  Variable references in target and dependency lists are
 expanded when they are read.  Any later redefinition will not change
 them.
 
 Andreas.
 

This is only true if you use ':=' assignment. In this case (only),
variable *assignments* are expanded immediately. But regular rule or
command references are only expanded when the rule or command is
evaluated (as opposed to read).

John




Troubles with pkgdata_DATA primary...

2008-06-18 Thread John Calcote

I have a simple makefile.am in my ftk/docs directory. All it's trying to
do is build and tar up doxygen docs:

ftk/docs/Makefile.am:
---
docpkg = $(PACKAGE)-doxy-$(VERSION)

pkgdata_DATA = $(docpkg).tar.gz

$(docpkg).tar.gz: doxygen.stamp
	tar chof - html | gzip -9 -c > $@

doxygen.stamp: doxyfile
	$(DOXYGEN) $<
	echo Timestamp > $@

CLEANFILES = doxywarn.txt doxygen.stamp $(docpkg).tar.gz

clean-local:
	-rm -rf html
---

When I hook this Makefile.am into the build system (via
AC_CONFIG_FILES), I get the following from make distcheck:

Output:
---
$ cd build; ../configure  make distcheck
...
Making uninstall in docs
make[2]: Entering directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build/docs'
 rm -f
'/tmp/am-dc-2299//home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_inst/share/ftk/ftk-doxy-1.1.tar.gz'
make[2]: Leaving directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build/docs'
make[2]: Entering directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
make[2]: Nothing to be done for `uninstall-am'.
make[2]: Leaving directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
make[1]: Leaving directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
make[1]: Entering directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
make[1]: Leaving directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
make[1]: Entering directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
{ test ! -d ftk-1.1 || { find ftk-1.1 -type d ! -perm -200 -exec chmod
u+w {} ';'  rm -fr ftk-1.1; }; }
test -d ftk-1.1 || mkdir ftk-1.1
cp: cannot create regular file `ftk-1.1/docs/doxywarn.txt': Permission
denied
cp: cannot create regular file `ftk-1.1/docs/doxyfile': Permission denied
cp: cannot create regular file `ftk-1.1/docs/Makefile': Permission denied
cp: cannot create directory `ftk-1.1/docs/html': Permission denied
cp: cannot create regular file `ftk-1.1/docs/doxygen.stamp': Permission
denied
cp: cannot create regular file `ftk-1.1/docs/ftk-doxy-1.1.tar.gz':
Permission denied
make[1]: *** [distdir] Error 1
make[1]: Leaving directory
`/home/jcalcote/dev/prj/flaim/ftk/build/ftk-1.1/_build'
make: *** [distcheck] Error 2
---

Do I misunderstand the use of the DATA primary? Perhaps DATA files are
not allowed to be built? I can't find anything in the Automake docs
about required properties of DATA files, and I don't get enough output
from make to determine where this problem is occurring in the Makefile.

One clue - I tried running this command using a -d on the make command
line. Of course, I got tons more output, but one interesting clue was
that make told me just before the failed copy commands above that
destdir needed to be rebuilt...strange.

Thanks in advance,
John




Re: Troubles with pkgdata_DATA primary...

2008-06-18 Thread John Calcote

Never mind ... duh ... I forgot to take the docs directory out of the
EXTRA_DIST variable when I added it to the SUBDIRS variable.

Thus (I'm guessing) the distdir code was trying to create files that
were already in place, and marked read-only by the distcheck code.

John Calcote wrote:
 I have a simple makefile.am in my ftk/docs directory. All it's trying to
 do is build and tar up doxygen docs:
 
 [...]




Re: Makefile.in's in build tree

2008-06-13 Thread John Calcote

Kamaljit Singh wrote:
 Hi,
 
   I maintain my build tree separate from the src tree, which has all the 
 Makefile.am's.
 Is it possible that all generated files (including the Makefile.in's and the 
 various cache) be in the build tree ?
 
 kamaljit
 

There are three levels of files here:

1. Level 1 includes files that are generally checked into the source
repository. These are hand-edited by the developers. They usually
can't be built any other way.

2. Level 2 includes build system files that are generated or installed
by the autotools. These are also part of the source tree, but not
necessarily part of the final application. They have to stay in the
source tree, because they get packaged up with distribution tarballs.

3. Level 3 includes build products, such as Makefiles, object files,
libraries, programs, etc. These files are built by the user from Level 1
and 2 files shipped in distribution tarballs.

Makefile.in files, configure scripts, auxiliary shell scripts, etc are
autotools-generated or installed build files that are shipped in
distribution tarballs, so they are part of the level 2 file set. As
such, they must remain in the source tree. In fact, they appear to be
part of the source tree to end users who unpack tarballs and find them
there.

Regards,
John





Re: Extra_dist cannot hold two long lists?

2008-06-11 Thread John Calcote

A117 wrote:
 problem is, stdafx.* and targetver.* etc. should not go into this library,
 since they are to be used by the VC project only.
 I guess they have to go to EXTRA_DIST.
 And, if I put tiny*.cpp into ..._SOURCES, the content in them will be exposed
 in the library, right? That's not what I wanted. I only want them to be
 linked, but not exported. Under Windows VC, I can specify which items to be
 exported, but using Automake, I don't know how.
 I did not expect to receive a response so quickly. Thank you very much.

In Linux, all symbols are exported by default, whereas in Windows, you
must somehow indicate which symbols you want to export, using either
__declspec(dllexport), or a .def file.

In Linux/Unix you can specify exports on a more granular level, but it's
a fair amount of work to do so, and it depends on what you mean by
"export," because there are about 3 different levels of export. Check
out the man page for gcc - search for the keyword "export".

Note also that these options are NOT portable. Some compilers and
linkers simply don't give you the option of such granularity.
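
One middle ground is to let libtool filter the exports at link time; a
one-line Makefile.am sketch, with an invented symbol pattern:

libfoo_la_LDFLAGS = -export-symbols-regex '^foo_'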

Regards,
John







Re: How to define a `prefix' other than the default vaule?

2008-06-06 Thread John Calcote

Steven Woody wrote:
 Hi,
 
 I want all my stuff by default installed in /usr/local/mypkg unless
 user specifies another 'prefix' value when running the 'configure' script.
 How can I?  Thank you.


Try AC_PREFIX_DEFAULT([/usr/local/mypkg]) in your configure.ac file.

Regards,
John




Re: preprocessor output target

2008-06-05 Thread John Calcote

Brian Dessent wrote:
 Note however that if you wish to override a variable on the make command
 in order to add an option, you have to specify the full contents of the
 current value (which for CFLAGS is typically -g -O2 if using gcc and you
 didn't explicitly specify a CFLAGS when configuring.)  If you just do
 make CFLAGS=-save-temps then you lose all the settings that the build
 system might have added, which could be significant.  In those cases
 where CFLAGS is nontrivial I first look at the generated Makefile to see
 what the substituted value is, and then to that I add or remove whatever
 flags I'm interested in.  I wish there was a way to invoke make with a
 += override instead of a = override but this does not exist AFAIK.

This is the primary reason, as I understand it, for the statement made
in the Autoconf manual that these variables (CFLAGS, LDFLAGS, CC, etc.)
are "user variables" and shouldn't be set to anything by the Autotools.
The default for normal builds is usually something simple like -g -O2.
Autotools scripts should use tool-specific variables, such as AM_CFLAGS,
etc.
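
For example, a minimal sketch:

# Flags the package itself needs belong in AM_CFLAGS; the user's CFLAGS
# (e.g. "make CFLAGS=-save-temps") is passed after it and still wins.
AM_CFLAGS = -Wall
bin_PROGRAMS = myprog
myprog_SOURCES = main.c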

Regards,
John




Re: Building things a little differently?

2008-06-02 Thread John Calcote

Ralf,

Ralf Wildenhues wrote:
 Automake has no builtin rules for moc files.  So you need to take this
 up with whoever provides those rules.  FWIW, in one package this is what
 we use:

I was wondering how difficult it would be to modify Automake such that
true extensions could be written. For example, Automake has built-in
support for Libtool's LTLIBRARIES primary. Wouldn't it be cool to
support a type of primary extension file that would allow one to define
a new type of primary? This file would provide the rules that a new
primary would support, lists that it would update - like the distro file
list, etc.

Just a thought. Would this be particularly difficult?

Regards,
John




Re: Automake TESTS question

2008-05-19 Thread John Calcote

Hi Ralf,

Ralf Wildenhues wrote:

 You can use
   check_SCRIPTS = greptest.sh


This is what I finally settled on, based on your answers, and a little
digging...

- - - - - - - - - - - - - - - - -
check_SCRIPTS = greptest.sh
TESTS = $(check_SCRIPTS)

greptest.sh:
	echo 'myprog | grep "myprog text"' > greptest.sh
	chmod +x greptest.sh

CLEANFILES = greptest.sh
- - - - - - - - - - - - - - - - -

...which isn't too bad. :) I don't know why the chmod command didn't
work the first time around. It seems that often when something like this
doesn't work and you give up on it, then later you start thinking about
it and gain a bit more faith in your original idea--you try it again,
and presto! it works. Clearly, I had some syntax wrong the first time
that slipped past me.

 Note that some make implementations have a builtin implicit rule to
 update greptest from greptest.sh (which may or may not affect your
 package).


Oh, this explains why Autotools uses install-sh, rather than install.sh.


Thanks,
John






Automake TESTS question

2008-05-16 Thread John Calcote

I've got a simple unit test scenario - I want to run my program and
verify some text on its single output line. In a makefile, I might do this:

check:
	myprog | grep "myprog text"
	@echo PASSED!

In Automake, what's the simplest way to do this? I've found this way:

EXTRA_DIST = greptest.sh
TESTS = greptest.sh

But this requires me to:

1) Actually HAVE a one-line script file called greptest.sh as part of
my package.

2) Use the EXTRA_DIST line. Why is this? Can't Automake tell from the
TESTS line that this is a file I need in my distribution?

In my experiments, I've also discovered this method:

TESTS_ENVIRONMENT = $(SHELL)
TESTS = greptest.sh

greptest.sh:
	echo 'myprog | grep "myprog text"' > greptest.sh

clean-local:
	-rm greptest.sh

But this seems overly complex for a one-liner. In addition, I had to add
the TESTS_ENVIRONMENT line because I could NOT get chmod to set
greptest.sh to executable. The make utility just seemed to ignore the
command if I added it after the echo line.

The clean-local line was necessary in this case because make distcheck
 complains that greptest.sh was left lying around, and the - in front
of rm was necessary because otherwise make clean will fail if make
check isn't run first.

Anyone know of a more concise way of doing this? Frankly, I'd like to
just run the script inline somewhere, but I can't see a way to do it
with Automake. It wants all TESTS to be real files.

Thanks,
John




Re: Report to stdout like Linux kernel compilation does

2008-04-11 Thread John Calcote
Stefan,

I asked this very question a few years ago on this list.
Interestingly, my examples came not from the Linux kernel build
process, but from Windows builds, which use a similar output format. I
love this format because warnings and errors are obvious, and yet you
get enough output per file to tell you that something's going on.
Often, in standard GNU/Unix/Linux build processes, warnings just zip
right by without much notice.

My question to the list was: "How can I get Autotools builds to be
quiet, so I can see the warnings?" The response that I got was that I
should just redirect stdout to /dev/null on the make command line.

For the next couple of years, I was very frustrated with this
response. I thought it was a cop-out for just not wanting to provide
the functionality. Then I realized that it was really the Unix way.
You want to see everything so that you know what's going on. When you
want to clean up the warnings (usually something done near the end of
a development cycle), you simply redirect stdout to /dev/null the next
few times you run make, and you'll see the warnings appear, because
they're output to STDERR, not STDOUT.
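
That is, something as simple as:

  make >/dev/null

run a few times during development is enough to surface them.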

Now -- that said, I really see nothing wrong with my original request,
in the form of an Automake switch. It would be nice to be able to tell
Automake to build Makefiles that generate this sort of output.
Unfortunately, you and I aren't going to get much agreement, I think.

Perhaps you could write a patch to Automake providing a macro or
switch that generates such Makefiles... This is also the GNU way. :)

Regards,
John

On Fri, Apr 11, 2008 at 5:42 AM, Stefan Bienert
[EMAIL PROTECTED] wrote:
 Hi automakers,

  probably this is an often asked and therefore annoying question ... but is
 there a simple way to get automake to produce output similar to the Linux
 kernel makefiles?
  Instead of this verbose confusing output like automake makefiles, the
 kernels make produces something like
  [ compile object1 ]
  [ compile object2 ]
  ...
  [ building module ]

  and so on.
  For me it would be nice to have something like a switch in the configure
 script to enable/disable verbose/minimalistic information. E.g. first try
 with min. info, run into an error, therefore rerun with extensive info,
 search for the min. statement on which the error occurred and voila!
  Like this:
  ./configure --disable-verbose
  make
  [ compile object1 ]
  [ compile object2 ]
  error on building object 2!

  ./configure
  make | grep
  [ compile object1 ]
  if gcc ... and so on
  [ compile object2 ]
  if gcc ... and so on
  error...

  I hope somebody can understand what I want to do.

  greetings,

  Stefan

  --
  Stefan Bienert
  Zentrum fuer Bioinformatik
  Universitaet Hamburg
  Bundesstrasse 43
  20146 Hamburg
  Germany

  Email: [EMAIL PROTECTED]
  Phone:  +49 (40) 42838 7345
  Fax:+49 (40) 42838 7332







Re: Report to stdout like Linux kernel compilation does

2008-04-11 Thread John Calcote
On Fri, Apr 11, 2008 at 10:36 AM, Robert J. Hansen
[EMAIL PROTECTED] wrote:
 Bob Proulx wrote:
  Always build with full warnings enabled.  Always clean up warnings as
  they are introduced.  Always keep a warning free build.
 

  Given the wide leeway the C and C++ standards give implementors in respect
 to what is warning-worthy, I think this is crazy.  There are far more
 compilers in the world than just GCC, and trying to make a warning-free
 build on /every/ platform is just far too much torment-of-the-damned for far
 too little benefit.

Hmmm. I'd have to disagree here. I carefully consider every warning I
see, and evaluate whether or not it represents a real problem. While
there have been versions of various compilers in the past that like to
warn about stupid things, compiler writers understand the language
better than most people. If they warn me about something, then I at
least give it the credit it's due. Often, these warnings indicate a
real or future problem in my code.

Interestingly, as my proficiency with the language (C or C++) has
grown, the number of warnings I deal with has decreased. I still get
errors from stupid typos, but I don't get that many warnings
anymore.

John




Re: Report to stdout like Linux kernel compilation does

2008-04-11 Thread John Calcote
On Fri, Apr 11, 2008 at 3:00 PM, Bob Proulx [EMAIL PROTECTED] wrote:
   When you want to clean up the warnings (usually something done near
   the end of a development cycle), you simply build with stdout
   redirected to /dev/null when you run make a few times, and you'll
   see the warnings appear, because they're output to STDERR, not
   STDOUT.

  Since you disagree with my statement challenging this then does that
  mean that you agree with the strategy I was challenging?  That is,
  don't check the project build warning status until you are near the
  end of a development cycle and _then_ start to address warnings?  I
  am sorry but that type of strategy triggers an immune response in me.
  I strongly believe this is not a good strategy and am not shy about
  stating my opinion on it.  Don't let broken windows accumulate.  Fix
  broken windows as you go along so that the project is always in as
  good of a state as possible all of the time.

  Every time I compile on a new platform I look at the warnings
  generated.  If my native compile environment isn't smart enough to to
  generate a valid warning but another platform is smart and generates a
  previously unknown valid warning then I am not going to ignore it.
  Every new environment that my code reaches almost always teaches me
  something useful about my code.


I'm sorry - I was a little confusing here. I don't actually work this
way either. I'm a stickler for details, and I can't really abide
warnings being left around for weeks or months - it's the
Type-A'ed-ness in me or something. Anyway, what I meant was that my
development cycles are fairly short - I might not get around to
checking for warnings (by redirecting stdout to /dev/null) for a few
days, so I might let a few warnings go for a week (or less), if I'm
busy working on a task.

But for this very reason, I think it's a good idea to have clean
output. I've heard all the arguments before about not being able to
see the cause of the warning because your command line is obscured by
a pretty-printing build system. I like to see full output AFTER I've
discovered there's a problem, in which case I'd go to the build
log file to see the FULL output. I just don't see why we have to watch
complete build information go by with a build that has no warnings or
errors - which is supposed to be the usual case - it's the exceptional
case where you need to see full output.

As far as setting up your editor to take you to the next warning or
error on command - now that's a cool idea - I love how this works in
the M$ IDE on my Windows projects.

John



