Make make check abort on memory leaks or errors

2014-07-10 Thread Steffen Dettmer
Hi,

I have a small C++ test project using autoconf, automake and CppUnit.
I'd like to check for memory errors, starting with memory leaks.
I started with ./configure CXXFLAGS=-fsanitize=address, which
detects some memory errors (e.g. use-after-delete), but no memory
leaks.

After searching the web for a while, I thought I would fall back to valgrind
using make check TESTS_ENVIRONMENT=valgrind. Although valgrind
./mytestrunner works (the test program aborts because of a memory leak),
make check TESTS_ENVIRONMENT=valgrind does not work: it erroneously
returns status 0 and no memory errors are reported. Maybe valgrind
checked the automatically generated automake test-driver script for
memory leaks?

Since I assume checking for memory usage errors is a common requirement, I
think there should be a common solution, but I failed to find it. Could
anyone please advise how to check for memory leaks?

A pragmatic solution specific/limited to g++ (4.8) on Linux (e.g. Lubuntu
14.04) without libtool would be sufficient.
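
For illustration, what I have in mind is roughly the following (a sketch
only; I assume the parallel test harness, and the valgrind flags are just
examples, not a verified recipe):

--[Makefile.am]8===
TESTS = mytestrunner
check_PROGRAMS = mytestrunner

# Run every TESTS entry through valgrind.  --error-exitcode turns detected
# errors into a non-zero exit status so the harness reports FAIL; depending
# on the valgrind version, --errors-for-leak-kinds may also be needed so
# that leaks count as errors.
LOG_COMPILER = valgrind
AM_LOG_FLAGS = --leak-check=full --error-exitcode=1
===8---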

Regards,
Steffen



Re: GSoC project idea: non-recursive automake project

2011-03-21 Thread Steffen Dettmer
On Mon, Mar 21, 2011 at 7:36 PM, Roger Leigh rle...@codelibre.net wrote:
 Can't automake rewrite the relative paths to be absolute?

This would break things, for example when using WINE via wrapper
scripts, and it would require fixed srcdir paths...

oki,

Steffen



Re: PKG_CHECK_MODULES on system without pkg-config installed?

2011-03-11 Thread Steffen Dettmer
On Thu, Mar 10, 2011 at 1:03 PM, Roger Leigh rle...@codelibre.net wrote:
 [...]
 This is not meant to sound like a troll, but: is anyone really
 *really* using static linking in 2011?

Yes, in my company we link almost all of our own libraries
statically into our own applications (we do, however, use our own pkg-config).

(On Linux, we link against libownlib.a but do not use -static.)

We do so for several reasons. For some embedded devices dynamic
linking is no real option for applications, or it can have other
disadvantages. For Linux packages, static linking helps to ensure
that a tested and qualified binary behaves correctly, even if the
target system has different lib versions. So usually we link our
libs statically and the system libs dynamically and, if in doubt,
require specific versions (linking everything statically can lead
to e.g. LGPL licensing difficulties, AFAIK).
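
In Makefile.am terms, `link our libs statically and the system libs
dynamically' typically looks like the following sketch (names invented):

--[Makefile.am]8===
bin_PROGRAMS = myapp
myapp_SOURCES = main.c
# our own library as a plain .a file, system libs via -l as usual
myapp_LDADD = $(top_builddir)/ownlib/libownlib.a -lpthread
===8---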
With a small library fan-in, let's say just needing glibc or
so, you can even install the resulting binary (or RPM/deb/xxx
package) on a different Linux version / distribution, because
usually the used parts of the library interface are really stable.
That's really great. The highly skilled, professional expert
developers who build GNU/Linux (including libs, the cc toolchain,
autotools, distro standards and so much more) made this possible.
It is fun working with it.

oki,

Steffen



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-17 Thread Steffen Dettmer
* On Sat, Jan 15, 2011 at 12:08 AM, Paul Smith psm...@gnu.org wrote:
 * On Fri, 2011-01-14 at 19:57 +0100, Ralf Wildenhues wrote:
 http://gittup.org/tup/build_system_rules_and_algorithms.pdf.

 No idea whether they are standardized somehow or somewhere.

 http://www.mail-archive.com/help-make@gnu.org/msg08500.html
 I think some of that discussion was quite illuminating.

Yes, it is interesting reading.

It has quite outstanding features; for example, it could tell
where parallelization dependencies are missing or wrong, and
probably even /what/ is wrong. As far as I know there are build
systems (ElectricAccelerator) that use such information to fix
such issues (rebuild the file when inputs changed to get the
correct result and emit a warning).

Given a source tree constructed of several (sub-)
packages, what does a Beta-Build system look like? Could there be
tupfiles including sub-tup-files?

What influence does the choice of a Beta-Build have on the
maintainability of such a sub-package? Can they still be
orthogonal to each other?

How portable can a Beta-Build system be? Doesn't it require
specific extensions to watch the file system via some event bus or
such?

Is it simple, stupid?

What happens when changing files during a build? In the example
with X, X' and Y, will Y ever show up? Wouldn't a Beta Build
include those new changes instantly? (I'm not saying that this
would be a disadvantage.)

How does it work? By overloading libc functions? Or does it even
need a kernel module? Does access via network file systems such as
NFS work?

I think a good Beta-Build-System could have some portability
mode used when no file system watching works; it could fall back
to an Alpha-Builder, is this right? Would this require distributing
some generated dependency database files or so, to make the
fallback aware of needed things which are not in a tupfile?

I wonder whether something like a Beta-Build-System is the
future of building, or whether there are applications/projects or at
least niches where make will persist.

Ohh, this was quite off-topic, but an interesting topic.

oki,

Steffen



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-17 Thread Steffen Dettmer
On Mon, Jan 17, 2011 at 6:31 PM, Paul Smith psm...@gnu.org wrote:
 On Mon, 2011-01-17 at 17:28 +0100, Steffen Dettmer wrote:
 Given a source tree constructed of several (sub-)
 packages, what does a Beta-Build system look like? Could there be
 tupfiles including sub-tup-files?

 What influence does the choice of a Beta-Build have on the
 maintainability of such a sub-package? Can they still be
 orthogonal to each other?

 I don't offhand see any reason why a Beta build system couldn't
 use the same makefile syntax as make, even GNU make.  It seems
 plausible that the parser which creates the DAG, etc. could be
 kept basically as-is.  However, I think the walk the DAG,
 build decision parts of make would have to be tossed and
 completely re-done.

I was just wondering what happens in a parallelized build, let's
say when component B uses several headers from component A which
are automatically generated, but component B should not need to
know anything about component A's internals. With make, someone
could have some a/build.mak and b/build.mak and have the Makefile
include both (would this be the correct approach?).
It would be a problem if component B needed to know
that component A's header depends on some project configuration
option or so.

If I understood correctly, there were attempts to turn make into a
beta-builder by adding such functionality; this probably used
the same Makefiles, which would be a nice advantage, because it
could make betagmake a drop-in replacement. It leaves open how
well suited Makefile syntax is for a Beta-Builder.

I think a good Beta-Builder would not need to have any
dependencies hand-written in some Betamakefile, but could derive
all this automatically. In the first runs this might require
rebuilding targets (in case an input was found to have changed: the
dependency could be added automatically and the target could be
rebuilt to fix this build error automatically). The Betabuild
would learn all dependencies by watching what the subprocesses
(triggered by the build: compiler, linker, source code
generators etc.) open as input (read). If `gcc main.c' opens
parser.h, it would know that main.c depends on parser.h - no
need for any makedepend anymore; deps would fully support all
conditionals and source code generation, and no one could forget a
dependency.

Is this correct, or is this impossible to achieve?

 How portable can a Beta-Build system be? Isn't it requiring
 specific extentions to watch the file system by some event bus or
 such?

 That would be one way, and is certainly the most efficient.  Another
 would be to simply stat all the files up-front; that could be done on
 any system.  Obviously this is much more time-consuming.  On the other
 hand you can avoid a lot of rule matching/searching/etc.

I didn't understand what `stat all the files up-front' means or how to
achieve it. Do you mean stat'ing all files?
If, instead of watching, stat() is called on each file to find out
whether it was changed, wouldn't this be O(N) and thus an Alpha-Builder?

oki,

Steffen



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-14 Thread Steffen Dettmer
On Thu, Jan 13, 2011 at 8:39 PM, stefano.lattar...@gmail.com wrote:
  ``I truly dislike the idea of not keeping configuration and build
   steps separated.''

 Maybe I'd just like a system that *allows* me to keep configuration
 and build steps clerarly distinct if I want to.  Yes, that would
 be enough for me I guess.

Maybe it is just a matter of terminology, but I think the concept of
configuration (as software configuration) is only about
deciding which configure options to set or not to set; the
configure run itself IMHO does not need to be separated from the
rest of the build.

 Wouldn't it be great to type make which automatically knows by
 depedencies that some configuration rules have to be executed
 (i.e. to determine facts about the environment if they are not
 available in form of small .h files or alike)?

 Yes, but then, this could be implemented by having the build system
 call the configuration system properly, no?  More or less like is
 done by automake-generated rebuild rules, just on steroid I guess.

Yes, except that the automake/autoconf rules implement a phase or
stage idea: when a configure run is needed, it is
performed entirely. So it can be seen as separated, with the
build system able to trigger the configure run.

 If, for example, Makefiles would have rules to check for the
 libraries as soon as needed etc, wouldn't this be good?
 Tests that are not needed for the configuration to be built
 would not even be executed (saving time).

 What do you mean exactly by this?

I'm not sure if this makes any sense, but I could imagine that if
some file conditionally (enabled by some
configure-as-in-software-configuration option) uses some feature
which in turn depends on a platform function that has to be
checked, then at that moment this single test could be performed.
Let's say I used some --enable-tcp switch. The build system finds
that in this case it needs tcp.o. Through some dependency, tcp.o depends
on some have_socket_h.check_result file. The creation rule for
this file invokes a test for socket.h and stores the result,
which is used by the tcp.o build in some way.

If no networking were used, this check would not even be
executed. The test result is just an input, like a BUILT_SOURCE or
so.
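
Purely as an illustration of the idea (the names and the check itself are
made up), such a chain might look like this in plain make:

---8===
# illustrative sketch only
tcp.$(OBJEXT): have_socket_h.check_result

have_socket_h.check_result:
	if echo '#include <sys/socket.h>' | $(CC) -E -x c - >/dev/null 2>&1; \
	then echo yes > $@; else echo no > $@; fi
===8---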

Maybe this would work in some way.

oki,

Steffen



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Steffen Dettmer
On Thu, Jan 13, 2011 at 3:39 AM, Bob Friesenhahn
bfrie...@simple.dallas.tx.us wrote:
 While GNU make is a really good 'make' program, I think that 'make' in
 general is a flawed concept.

Could you please explain this a bit?

I like the `make' concept; in some situations I even love it.
One example of such a situation is creating statistics from
production log files.
This is an incremental, time-consuming process. I wrote the stats
script in Makefile language (starting with a shebang line).
Intermediate results are stored as files (an automatic cache).
The process is very time-consuming, but thanks to make, it can be
aborted and restarted at any time (just the current file action
is lost). When restarted later, only the new files are processed
etc. The Makefile is very simple; it just has the rules for each
step - no need to worry about how to get things in the right order,
no need to worry about parallelization or where to continue after
an interruption. It just works.
(this has no relation to automake of course).

 If there was going to be a revolutionary change, then I would
 like to see a small embedded build engine be included which
 accomplishes the few functionalities needed from make, but also
 avoids needing additional programs.

I think small dedicated additional programs are often
very useful, because by combining simple building blocks, complex
solutions can be constructed quickly, even if no one ever had a
related requirement before.

In contrast, ant, a build tool popular in the Java
world, does this differently. This is good for
platform-independent Java building, but it has IMHO serious
disadvantages, and IMHO the big problem that it can only do what
was built in in advance. You can create .jar files but you
cannot use source code generation, or even use some other packager
to bzip2 some output result in the end. Even compiling in a
version number can become tricky.
(Trying to build C code with ant seems to be a bad idea anyway.)
Surely I'm just not capable of using ant correctly.

Maybe a future version of automake could create an efficient
GNUmakefile and a less efficient but more portable Makefile at
the same time, leaving it up to the user to select which one to use.

 A little bit of analysis will reveal that Automake really does
 not require much functionality from 'make'.  Probably 5% of
 what GNU 'make' is capable of.  I don't like it that 'make'
 depends on file timestamps, and that it is unable to cache what
 was used to create the current binary, and that dependency
 information from the build system needs to be cached in 'make'
 include files.

What is the problem with timestamps?
I don't like that ant does not depend on timestamps (leading to
rebuilt jars and re-signing, which takes time and needs entropy etc.).

Where else should dependency information be stored? Of course, if
you had a special make replacement dedicated (limited?) to
C building, it could be smart, but I'm afraid the smart
things fail more often than the simple things.
I think the dependency issue arises from a kind of laziness
(generate the deps automatically instead of writing them
explicitly). To solve that for C code, there are tools for it
(gcc -M).
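
For example, a hand-rolled dependency-generation fragment along these lines
(a sketch only; SOURCES and the flags are placeholders) is common with GNU
make:

---8===
%.d: %.c
	$(CC) -MM -MF $@ -MT '$*.o $@' $(CPPFLAGS) $<

-include $(SOURCES:.c=.d)
===8---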

Do you mean that make is a good multi-purpose tool but not well
suited for C building, especially in automake environments?

oki,

Steffen



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Steffen Dettmer
On Wed, Jan 12, 2011 at 10:36 PM, stefano.lattar...@gmail.com wrote:
  - I think that keeping configuration and build steps separated is
   a very good idea.

Do you mean this is a good idea in the context of todays systems
- or -
Do you mean this is good idea in general and could be a design
criteria for future build environments?

I think I agree with the first (mostly because I assume that if the
autotools developers and experts separate those steps, they do it
for a good reason), but I don't understand why this should be a
requirement in future systems.

Wouldn't it be great to type make and have it automatically know from
dependencies that some configuration rules have to be executed
(i.e. to determine facts about the environment if they are not
available in the form of small .h files or alike)?

If, for example, Makefiles had rules to check for the
libraries as soon as needed etc., wouldn't this be good? Tests
that are not needed for the configuration to be built would not
even be executed (saving time).

What important points did I miss in my consideration?

oki,

Steffen



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-12 Thread Steffen Dettmer
2011/1/12 Stefano Lattarini stefano.lattar...@gmail.com:
 I'm starting to think that
 automake should *really* start supporting *only* GNU make (at least
 from version 3.75 or so).

I think bash, gcc and most GNU tools are also widely available.
They could be built using an old, fixed automake.
But where should this end?
What if only yesterday's MyLinux version is supported?
What is wrong with using a ten-year-old u*nx machine as long as it works?
(OK, mine has GNU Make 3.79.1, so I would be fine. SCNR)

oki,

Steffen



Re: build the same source twice with different macros

2010-11-15 Thread Steffen Dettmer
On Mon, Nov 15, 2010 at 5:11 PM, Nicolas Bock nicolasb...@gmail.com wrote:
 I have some functions written in C that take a floating point argument, e.g.

 void foos (float x);
 void food (double x);

 The function bodies are basically identical except of course for the
 different floating point types. In order to avoid having to write
 redundant code, I see 2 options:

 (1) I can use C++ [...] template.

If I could use C++, I would do it :)

 (2) I can define a macro for the preprocessor that is either defined
 as float or double and then compile the function source twice, the
 first time with $CC -DFLOAT=float and the second time with $CC
 -DFLOAT=double.

I think this looks complicated.
If the development rules allow it, I think someone could try, for
small/short implementations:

--[foo.c]--8===
#define gen_FOO(FUNCNAME, TYPENAME) \
void FUNCNAME (TYPENAME x) \
{ \
   code; \
   code; \
}

gen_FOO(foos, float);
gen_FOO(food, double);
===8---

or, when gen_FOO would be too big (e.g. with compilers that only
accept 10-line macros or so), maybe:

--[foo.inc]--8===
void FOONAME(FOOTYPENAME x)
{
  ...
}
===8---

--[foo.c]--8===
#define FOONAME foos
#define FOOTYPENAME float
#include "foo.inc"
#undef FOONAME
#undef FOOTYPENAME

#define FOONAME food
#define FOOTYPENAME double
#include "foo.inc"
===8---

or some source code generation:

--[Makefile.am]8===
# just to illustrate the idea, rule surely is wrong:
foos.c: Makefile foox.in
	perl -np -e 's/FOONAME/foos/ ...' < foox.in > $@
food.c: Makefile foox.in
	perl -np -e 's/FOONAME/food/ ...' < foox.in > $@
===8---

(here we typically use source code generation; often with a
dedicated perl generator script creating all related functions
inside a single .c file).
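
For the automake side of option (2), one standard way is per-target flags;
a sketch (file and library names invented; automake then compiles foo.c
twice into differently named objects):

--[Makefile.am]8===
noinst_LIBRARIES = libfoo_s.a libfoo_d.a

libfoo_s_a_SOURCES  = foo.c
libfoo_s_a_CPPFLAGS = -DFLOAT=float

libfoo_d_a_SOURCES  = foo.c
libfoo_d_a_CPPFLAGS = -DFLOAT=double

# link both convenience libraries into the final program, e.g.:
# myprog_LDADD = libfoo_s.a libfoo_d.a
===8---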

oki,

Steffen



Re: Force a file to be compiled always

2010-11-10 Thread Steffen Dettmer
On Nov 4, 2010, Benjamin Bihler benjamin.bih...@twt-gmbh.de wrote:
 As to the third suggestion: I use the __DATE__ and __TIME__
 macros in my code as a kind of version information. Therefore
 the compilation result differs with every compilation, although
 my source file does not change. Is there yet a better method to
 store the compilation time stamp in a library without fiddling
 with make targets?

We do fiddle with make targets, but in this way:

--[Makefile.am]8===
some_SOURCES=app.c
# ensure to compile-in the current date
app.$(OBJEXT): $(LIBDEPS) $(l...@mainmodulename@_a_SOURCES) Makefile
===8---

app.c includes something like:

--[app.c]--8===
    const char *const version = SYS_VERSION
#if defined(DEBUG)
        " (DEBUG), compiled " __DATE__ " " __TIME__
#endif
    /* non-debug (but release-) versions are guaranteed to have a
     * unique dedicated SYS_VERSION */
    ;
===8---

I'm not sure if this is best (or correct), but it seems to work well.

I think the big advantage over .PHONY is that it does not
re-generate the binary (a new binary, actually!) if nothing was
changed at all, which IMHO would be a bad habit.

oki,

Steffen



Re: Automake and Texinfo: clean the info or pdf file

2010-09-01 Thread Steffen Dettmer
On Tue, Aug 31, 2010 at 7:57 PM, Eric Blake ebl...@redhat.com wrote:
 [...make distcheck' does a good job of...]
 separating maintainer issues (the 'make dist' portion) from the end user
 issues (the VPATH 'make check' build from a read-only srcdir portion).

thanks for the detailed explanations and clarification!

Steffen



Re: Automake and Texinfo: clean the info or pdf file

2010-08-31 Thread Steffen Dettmer
On Mon, Aug 30, 2010 at 7:54 PM, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 * YuGiOhJCJ Mailing-List wrote on Mon, Aug 30, 2010 at 05:41:40PM CEST:
 I work on a project which use automake and include a
 documentation in Texinfo format.

 If I call :
 $ make
 The .info file is built.

 In the source tree (right?).
 And it will be distributed (with 'make dist').

The make default target /creates/ files in the source directory?
But isn't this wrong, and isn't it exactly what is verified by making
srcdir read-only in `make distcheck'? I thought it would only /update/,
and only in `maintainer cases' (such as a changed configure.ac).

The documentation states `In fact VPATH builds are also a means of
building packages from a read-only medium such as a CD-ROM'
(but surely that means from a src tarball, where the .info would
be included and up-to-date anyway).

Wouldn't this break things (e.g. a VPATH build leading to a .info in
both srcdir and builddir)?

Steffen



Re: call for help/crazy idea: nmake support

2010-08-18 Thread Steffen Dettmer
Hi!

This is an interesting discussion. I think a key question is
whether the style of working with Integrated Development
Environments (IDEs) is compatible with `orthogonal component
based environments'. I tend to think that both are, more or less,
each others opposite.

In the first case, I have a one-size-fits-all tool (like Eclipse).
Everything included is specific (Eclipse editor, Eclipse etags,
Eclipse debugger etc.), which is needed to integrate the parts.
Advantages include that the debugger knows the tags and you can set
breakpoints in the editor window etc., but it might be hard to
replace a part of it (let's assume someone wants to use a
different editor - impossible without losing the great features).

In the second case, several small tools (the simpler, the better) are
used together and each can be exchanged separately. Use any
editor, any compiler and any debugger. You can use source code
generated e.g. by self-written parser generators and implement
parts in your own languages, if desired.

If I understood correctly, nmake support and MSVC *.DSP project
file support aim to build a bridge between both cases.
I have difficulties imagining what this could look like, especially
when considering the non-trivial cases.

On Sat, Jul 31, 2010 at 7:26 PM, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 Here's a crazy idea: how about if automake optionally output an input
 file suitable for nmake (after configure substitution)?

What would be the intended main usage?
Building C/C++ with CL.EXE only?
What would be the motivation?

I assume it is for systems that cannot run configure, have no
make and have no special requirements (only C/C++, no
BUILT_SOURCES etc).

 Is that even feasible?  (I'd guess so)
 Maybe if we have contents conditional on 'make' or 'nmake' output?

Alternatively, some special make rules could create an nmake
makefile and include it in the source dist.
Depending on the simplifications / limitations acceptable for the
nmake makefile, maybe it could be constructed from a static
template (to be adjusted on major changes) that just honors
the _SOURCES variables dynamically.

We use a set of GNU Make rules to get the _SOURCES into a file
used to form MSVC6 DSP files; I think creating an nmake file
shouldn't be more difficult.
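
As a rough illustration of that `static template plus dynamic _SOURCES'
idea (everything here is hypothetical, not our actual rules):

---8===
myprog_SOURCES = main.c util.c

# write the source list to a file that an external nmake/DSP template
# generator could consume
sources.lst: Makefile
	@rm -f $@
	@for f in $(myprog_SOURCES); do echo $$f >> $@; done
===8---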

 Would that even help anybody?  (no idea)

MS Visual Studio users probably would gain no benefit; whether they
use make or nmake does not matter much, AFAIK. Having .DSP files
can add benefits. The debugger also works without .DSP files, but
the IDE won't (of course it needs to know the sources to index
symbols etc.).

 Is there anybody willing to work on this?
 Should I prepare a Summer of Code/other code sprint project proposal[1]?
 Or is a better alternative to follow the path we have taken with Libtool
 (finally getting MSVC support in) also in Automake, with Peter's patches
 and more?

I know developers who love to use the MS Visual IDE and are
quick with it. They are glad even for somewhat limited DSP file
support (in our case, BUILT_SOURCES won't work and requires some
special manual action, but compiling etc. works).

Personally, I dislike this (which is easy for me, because I do
not use Visual Studio - otherwise I would probably see things
differently :)). If BUILT_SOURCES does not work comfortably, people
then avoid using it. In the end #defines aren't in config.h but
in mymodcfg.h etc.

I don't know Peter's patches, but I'm afraid that the working
styles of autotools and IDEs differ, and thus DSP file generation
could even be counter-productive (e.g. when resulting in the
habit of not using BUILT_SOURCES because some platforms do not
support it; but of course it is no objective of autotools to
enforce good development style or so).

For people only using a (source dist) package, it shouldn't
matter a lot whether to use DSP files or Makefiles, should it?

oki,

Steffen



Re: appending to DEPENDENCIES

2010-05-18 Thread Steffen Dettmer
On Mon, May 17, 2010 at 8:02 PM, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 adding of another variable to the default list of
 foo_DEPENDENCIES. I guess. Suggestions for naming such a variable?

foo_EXTRA_DEPS?

oki,

Steffen



Re: Built-in target to delete all generated files

2010-04-30 Thread Steffen Dettmer
Hi,

Why would someone want to check in derived files like configure
and Makefile? Because someone might not have autotools? Why not
also check in objects and libs, in case someone might not have
a compiler installed?

On Fri, Apr 30, 2010 at 5:57 AM, Trevor Harmon tre...@vocaro.com wrote:
 For this discussion I think it's important to recognize that
 there are two distinct classes:

 1) People who download a source tarball
 2) People who check out the source from a repository

Just for the sake of completeness / for the bigger scope I'd like to add:

3) People who use a binary distribution

I think this is the biggest group of users.

Often they get the bin dist from a package maintainer, who in turn
works on a src dist, which might be configured in a specific way,
and she may have patched and/or extended the src dist.

 #1 are the users or end users who just want to download,
 build, and install a stable release. They typically don't care
 about getting code from the repository.

Yes, I agree; the source dist is for those people, and thanks to
autoconf it is done in a way that it can be built easily. Of
course, you must have make, a compiler... installed.
The documentation should state exactly what is needed.

 #2 are the developers who are testing prereleases, hacking on
 the code, contributing patches, etc. This usually requires
 checking out the source from the repo. For these people I don't
 see any problem with requiring them to install autoconf = 2.59
 (or whatever) and to run a single command (autoreconf -i) to
 bootstrap the build. They most likely have it installed already
 anyway!

Yes, I completely agree.
They have to have make, a compiler, autotools... installed.

How else should they be able to add a new source file?

Also they have to be able to use their own versions of the tools.

 I realize there may be some overlap between the two groups --
 for example, power users who just want to try out
 cutting-edge code

I guess most power users do have autotools installed :-)

Personally, I dislike having derived files in version control.
It is redundant, not DRY. It can also invite trouble; for example,
it could happen that the checked-in configure does not match
configure.ac or so.

Are there SCMs that don't support file ignore lists? SVN has its
properties stuff, which might be a bit clumsy for
non-ant-java-mainline projects to add all the files, but most of the
work is needed just once by one developer of the team, so it
shouldn't be a problem, I think :-)

To get rid of all the autogen files and to check whether the current
state is reproducible from the VCS, a simple way is to simply check it
out and build it. If there are no derived files in the repository,
all files have to be generated, and if anything is missing or
wrong, at least make check has a chance to spot it (otherwise,
someone may have extended, let's say, configure.ac, checked in
configure but not configure.ac, and no one would notice; then
someone else changes configure.ac and you wonder why configure
suddenly fails or so).

oki,

Steffen




Re: Minimal example for perl program

2010-04-20 Thread Steffen Dettmer
On Tue, Apr 20, 2010 at 9:23 AM, Russ Allbery r...@stanford.edu wrote:
 It breaks the basic assumption that Makefile.am is basically a
 makefile.  I suppose that Automake could try to transform the
 whitespace as part of its processing, but I'm not sure that's a
 good idea.

I even think it would be a bad idea, because unless implemented
very smartly, it could break some constructions
like `inline scripts'.

There is this special handling of escaped line endings (i.e. a \
at the end of a line): in make it escapes the linefeed
but does not remove the \ (unlike all other escapings I know).
So when you have:

---8===
test:
	perl -e 'print "multi line\n\
continued here\n"'
===8---

I think someone could expect:


---8===
stef...@host-with-make-3.79.1:~/work # make
perl -e 'print "multi line\n\
   continued here\n"'
multi line
   continued here
===8---

but what happens is

---8===
stef...@host-with-make-3.81:~/work # make
perl -e 'print "multi line\n\
   continued here\n"'
multi line

   continued here
===8---

so someone may write:

---8===
test:
	$(SCRIPT)

SCRIPT=perl -e 'print "multi line\n\
   continued here\n"'
===8---

so an `automatic re-tabify' would need to be quite intelligent I think...


oki,

Steffen




Re: Regarding the JAVA primary

2010-04-19 Thread Steffen Dettmer
On Mon, Apr 19, 2010 at 8:25 PM, John Calcote john.calc...@gmail.com wrote:
 [...]
 Builds in the Java world generally specify source files found
 within a subtree using a globbing mechanism, with optionally
 specified inclusions and exclusions.

Yes, they do. BTW, does anyone know why?

With some sarcasm someone could say that it is done this way
because with Java you need to create heaps of files (e.g. one for
every public exception), but maybe it has a good reason?

We use some very old and surely bad custom automake rules to
compile Java sources. Ages ago we also had some wildcard (`find')
rules (inspired by ant, assuming `this would be the good way to
go'). Those rules collected the files, but we changed them to use
a list of file names, which seemed much cleaner and solved some
issues (which of course could have been solved in other ways,
too).

One motivation for this change was that our make rules
generated the exception source code files (many differing more or
less just in the name and some `String name ...'), and people
forgot to add the files; for them the files existed and were
built, but not for others, where the files were not updated (and
the new gensrc files were missing). This resulted in an
incomplete jar file, and unfortunately when using this jar a
compilation error was flagged in the wrong package... unit
tests did not help because of course they were also missing from the
incomplete jars (and the make check scheme we used also used a
wildcard approach to collect the unit tests to be executed).
Strange things happened when switching between branches (where
typically the number and kind of exceptions changed etc.).

I disliked that when, due to some problem, almost all sources got
lost in a sandbox (e.g. after switching to a bad / incomplete tag),
make (and even make check!) still succeeded.

On `tree conflicts' (one person changed a file, another moved it) it
could even happen that the same functionality ended up twice in a
jar...

To build a list of files, why not open Makefile.am in $EDITOR,
like vim or emacs, and insert the file list there? (In vim, you may
start a line `java_source = \' and on the next blank line use
`!!find . -name *.java -exec echo {} \\ ;',
which works except for the last file, which ends with a `\'.) No
need to do this at make run time, or is there?
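
For illustration (file names invented), the resulting fragment would simply
look like:

--[Makefile.am]8===
java_source = \
	com/example/FooException.java \
	com/example/BarException.java \
	com/example/Util.java
===8---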

This way, cvs diff (or whichever SCM tool is used) easily shows
the included files, which is easier to review, I think.

Using wildcards IMHO means logically `including' the directory
contents into the Makefile. I think you cannot even use it as a
dependency (and thus I'd guess the jars would be re-signed on each
make run, even if no file changed?).

What is the advantage of using find magic at make time?

How do you handle your java.in files?
How do you make sure old generated files are kept out of the jar (if
removed from gensrc, they are still in builddir, and the make
clean rule may not even cover them any longer - how would you notice?)?

I'm afraid the find/wildcard approach only works for simple
builds - but those could also be done with ant, I think...

 I'm not stealing this concept entirely from ant.

(yeah strange tool that, but different topic :-))

oki,

Steffen




Re: Include directive for all generated Makefile.in

2010-04-15 Thread Steffen Dettmer
On Thu, Apr 15, 2010 at 8:02 AM, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 * Steffen Dettmer wrote on Wed, Apr 14, 2010 at 11:53:56AM CEST:
  [on this idention level]

 Well, can you give a specific example?  I can probably see that
 this might be useful, but having a convincing example always
 helps.  Really, designing new interfaces should be done right.

Yes, of course you are right and ideas may help; I just hope you
don't waste too much time in case I write about ideas that are
simply too far off and wrong :)

 We could easily have a fairly fat interface
  AM_MAKEFILE_INCLUDE([fragment], [placement], [pattern], [id])

 where fragment names the file to include, placement is 'top' or
 'bottom' or so, pattern matches the Makefile.am files to
 affect, and id a string or number which could mean that a later
 fragment with the same id will replace the earlier one.

(hum... I'm not sure what this would allow to do... anyway.)

 I am just not seeing how it can be useful.  All the decisions
 of which fragments to include where _still_ have to be done at
 the time automake is run, the only variation postponed to
 configure run time you get is conditionals and @substitution@
 expansions (as usual) within the files.

(I try to give some examples, but probably they are bad ones)

The etags program on many of our build hosts has a bug (fixed in
the meantime) leading to an invalid tags file when it hits some
special input (some struct syntax). The Makefile uses some awk
construction to generate the list of files. As a workaround someone
proposed adding an `if' to the awk script to simply skip the
related file. But AFAIK there is no way to run some `find and
replace' on rules from Makefile.in.

The compiler and linker of one toolchain may leave zero-length files
in case of error / interruption (leading to errors on the next make).
As a workaround someone could imagine `customizing' the
$(mylib_a_AR) rule by appending something like `|| rm -f $@'.

I don't know if it is appropriate here (probably not), but someone may
wish to have a specific error code for a particular place of failure,
let's say `$(mkinstalldirs) ... || exit 45' to get exactly this error
code in case this mkdir fails.

Once there was a bug that if OBJEXT = .obj, sources from subdirs
wouldn't compile (i.e. `xyz_SOURCES = a.c subdir/b.c' led to a.obj
but no b.obj).

There are compilers not supporting `-o object.o', leading to
trouble when using xyz_CFLAGS - but this is better solved
by wrapper scripts that add support for -o.

The dist rule (for at least older automake) has `$(AMTAR) chof'
which fails for long file names. We wanted to remove the `o'. We
had to write a wrapper script `lazytar' which internally drops
the 'o' option.

For some reason on cygwin `... | gzip -c ...' wasn't working:
regardless of all binary mount options and environment variables,
CRLF conversion took place when piping, resulting in broken
gzipped files. We wanted to try to replace
$(AMTAR) chof - $(distdir) | GZIP=$(GZIP_ENV) gzip -c > $(distdir).tar.gz
by
$(AMTAR) chof - $(distdir) > $(distdir).tar && \
   GZIP=$(GZIP_ENV) gzip -c $(distdir).tar


 Probably all these are examples of DON'T-DOs because they rely on
 internals that change without prior notice.

 We are purely speaking about new, to-be-public, user interfaces
 here.  Things that users should be able to use with Automake
 1.12.  None of the above applies to current Automake.

Yes, right, but I meant that the `customizations' would be DON'Ts,
for example because they access variables / identifiers that are
not considered part of the documented API (such as
$(distdir)) or rely on the structure of some mkinstalldirs
rules.

(Also, for autoconf it turned out that many of the
`customizations' we did probably would better have been done
differently; probably the same holds for the examples above.)

  But I think this is out of scope here. Much too complicated
  and it would be more customizing automake than using a
  feature of it.

 Well, it is true that Automake is less easily extensible than it could
 be. The above could be a step in the right direction.

Yes, sure, but there are risks as well, you know, for example
`for each useless function there will be someone who calls it'
(making future API cleanups harder), `if the API has a clean
way a thing is supposed to be done but also has flexible hooks
achieving the same, half the people will only use the hooks', and
the experience that over-designed APIs can harm when you find that
the case needed later isn't supported although so many unneeded
cases are, and that it is harder to fix the over-designed API
than to extend a minimalistic one. Not having good examples
showing the need for an API extension could be an indicator for that :)

oki,

Steffen




Re: Include directive for all generated Makefile.in

2010-04-14 Thread Steffen Dettmer
On Wed, Apr 14, 2010 at 7:53 AM, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
  would it be a potential possibility instead to `overwrite and
  specialize' some macro?

 With some macro, you mean some prepended or appended makefile.am
 snippet here, right?

 Well, my idea of the above would be that if you used
  AM_MAKEFILE_APPEND([bot1])
 [...]

Yes, of course. The idea is good!

 Do you have a good use case for this overwriting (that would
 justify this complication)?

No, I don't have any.

It's just that some of our (completely unrelated) source
generation tools (written in Perl) internally use some helper
functions to write content. First we had some prepend/append hooks,
but later found it stronger to overwrite the content-writing functions.
For example, when having something like

    @lines = (header(), content(), footer());
    writeFile(@lines);

sometimes content() could be overridden with something like

    # comment out a block
    sub overloadedContent {
        my @content = content(); # or $self->SUPER::content()
        map { $_ = "# $_" } @content; # or whatever
        return @content;
    }

or replace some identifiers inside, remove something, or the like.
Probably all these are examples of DON'T-DOs because they rely on
internals that change without prior notice.

But I think this is out of scope here. Much too complicated, and
it would be more about customizing automake than using a feature of it.

oki,

Steffen




Re: revision control info in generated files

2010-04-13 Thread Steffen Dettmer
On Mon, Apr 12, 2010 at 4:16 PM, Jef Driesen jefdrie...@hotmail.com wrote:

 On 12/04/10 15:58, Peter Johansson wrote:
 
  Jef Driesen wrote:
  
   On 12/04/10 14:59, Peter Johansson wrote:
   
Also, I would try avoid distributing `version.h', but not
sure how to do that from top of my head.
  
   Why would you not distribute it?
  
  Well, it's a matter of taste, but I see no real reason to
  include it in the tarball.

I think including the generated source in the tarball is wrong
and can break things. The first thing that will break is that you
will have one version.h in srcdir (from the tarball) and
optionally a second one in builddir (from config.status/make).
(config.h also isn't included in the tarball.)

  Yeah, but you don't need autotools to generate `version.h'. You only
  need make, `version.h.in', and `version' of which the two latter are
  already included in the distribution, right?

 True, but make, sed, etc are usually also not available in a
 msvc environment.

What is `your' MSVC environment?
(`ours' does have make, sed etc, for example, because cygwin or
MSYS is needed)

Do you mean you want to build with the MSVC IDE, i.e. not using
make but the built-in magic?
Then you still need some sed or the like to generate the .DSP files
(if you distribute them as well, instead of the rules to generate
them, no one can add files or change compiler options, because
these are encoded in the DSP files).
I think this is in deep contrast to the intention of an automake
src dist (possible, but not recommended, complex and inviting
issues).

 I maintain an msvc project file as a convenience for windows
 developers (I use a mingw cross compiler myself), and there the
 generated files are referenced directly. So if they are missing
 building fails.

Do you want to support the compiler CL.EXE (which does not need
an MSVC project file but can be run via make), or do you want to
support the integrated development environment Visual Studio
instead of using make?

 Generating files in a msvc is possible with custom build rules,
 but it's tricky.

Our packages build with make calling CL.EXE (which needs MSYS), and
besides generating the autogen sources and the binary files, DSP
files are generated, which afterwards allows using the IDE e.g.
for debugging. Even recompilation works (because CFLAGS and
friends are written to the DSP files),
BUT
there are limitations and surprises, so I would not recommend doing
so. make with CL.EXE works fine, but the IDE does not. You already
mentioned issues with fulfilling the dependencies of autogen
sources (IMHO not well [or not at all] supported by MSVC).
In short, all this is a not-really-working hack creating the
illusion that MSVC works, while in fact there are just parts
that `accidentally' work.

You do not need the IDE, DSP files and IDE compilation just for
debugging; the MSVC debugger works both when CL.EXE was called from
MSVC and when it was called from make (here, we often compile under
Linux with CL.EXE, because this is fast when you have a lot of
autogen sources, but then debug on native Windows, because it is
much faster to run MSVC on Windows). You may have to manually browse
for the first source file (the others are found automatically via
the compiled-in relative path names).

 I just checked, and for a resource file (*.rc) in my project
 that is generated from configure.ac, both the .rc.in and .rc
 file are included in the tarball when I run make distcheck.

An AC_CONFIG_FILES([xxx.rc.in]) does not lead to inclusion of
either xxx.rc.in or xxx.rc (and how could it - maybe xxx.rc.in is
automatically generated from xxx.rc.in.in or whatever :)).
Usually, you'd add xxx.rc.in to EXTRA_DIST (and xxx.rc to
BUILT_SOURCES). Then you'd get xxx.rc.in in the src dist but xxx.rc
generated by configure and updated by make, working correctly
thanks to autoconf/automake.
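
A sketch of that arrangement (file names from this thread; the exact
configure.ac wiring is my assumption):

---8===
# configure.ac
AC_CONFIG_FILES([xxx.rc])

# Makefile.am
EXTRA_DIST    = xxx.rc.in
BUILT_SOURCES = xxx.rc
===8---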

oki,

Steffen




Re: Include directive for all generated Makefile.in

2010-04-13 Thread Steffen Dettmer
 Now, I wish to include this rule in every Makefile generated from
 Makefile.in that are themselves generated from Makefile.am.
...
 However, I don't want to add the include instruction in the Makefile.am,
 in fact, I don't want to modify those files at all.

 My question is : is there any way to get the same result by modifying
 another automake's config file, like configure.ac for instance ?

Ah, so you ask whether it is possible to, let's say, use an extended
Makefile(.in) template?
I would also like to learn such a way. It could maybe be used to emulate
custom recursive rules.

oki,

Steffen




Re: Include directive for all generated Makefile.in

2010-04-13 Thread Steffen Dettmer
On Tue, Apr 13, 2010 at 7:53 PM, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 * Xavier MARCELET wrote on Tue, Apr 13, 2010 at 09:38:36AM CEST:
 For example, we could have a couple of macros

 # AM_MAKEFILE_PREPEND([FRAGMENT], [SUBDIR-PATTERN])
 # -
 # Prepend FRAGMENT file to all Makefile.am files matching SUBDIR-PATTERN.
 # Multiple fragments are included in LIFO order.

 # AM_MAKEFILE_APPEND([FRAGMENT], [SUBDIR-PATTERN])
 # 
 # Append FRAGMENT file to all Makefile.am files matching SUBDIR-PATTERN.
 # Multiple fragments are included in FIFO order.

Would it be a possibility instead to `overwrite and
specialize' some macro? If, for example, there were some
logic consisting of OUTPUT_HEADER, OUTPUT_CONTENT and OUTPUT_FOOTER,
could someone overwrite one of them to add some lines?
I guess this could probably be more flexible (but I have no idea whether
this makes any sense in this context).

oki,

Steffen




Re: Building prog first

2010-03-24 Thread Steffen Dettmer
On Tue, Mar 23, 2010, Reuben Thomas r...@sc3d.org wrote:
 On 23 March 2010 10:15, Steffen Dettmer wrote:
  * On Mon, Mar 22, 2010, Reuben Thomas r...@sc3d.org wrote:
   * 2010/3/22 Russell Shaw rjs...@netspace.net.au:
[on this indent level, see at the end]
   poor support for installing interpreted languages,
   and also conversely for build-time compiled programs.
 
   Yes, also for coffee-cooking there is poor support only. :-)

 Sure, but autotools is for building programs, not for making coffee.

Yes, but in the same way someone can argue that it is for compiling
or cross-compiling packages, not for
cross-compiling-and-creating-tools-on-the-fly.
You can create tools by putting them in their own package (which IMHO
is the common case; usually you do not include compiler or bison
sources etc. in the package).

What I wanted to say was that there is a way in which autoconf
supports that (having a package for the needed tools), so I would
not like to pay the additional complexity for a `shorter' way
(which to me even has a bit of the taste of a hack...).

  I don't think build-time compiled C programs should be
  supported while cross-compiling. I think it already is complex
  enough.  Otherwise you would have to do all checks twice and end up
  with many variables with confusing names, and those who are not
  cross-compiling would probably accidentally mix them.

 On the contrary, this is a very useful feature (why should one
 not be able to build host programs when cross-compiling?)

Yes, coffee-cooking also would be a very useful feature (why
should one not be able to have coffee while waiting for the
cross-compilation process?) :-)
SCNR.

Autoconf supports that. Just make a package for the tool and
install it. I know this is inconvenient in your special case.
Also, I don't like too-big package dependencies (it is a pain if someone
must install heaps of packages to get something compiled - if
anyone here disagrees, do an experiment: install a ten-year-old
Linux and then install a recent 3D game on it, or KDE5, or so :-)).

 for which support in autoconf would simplify developers' life
 (even the ad-hoc support in binutils that I mentioned is pretty
 easy to use).

Yes, I see your point.
But it's complex... How do users specify to use a non-standard
compiler with special flags to compile your helper tool?

I though of perl, but (A), i don't like slow tools,
 
  (I think Perl is fast)

 Me too, the above assertion was not written by me! You missed
 the author line at the top from the original author of these
 double-quoted comments.

Yes, I know, and the indent level is correct; sorry for not
including the poster's name (I fixed it this time, hopefully
correctly; gmail threading is not that good, and in my mutt box I
have already deleted the older messages).
  (I didn't write to you but to the list, and I never ever wanted
  to blame or criticise anyone!)

oki,

Steffen




Re: Building prog first

2010-03-23 Thread Steffen Dettmer
On Mon, Mar 22, 2010 at 4:44 PM, Reuben Thomas r...@sc3d.org wrote:
 Not true. automake does not have explicit support for building
 programs with the host compiler when cross-compiling, but I
 have done this successfully in the past when I needed precisely
 to build a program on the host when cross compiling, using
 binutils's BFD_CC_FOR_BUILD macro. It's a pity some version of
 this macro isn't in autoconf, or even autoconf-archive; I shall
 do the latter.

I guess this is a hack and a burden to maintain.

When cross-compiling, why compile a local tool?
Isn't the correct way to compile the local tool natively,
and then use it to cross-compile the package?

 This illustrates a weirdness of autotools: poor support for
 installing interpreted languages, and also conversely for
 build-time compiled programs.

Yes, also for coffee-cooking there is poor support only. :-)

I don't think build-time compiled C programs should be supported
while cross-compiling. I think it already is complex enough.
Otherwise you would have to do all checks twice and end up with many
variables with confusing names, and those who are not
cross-compiling would probably accidentally mix them.

  I though of perl, but (A), i don't like slow tools,

(I think Perl is fast)

  (C), i find making build-programs
  in C much more concise than scripting and i can debug it in ddd/gdb.

You can debug Perl in DDD.

 This is interesting, as it doesn't match mine or
 commonly-reported experience (translating my build-time
 programs from C to Perl made them shorter, easier to read and
 fix, and no slower to run, although I wasn't doing more than
 grepping 15k lines of C and writing some of it back out again).


$ time perl -e \
  'for($n=0;$n<45;$n++) { printf "%08d %60s EOL\n", $n, ""; }' > x

real    0m0.713s

$ cat x.c
#include <stdio.h>
int main(void)
{
    int n;
    for (n = 0; n < 45; n++) {
        printf("%08d %60s EOL\n", n, "");
    }
    return 0;
}

$ time make x
gcc -Wall -Wmissing-prototypes -fstrict-aliasing -D_GNU_SOURCE -ansi
-ggdb  -ggdb  x.c   -o x
real    0m0.076s

$ time ./x > x2
real    0m0.301s


so 713ms vs. 377 ms.

Interesting that up to around 100k Perl is even faster:

$ time perl \
  -e 'for($n=0;$n<10;$n++) { printf "%08d %60s EOL\n", $n, ""; }' > x
real    0m0.167s


$ time make x
real    0m0.081s
$ time ./x > x2
real    0m0.079s


(Of course those figures are far from exact; they just show
how fast perl is: the same order of magnitude as C.)


:-)

SCNR.


oki,

Steffen




Re: Building prog first

2010-03-23 Thread Steffen Dettmer
(OT)

On Mon, Mar 22, 2010 at 11:50 PM, John Calcote john.calc...@gmail.com wrote:
 Reuben, you've just hit upon one of the two most significant
 problems with Javadoc and the like (including doxygen, man
 pages, and info pages):

Sorry, I cannot let this stand, because it would be an excuse for
people - `but we have to use Javadoc, so we cannot document well' -
which is not true (you said this in your point 2, but I have to
stress it :-)).

It is not a problem of the tools, but of the documentation.
When the main pages of Javadoc and Doxygen documentation are well
written, give a good introduction, include examples and reference
important functions, which in turn include example code (often telling
more than 1000 words :-)) and again reference functions often needed
in that context, this can help really a lot.

I think:

1) Someone has to know (learn) the API before starting to use it
   (read the documentation, try examples). If there is no good
   documentation and/or no good examples, it would be great to
   write and contribute some :-)


2) Documentation should be written aimed at the target audience.
   As other software, it must be structured well, easy to read,
   understand and maintain. Usually it must evolve, first time is
   always bloody.
   Also, it should be tested (e.g. reviewed).

I think the problem that often leads to having just documentation like

/**
 * Uses the passed wizard, which must be a Mage, to do the magic.
 */
doMagic(Mage wizard);

is that people agree that documentation is important but didn't
consider well how to do it best. I'm afraid documentation is often
considered something `that also has to be done', quickly on
the side, instead of being considered one of the most important
parts of the software (it's easy to fix a bug when the
documentation makes clear how things should be, but it's hard to fix
documentation when the code behaves oddly).

Well, you all know this, but I could not resist writing it anyway :)

oki,

Steffen




Re: Building prog first

2010-03-22 Thread Steffen Dettmer
* On Sun, Mar 21, 2010 at 10:27 AM, Ralf Wildenhues wrote:
  noinst_PROGRAMS = unimain
  unimain_SOURCES = unimain.c
 
  unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
	./unimain$(EXEEXT) $< > $@

 BTW, execution of built programs like this makes your package unsuitable
 for cross-compilation.  Just so you're aware of that.

Assuming unidata.tab.c is a C-code table containing the
information from UnicodeData.txt, I think it could be better to
generate it with some shell code (maybe inside the Makefile.am,
saving a tool), or to use perl (for the price of adding perl to
the build dependencies), or, if UnicodeData rarely changes, to add
unidata.tab.c to the package and have some `maintainer only'
helper target to build it (with unidata.tab.c as a distributed
source file). People who don't care about unidata.tab.c can then build
the package even without UnicodeData.txt (if this makes any
sense; I don't know what this is for, of course :)).
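
A sketch of the perl variant (gen-unidata.pl is a hypothetical generator
script; $(PERL) is assumed to be found by configure):

---8===
unidata.tab.c: $(srcdir)/gen-unidata.pl /usr/share/unicode/UnicodeData.txt
	$(PERL) $(srcdir)/gen-unidata.pl /usr/share/unicode/UnicodeData.txt > $@
===8---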

oki,

Steffen




Re: build .o files to specific directory using automake

2010-03-17 Thread Steffen Dettmer
   Usually makefile generated by automake will compile each
   source file and output .o file in the same directory of the
   source file. How to let automake output .o files to a
   specific directory at the same time savely link them to my
   program/library?

 * scleung wrote on Wed, Mar 17, 2010 at 03:29:15AM CET:
  I have searched some related threads and seems that it won't
  work like this way.
  All suggestions were using VPATH build.

What is the problem with this?

  So does it possible to change the current directory in
  configure script to specify the builddir?

 Yes, by calling the configure script *from* the builddir-to-be,
 i.e., with the builddir as current directory.

And it is easy!

If you have `mypackage' with configure.ac, Makefile.am, sources
etc. then just simply do:

$ cd mypackage
$ mkdir Debug
$ cd Debug
$ ../configure --enable-debug && make all check

and you can build a second configuration:
$ cd ..
$ mkdir Release
$ cd Release
$ ../configure --disable-debug && make distcheck

This is lovely, isn't it? :-)

oki,

Steffen




Re: Baked-in paths

2010-03-15 Thread Steffen Dettmer
On Sun, Mar 14, 2010 at 11:29 PM, Reuben Thomas r...@sc3d.org wrote:
 I have a C program which loads some scripts at runtime. These are
 stored in datadir (e.g. /usr/local/share/prog). But I also want to be
 able to run the program from its build directory.

We have some similar situations but failed to find one solution
working well in all cases.
For unit tests, we sometimes (run-time) re-configure the test
object. This might be as simple as:

  static const char *configPath_ = "/usr/local/share/prog";
  const char *getConfigPath(void) { return configPath_; }
  void setTestConfigPath(void) { configPath_ = "."; }

If the application has something like:

int main(int argc, char *argv[]) { return appXyzMain(argc, argv); }

the test can call appXyzMain with an argv containing the `--data-dir=.' option.

Then there are cases where it seems suitable to have a list of
directories to search:

  const char *paths[] = { "./tests/", "/usr/local/share/", "/usr/share" };

Some tools (helper scripts) even `guess' whether they run from a
sandbox (e.g. by checking whether a file named Makefile exists in the same
directory, or by comparing $argv[0] with @prefix@; but all this seems
to be tricky and surely not portable).

oki,

Steffen




Re: split check target into check and test targets

2010-03-02 Thread Steffen Dettmer
On Wed, Feb 24, 2010 at 7:17 PM, John Calcote john.calc...@gmail.com wrote:
 Alexander's solution is great, though. I'm going to use that one myself.

For this, you'd need to change all Makefile.ams, and it doesn't work
recursively...

What about having

  AC_SUBST(TESTS)

in configure.in and running:

  $ make check TESTS=

to skip test execution?

 Additionally, if I want to build a particular check program (perhaps as I'm
 working out the compiler errors, but before I'm ready to actually run the
 tests), I just type make check-program-name from that directory.

You just have to remember to add `.exe' on cygwin and MSYS,
especially within scripts / make rules ($(EXEEXT));
otherwise (at least GNU) make uses a default built-in rule
to compile and link check-program-name, typically with
different LIBS. The compiler command line looks plausible but
fails and confuses people :-)
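
A sketch of what I mean (the program name taken from above; a hand-written
convenience rule should spell out $(EXEEXT)):

---8===
check_PROGRAMS = check-program-name

run-one-test: check-program-name$(EXEEXT)
	./check-program-name$(EXEEXT)
===8---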

oki,

Steffen




Re: Problem of: variable `main_SOURCES' is defined but no program or library has `main' as canonical name (possible typo)

2010-03-02 Thread Steffen Dettmer
 bin_PROGRA*M*S = main

ahh great, so it caught the typo :-)

oki,

Steffen




Re: distcheck and canonical_*

2010-03-02 Thread Steffen Dettmer
On Fri, Feb 26, 2010 at 2:55 PM, NightStrike nightstr...@gmail.com wrote:
 When doing a make distcheck, why is for instance the --host option not
 propagated to configure without explicitly setting
 DISTCHECK_CONFIGURE_FLAGS?

Erm... doesn't --host enable cross-compiling?
And when cross-compiling, make check always fails with some
`cannot execute binary' error or so, so distcheck would always fail?

Or am I missing something?

oki,

Steffen




Re: advice for pre-generating documentation

2010-02-12 Thread Steffen Dettmer
On Thu, Feb 11, 2010 at 5:08 PM, Gaetan Nadon mems...@videotron.ca wrote:
 generated using tools such as doxygen, asciidoc, xmlto, groff, and
 ps2pdf. I can state some reasons why generated docs and included in the
 tarball:

This is interesting and many seem to agree here, but I think this is wrong.
In my team we made bad experiences and now have the rules that:
- a file either is autogen OR in CVS/SVN but not both
- a file either is generated by make OR in srcdist
- generated files (doxygen, libs) are in bindist only
If someone wants to read the compiled documentation (be it HTML or
PDF) or just use any other compiled object (such as a lib), he should
use a bindist or a webbrowser.

We use doxygen is most cases only for API documentation (I think this
is the normal case), but our developers here usually seem never to
read the HTML doc but read the .h (.java...) files.

Including documentation may also be wrong when it is conditional, but
this might be a special case.

With CVS snapshots or the like, let's say you check out trunk HEAD and
run make dist, the generated documentation also might be invalid, for
example because required Change History information may not be filled
in yet. I think typically someone can expect valid documentation only
from released versions.

I think with doxygen it is difficult to get the dependencies right; I
don't know if it even works, so how do you update the documentation?
Are you re-generating it when running make dist?

 1) convenience
 2) the target platform building the tarball does not have the tool
 3) the target platform building the tarball has the tool, but at the
 wrong version
4) the target may have the tool but different (compatible or
incompatible) dependencies, leading to different files being generated
(might be less important for documentation).

 3) unconditionally add html generated files in EXTRA_DIST: this will
 cause 'make dist' to fail if the platform creating a tarball does not
 have the doc tool.

So in each srcdist you include all the HTML, PDF, man pages and
whatever files generated, right? Is this the intended automake way?
I just looked at some arbitrary lib and it has 8 MB sources but 20 MB
doc (just HTML+PDF), so it would bloat up srcdist a lot...
How to avoid to include outdated documentation?

EXTRA_DIST = all the html and doxygen files listed here

Does this mean to list all files individually here? In my example case
I had to list 1422 files, wow...
But how to maintain that? If anyone documents some struct somewhere, a
new file will be generated, but how to know what to add to EXTRA_DIST?
Do you create/use some include filelist.mak or so?
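
(For illustration, an untested sketch of what I imagine, with made-up
names; `docfiles.mak' would itself be regenerated by some small find
script over the doxygen output:)

---8===
# Makefile.am
include docfiles.mak            # autogenerated, defines $(docfiles)
EXTRA_DIST = $(docfiles)
===8---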

oki,

Steffen




Re: advice for pre-generating documentation

2010-02-12 Thread Steffen Dettmer
On Thu, Feb 11, 2010 at 6:06 PM, Andreas Jellinghaus a...@dungeon.inka.de 
wrote:
 also I wonder:
 what about builddir vs. sourcedir? how do you handle that?
 does automake handle that automaticaly?

make does handle it (at least GNU Make, I don't know others):

If you have in Makefile.am let's say:

EXTRA_DIST = sourcefile.c sourcefile.o
test: sourcefile.c sourcefile.o
	@echo $^

it will work even if sourcefile.o is generated from sourcefile.c;
both will be in srcdist. Make will set things like $^ correctly
because VPATH is considered, that means `make test' tells e.g.:

stef...@host:/tmp/steffen/.../build/i386-gnulinux/test # make test
../../../../ccomm/test/sourcefile.c sourcefile.o

  -- cool aint??? :-)

but I would never put generated sources into EXTRA_DIST, because in
the dist of this example you would find:

.../test/sourcefile.c
.../test/sourcefile.o

then if you compile from that dist with builddir != srcdir and
deps enforce generation of sourcefile.o you end up with 2
sourcefile.o files: one in srcdir and other in builddir, which
(IMHO) must never ever happen.

oki,

Steffen




Re: How to handle data/script files in a VPATH build ?

2010-02-12 Thread Steffen Dettmer
Hi Ralf,

thanks again for your helpful message. It is interesting how many
mistakes (non-portable constructions) can be in such a small
snippet I wrote. Thanks for spotting them.

On Wed, Feb 10, 2010 at 9:29 PM, Ralf Wildenhues wrote:
 * Steffen Dettmer wrote on Wed, Feb 10, 2010 at 11:01:34AM CET:
 module=xyz
 _mak=${module}.mak
 echo "# Autogenerated by $0" > ${_mak}
 echo "${module}files = \\" >> ${_mak}
 find directory1 directory2 directory3 -type f \
   -not -path '*/CVS*' \

 With find, -not is not portable, but ! is (suitably escaped when used in
 an interactive shell); -path is not portable either.

ohh, how bad...
I've read a bit about find in `info find' (I have no `info
findutils' page). It does not tell much about portability, though
it tells a lot otherwise... (but I didn't find how to write -path
correctly; fortunately we can require a GNU find :-)).

Just out of curiosity: when writing portable packages
(i.e. packages compiling ON many platforms), on which platform is
it recommended to do so? GNU/Linux is great for working
because it has all the efficient tools, but bad for testing,
because it is too powerful and everything works :)
Or is there some `mini linux' or the like that uses e.g. busybox
find etc.? Then someone could have a virtual machine to test with.

-not -path '*/*.sw?' \
-exec echo {} \\ \; \

 POSIX find only allows you to use {} as a single argument IIRC; you can
 just pass arguments to echo separately here: `-exec echo {} \\ \;'.

ohh interesting, thanks. Yes, if I look to the right place
(http://www.opengroup.org/onlinepubs/9699919799/utilities/find.html)
and understand correctly, you do remember correctly:

Historical implementations do not modify {} when it appears
as a substring of an -exec or -ok utility_name or argument
string.  There have been numerous user requests for this
extension, so this volume of POSIX.1-2008 allows the desired
behavior. At least one recent implementation does support
this feature, but encountered several problems in managing
memory allocation and dealing with multiple occurrences of
{} in a string while it was being developed, so it is not
yet required behavior.

   | sort >> ${_mak}

 With sort, you should always normalize the locale, i.e.,
  LC_ALL=C sort

Is having `LC_COLLATE=POSIX' also sufficient and correct?
(we have this in /etc/profile). But I added `export LC_ALL=C' to all
those scripts to be sure :)

 Well, don't look at the GNU find(1) manpage if you're looking for
 portable options only.  That's what POSIX/SUSv3 is for; the findutils
 info pages are more verbose about portability, too.

How do I get the findutils info pages? Is this `info find' or is
there another one?

oki,

Steffen




Re: How to handle data/script files in a VPATH build ?

2010-02-08 Thread Steffen Dettmer
On Sat, Feb 6, 2010 at 6:56 PM, ralf.wildenh...@gmx.de wrote:
 [...]
 data-in-build-tree: data-in-source-tree
 	cp $(srcdir)/data-in-source-tree data-in-build-tree

We typically write something like:

# file /must/ be in current dir (builddir) for proprietary tool:
__heap.o: lib/__dfl_heap.o
	cp -f $< $@

Make finds $(srcdir)/lib/__dfl_heap.o via VPATH, if any, and
sets `$<' accordingly
  (of course this won't work if in the above example the
  filenames (including path parts, if any) of data-in-source-tree
  and data-in-build-tree are exactly the same).

In case this is portable (is it?), I think it could be a touch better?

oki,

Steffen




Re: Creating a partial library

2010-02-03 Thread Steffen Dettmer
On Wed, Feb 3, 2010 at 8:33 AM, John Calcote john.calc...@gmail.com wrote:
 (PIC-based static only) library is to use the noinst prefix. But libtool
 can be used to manually install a convenience library, so you could use
 libtool to do this in an install-exec-local rule in the Makefile.am file
 that builds (for instance) libhello.a (untested):

 install-exec-local:
 	libtool --mode=install ./install-sh -c libhello.a \
 	  $(DESTDIR)$(lib)/libhello.a

 This example came from the libtool manual (modified slightly for Automake
 context).

ohh this is interesting. Isn't this breaking `make uninstall' and
thus `make distcheck'?  Would it be possible (better/suited/correct)
to have some lib_LIBRARIES=libother.a with a custom build rule
that simply copies the file? Then make install/uninstall could
work, but maybe this breaks other things?
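
Something like this untested sketch is what I mean (made-up names; no
idea whether automake accepts the hand-written rule for the library
target here):

---8===
lib_LIBRARIES = libother.a
libother_a_SOURCES =
# build the installable lib by simply copying the convenience lib:
libother.a: libhello.a
	cp -f libhello.a libother.a
===8---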

oki,

Steffen




Re: silent installs

2010-02-02 Thread Steffen Dettmer
* John Calcote wrote on Fri, Jan 29, 2010 at 14:22 -0700:
 On 1/29/2010 10:17 AM, Steffen Dettmer wrote:
 Why do passenger train windows have curtains?

 Okay - I can't help it! I bet the engineer's windows don't have
 curtains.

:-)

I think we have to accept that there are different requirements
and use cases. For example some teams may have a few build-stuff
maintainers but many developers and don't change build rules
often - so in 95% of the builds done for 90% of the people it
does not matter what the building does; only whether make check
returns success matters. They may want to have a clean `make -s
distcheck' output. Best might be module name and two progress
bars while building and a big red (screen filling) error message in
case anything fails (then, someone can check without using `-s').

In the team I work in it is common to let *conf* run in the
background, because most people do not watch boring output quickly
scrolling by for 15-45 minutes (happens when version numbers were
increased, forcing a rebuild of many files, of course).

Others may deal with integrating packages almost all the day,
they are focused to the building and may be used to work and
almost any problem they met while doing so is related to
building, and thus they always want to see the output.

So I think whether to pass `-s' to make is up to the user who
calls it. If `-s' is passed, I think it is obvious that make
install shouldn't tell much (anything). For those who dislike it
this is no problem: just do not pass `-s' :-)

oki,

Steffen




Re: silent installs

2010-02-02 Thread Steffen Dettmer
On Sat, Jan 30, 2010 at 2:57 PM, Joakim Tjernlund wrote:
 * Ralf Wildenhues ralf.wildenh...@gmx.de wrote on 2010/01/30 00:34:17:
  First off, `make -s' is both POSIX and portable.
  Conceptually, `make -s' has nothing to do with the
  `silent-rules' option that recent

 Exactly, and I am asking for autotools/libtool not to output
 anything that isn't a real warning/error when when -s is passed to make.
 After all, it is custom that stderr is reserved for errors/warnings only.

BTW, isn't it even common to not print debug / trace / notice
messages /at all/ by default?

I thought there was some `be quiet on execution but verbose on
error' principle, but with google I cannot find it anymore...

I think make is an exception here. Many tools don't tell anything
in positive case (gcc, ld, install...).

BTW, is there a way (except -n) to make GNU Make show
commands starting with '@'? Not even make -d shows them, which
IMHO is not very convenient...

oki,

Steffen




Re: doxygen in .h files and make tags

2010-02-02 Thread Steffen Dettmer
On Fri, Jan 29, 2010 at 11:39 PM, ralf.wildenh...@gmx.de wrote:
 * Steffen Dettmer wrote on Fri, Jan 29, 2010 at 02:10:16PM CET:
  here we use doxygen to comment functions in the .h files.
  When using make tags, tags for the definitions but not for
  the declarations are generated. In case of own functions
  this is great (you jump to the implementations when analysing
  code) but in other cases it is not and someone may want to
  see the documentation.
 
  What best practices exist here?

 You can install Exuberant Ctags and let it tag declarations for
 you, too (use the --LANG-kinds= option).

ohh indeed, `make ctags' generates tags_c that also contains refs
to the .h files by default!

 I think cscope can parse declarations as well; git Automake
 provides a 'make cscope' rule.

stef...@raven:/tmp/steffen_exp/new-autotools/systest_exp # make cscope
make: *** No rule to make target `cscope'.  Stop.
stef...@raven:/tmp/steffen_exp/new-autotools/systest_exp # grep -i scope Mak*
stef...@raven:/tmp/steffen_exp/new-autotools/systest_exp # head -1 Makefile
# Makefile.in generated by automake 1.11.1 from Makefile.am.
stef...@raven:/tmp/steffen_exp/new-autotools/systest_exp #

What am I doing wrong?
(personally I don't know cscope, but a teammate is using it)

 I don't think either of those distinguish between functions
 that you also define and those that you don't define, but why
 should you declare functions from third parties?

Typically, when the headers do not compile, clash, use wrong
types, are missing `const' qualifiers or generate warnings (if
files should not be changed, and only one or very very few
declarations of it are needed).

But back to `make tags', how it is intended to be used, should
users have something like `export ETAGS_ARGS=--declarations' in
their ~/.profile? But this does not work (does not influence
$(ETAGS_ARGS)).

Is there something like `./configure ETAGS_ARGS=--declarations'
(which unfortunately does not work with autoconf-2.65)?

Using `make ETAGS_ARGS=--declarations' works but is inconvenient.
How to set it best?
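
(A hedged sketch of the per-package alternative; as far as I know
automake lets Makefile.am set ETAGS_ARGS, and a `make ETAGS_ARGS=...'
on the command line would still override it:)

---8===
# Makefile.am
ETAGS_ARGS = --declarations
===8---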

oki,

Steffen




Re: Installation of third-party SUBDIRS

2010-02-02 Thread Steffen Dettmer
On Mon, Feb 1, 2010 at 8:36 PM, matwey.korni...@gmail.com wrote:
 I use a couple of third-party libraries in my software. I use SUBDIRS
 variable in my Makefile.am and AC_CONFIG_SUBDIRS in my configure.in. How to
 suppress installation of SUBDIRed projects? I just use they for static
 linkage with my binary and don't need third-party headers and libraries
 installed.

We usually install by scripts and have one or a few own modules
(let's say one mypkg but many libraries we won't install) and
simply use `make -C mypkg install'. In some systems we have an
`install:' rule invoking something like
for d in */pkg/inst ; do make -C $d install || exit 77; done
or so.
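
Spelled out as a rule in a (non-automake) top-level Makefile, that
would roughly be (untested sketch; note the doubled `$$d' so that make
passes a single `$' to the shell):

---8===
install:
	for d in */pkg/inst ; do $(MAKE) -C $$d install || exit 77; done
===8---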

oki,

Steffen




Re: silent installs

2010-01-29 Thread Steffen Dettmer
On Fri, Jan 29, 2010 at 9:21 AM, Ralf Corsepius rc040...@freenet.de wrote:
 Silent make rules are harmful:
 - Bogus defines []
 typically do not show up as compiler warnings or errors.

Could you please explain that?
Here, most either use make from vim/emacs and use $EDITOR as error
message parser or use make -s because without -s in recursive make
and/or bigger projects error messages and warnings are hard to see in
thousands of lines of console output.

 Silent building is only appropriate when a user knows what he is doing and
 when explicitly asking of it.

typing make -s is explicitly asking, isn't it?

 When getting used to doing so rsp. when making
 silent make-rules the default, packages tend to gradually rott, because bugs
 tend to slip through unnoticed.

I think he asked that make -s install should be less verbose than it
is now (currently, the install command lines show up).

oki,

Steffen




doxygen in .h files and make tags

2010-01-29 Thread Steffen Dettmer
Hi,

here we use doxygen to comment functions in the .h files. When using
make tags, tags for the definitions but not for the declarations are
generated. In case of own functions this is great (you jump to the
implementations when analysing code) but in other cases it is not and
someone may want to see the documentation.

What best practices exist here?

Steffen




Re: silent installs

2010-01-29 Thread Steffen Dettmer
On Fri, Jan 29, 2010 at 3:26 PM, Ralf Corsepius rc040...@freenet.de wrote:
 On 01/29/2010 02:05 PM, Steffen Dettmer wrote:
 Could you please explain that?

 Example: Compling a package under linux

 configure --prefix=/usr 
 ...
 gcc -DCONFDIR=/foo/bar -DIRIX ...

 Using silent make rules you will not notice the bogus -DCONFDIR at
 compilation time. If you're providing package binaries your users will
 likely encounter run-time errors.

why is CONFDIR bogus? How should a user notice it? You mean a user
should know that this package isn't evaluating the CONFDIR #define,
or that it is a spelling error, or so? What runtime errors do you mean?
We use for instance -D_REENTRANT, but why would anyone want to see it
when using make -s? There are hundreds of other defines that could
be wrong in sooo many files :-)

 Whether -DIRIX will cause problems would depend on a package's details.
 It's not unlikely compilation will succeed but use source-code which wasn't
 intended to be used under Linux.

Maybe the user wants to enable his Internationalized Resource
Identifier eXtension?

 typing make -s is explicitly asking, isn't it?

 With gnu make, yes. But is it portable to other makes?

good question. I don't know. I pass -s only to GNU make (mostly
because I use only GNU make :-)).

Steffen




Re: silent installs

2010-01-29 Thread Steffen Dettmer
On Fri, Jan 29, 2010 at 5:21 PM, Bob Friesenhahn
bfrie...@simple.dallas.tx.us wrote:
 Regarding silent installs: Why do passenger trains have windows?

Why do passenger train windows have curtains?
SCNR :)

oki,

Steffen




Re: Shared Libraries

2010-01-27 Thread Steffen DETTMER
* Philip Herron wrote on Wed, Jan 27, 2010 at 16:22 +:
 Hey
 
 Thanks sorry about that feel a bit stupid now, but i didn't know it
 was as simple as that i though you needed pkg-config setups to get
 correct linking strings. Is it really as simple as
 /usr/local/lib/libbla.so link against it with the -lbla.

Isn't -lbla linking /usr/local/lib/libbla.a (.a not .so)?

oki,

Steffen




Re: libtool issue in a Makefile.am reference

2010-01-25 Thread Steffen Dettmer
Hi!

On Mon, Jan 25, 2010 at 8:25 AM, Murray S. Kucherawy m...@cloudmark.com wrote:
 I've got a package that first builds a library and then a
 binary that links to the library.  The binary build references
 it via:

 progname_LIBADD = ../libdirectory/libraryname.la

I'm not sure if we do it correctly, but we'd set
LIBS="-L$(top_builddir)/libdirectory -lraryname"
in such cases (in configure; ohh, and we don't use libtool).
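
(For illustration only, a per-program Makefile.am variant of the same
idea, untested and with the made-up names from above; that way only
progname picks up the library instead of every link in the package:)

---8===
progname_LDADD = -L$(top_builddir)/libdirectory -lraryname
===8---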

oki,

Steffen




Re: How to use install-data-local conditionally with automake-1.6.3

2010-01-22 Thread Steffen Dettmer
(OT)

Hi Ralf!

On Thu, Jan 21, 2010 at 9:48 PM, ralf.wildenh...@gmx.de wrote:
 * Steffen Dettmer wrote on Thu, Jan 21, 2010 at 02:15:57PM CET:
 Perl 5.006 required--this is only version 5.00503, stopped at -e line 1.

 I'm really not interested in bug reports against 8-year-old 1.6.3.
 Perl 5.6 is ten(!) years old, if you can't update at least your
 development environment to that, then I'm afraid I cannot help you.

:)
Yes, of course you are absolutely right.
I came with an extreme example of course :)
Yes, this ancient machine installation is 10 years old. And it is still
running as on its first day :) Great, isn't it? Close to the
definition of stability :)
Yes, it still works, so why change.
(ok, ok, you mentioned many many very good reasons why :))

As soon as some time is available I'll try to build a few systems
with a recent version.

Maybe, if it does no harm, and I guess it shouldn't, support for
1.6.3 can still remain by writing Makefiles that work with 1.6.3 up to
1.11 and newer. At least that would be the ideal.

 Issues with updating to newer Autoconf and to newer Automake usually
 require you to go through their respective NEWS files and addressing
 documented incompatible changes.  The rest should be things that were
 never well-defined.

I'm afraid we were sometimes a bit ... erm... pragmatic when writing
*.mak files...
We have some libraries in maintenance for more than ten years. I
think it is great that they still work with only a few changes. In the
meantime, I guess, several IDEs were in and now are out and
forgotten. Java changed half the language, at least :)
(Personally, I really like stable things. I think, if the current
state-of-the-art technology moved a little bit slower, benefits
could arise and the world perhaps would see less crap and fewer
one-hit-wonder apps :) I love that I can develop on a 10-year-old
linux host. I hate that I cannot surf the web with a 3-year-old
browser and cannot update a 2-year-old linux...)

 Ohh, and automake-1.6.3 is much faster than 1.11, right? :-)

 If you have issues with the speed of newer Automake releases, then
 show me your package build system setup.  We may find bottlenecks in
 Automake code or things to do better in your build system.

Ohh thank you. Your support and offer are outstanding and stunning.
Yes, you are right, probably automake performance is (almost) no
issue. Other optimisations surely help more (for example, I think
using fewer Makefiles would speed up a lot).

 OTOH, many rules that new automake generates are faster than those of
 older automake.

ohh good point, too.

oki,

Steffen




Why variable `FOO' not defined even if not needed? What to do instead?

2010-01-21 Thread Steffen Dettmer
Hi,

in an included *.mak a file is created and added to `mydir_DATA'.
The including (super-/caller) Makefile should be able to change
the default of this file name. If the Makefile takes no action
(except directly or indirectly including this *.mak), the
defaults should be used.

To ease maintenance, the order of inclusion of *.mak should not
matter.

We came closest to this goal by using:
TO_BE_USED=$(firstword $(OVERWRITE) default)

Makefile.am:

---8===
LIST=$(firstword $(PREPEND) default)
test_DATA=$(LIST)
testdir=.
testlist:
	@echo $(LIST)
===8---

automake fails:
test/Makefile.am:24: variable `PREPEND' not defined

but `make -f Makefile.am testlist' works:
default




This try fails:
---8===
LIST=2
LIST?=default
testlist:
	@echo $(LIST)
===8---

automake: test/Makefile.am:30: bad macro name `LIST?'


but `make -f Makefile.am testlist' works:
2

This also has the disadvantage that, because of ?=, the value
depends on the include order of the *.mak helpers.
But with the first approach the inclusion order does not matter, even
if PREPEND (or OVERWRITE) is defined later than LIST, i.e. Make can do:

---8===
LIST=$(firstword $(PREPEND) default)
PREPEND=overwritten_later_but_works! (only firstword, so _ used :))
testlist:
	@echo $(LIST)
===8---


# make -f Makefile.am testlist
overwritten_later_but_works!

This also works with automake as long as PREPEND is defined:

---8===
test_DATA=$(LIST)
testdir=.
LIST=$(firstword $(PREPEND) default)
PREPEND=overwritten_later_but_works! (only firstword, so _ used :))
testlist:
	@echo $(LIST)
===8---

make: *** No rule to make target `overwritten_later_but_works!', needed
by `all-am'.  Stop.

so it correctly tried to build `overwritten_later_but_works!'.

With `if COND', configure.in must know about COND, even if it isn't
used (i.e. not set), because of
`Makefile.am:31: COND does not appear in AM_CONDITIONAL'.


Because I didn't know how to handle it, I added a PREPEND= at the
end of all Makefile.am files with that problem, but I think there is a
better way to achieve this?

Is there a way to achieve the goal? What would be best to handle
this?


oki,

Steffen




Re: How to use install-data-local conditionally with automake-1.6.3

2010-01-21 Thread Steffen Dettmer
On Wed, Jan 20, 2010 at 10:20 PM, ralf.wildenh...@gmx.de wrote:
 I agree that it's awkward.

autoconf  automake are the best and most powerful build tools I know.
It's not too easy to learn but luckily there is free support on
mailinglists :-)

 A simple way to avoid the warning is to do the install-data-local rule
 addition unconditionally and declare the feature rules in both
 conditions:

 # these rules could be in some include `featureinst.mak':
 install-data-local: myinstfeature
 if FEATURE
 myinstfeature:
 	touch $(DESTDIR)$(prefix)/feature
 else
 myinstfeature:
 endif

ohh that easy it is?
For some reason I never had this idea... mmm...

Thanks a lot!

 I've only tested that with current Automake now, though, but I think it
 should work with older.

I tested with 1.6.3 and found it working :)

 BTW, you should really update to a newer Automake version.

I'm afraid that this won't be that easy:
./automake-1.11.1 # ./configure
Perl 5.006 required--this is only version 5.00503, stopped at -e line 1.
configure: error: perl 5.6 or better is required; perl 5.8.2 or better
is recommended.

:-)

well, and I'm afraid our Makefile.am files could not work with recent
automake versions because we used invalid constructions (because
we didn't know better, or because of obsolete but still active
automake-1.4 workarounds). At least we have issues with recent
autoconfs (I mean, the issues are with our scripts of course but
now get spotted).

Ohh, and automake-1.6.3 is much faster than 1.11, right? :-)

 We've recently fixed a security-related issue in the 'dist' and
 'distcheck' rules, that is present in all older Automake versions.

ohh yes, this is a nasty one...
Thanks for bringing it to my attention.

oki,

Steffen




Re: How to use install-data-local conditionally with automake-1.6.3

2010-01-21 Thread Steffen Dettmer
On Wed, Jan 20, 2010 at 10:20 PM, ralf.wildenh...@gmx.de wrote:
 * Steffen wrote here:
  but it does create Makefile (BTW, isn't this a bug?)
 Yeah, this is a bug...

Ohh this is fixed at least with 1.10: if automake aborts, the old
Makefile is preserved and automake is run again on next make. So
this error cannot be bypassed any longer by running `make || make'.
(So a bug just in my old version, sorry).

oki,

Steffen




How to use install-data-local conditionally with automake-1.6.3

2010-01-20 Thread Steffen DETTMER
Hi,

I hope I don't ask a FAQ, but I didn't find the answer in the web
so I decided to ask here :)

If there is a complex installation thing `foo', like extracting
some archive:
  tar xzf $(srcdir)/$(requiredfiles) -C $(DESTDIR)$(requiredfilesdir)/FILES
and a conditional complex `feature', someone could write the
following invalid Makefile.am:

---8===
install-data-local: myinstbase

uninstall-local: myuninstbase

myinstbase:
	mkdir -p $(DESTDIR)$(prefix)
	touch $(DESTDIR)$(prefix)/foo
myuninstbase:
	rm -f $(DESTDIR)$(prefix)/foo
	-rmdir -p $(DESTDIR)$(prefix)

# these rules could be in some include `featureinst.mak':
if FEATURE
install-data-local: myinstfeature
uninstall-local: myuninstfeature
myinstfeature:
	mkdir -p $(DESTDIR)$(prefix)
	touch $(DESTDIR)$(prefix)/feature
myuninstfeature:
	rm -f $(DESTDIR)$(prefix)/feature
	-rmdir -p $(DESTDIR)$(prefix)
endif
===8---

automake-1.6.3 tells:

---8===
test/Makefile.am:1: install-data-local defined both conditionally and unconditionally
test/Makefile.am:3: uninstall-local defined both conditionally and unconditionally
make: *** [Makefile.in] Error 1
===8---

but it does create Makefile (BTW, isn't this a bug?) and it seems
to work correctly.

Could someone please tell why automake explicitly catches this
and how to do this correctly?

Ideally myinstfeature would be defined in some feature.mak (to
allow to reuse it across several Makefile.am):

---8===
install-data-local: myinstbase

uninstall-local: myuninstbase

myinstbase:
	mkdir -p $(DESTDIR)$(prefix)
	touch $(DESTDIR)$(prefix)/foo
myuninstbase:
	rm -f $(DESTDIR)$(prefix)/foo
	-rmdir -p $(DESTDIR)$(prefix)

include feature.mak
if FEATURE
# enable by pulling in deps:
install-data-local: myinstfeature
uninstall-local: myuninstfeature
endif
===8---


feature.mak:
---8===
myinstfeature:
	mkdir -p $(DESTDIR)$(prefix)
	touch $(DESTDIR)$(prefix)/feature
myuninstfeature:
	rm -f $(DESTDIR)$(prefix)/feature
	-rmdir -p $(DESTDIR)$(prefix)
===8---

How to approach this correctly?

oki,

Steffen


 