Re: How do I build debug versions?

2001-03-07 Thread Guido Draheim

Alexandre Duret-Lutz wrote:
 
  "Dean" == Dean Hoover [EMAIL PROTECTED] writes:
 
 [...]
 
  Dean Another thing you could do is make multiple build directories
  Dean and always make in a certain way in them. In other words, you
  Dean could:
 
  Dean mkdir debug-build opt-build
  Dean cd debug-build; $top_srcdir/configure; make debug; cd ..
  Dean cd opt-build; $top_srcdir/configure; make; cd ..
 
 PFE (http://pfe.sourceforge.net/) seems to setup things in a
 way which is close to Shameek's request (it uses Debug/$arch/ and
 Release/$arch/ build directories).  Maybe you can draw some
 ideas from that package.
 

Well, there are a lot more weird things beyond the needs of most people,
buried under some weird assumptions; e.g. here's a little snippet from
the configure.in, it's so convenient ;-)

case `pwd` in  # implicitly add -DNDEBUG if in path with Release
  */Release/*) OPTIM="-DNDEBUG $OPTIM" ;;
  */Debug/*)   DEBUG="-DDEBUG  $DEBUG" ;;
esac
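In practice, with that snippet in configure.in, separate Debug and Release build trees of the very same sources pick their flags from the path name alone - roughly like this (the $arch directory names are illustrative):

```sh
# one optimised and one debug build tree of the same sources;
# the case on `pwd` above adds -DNDEBUG or -DDEBUG automatically
mkdir -p Release/i386-linux Debug/i386-linux
( cd Release/i386-linux && ../../configure )   # path matches */Release/*
( cd Debug/i386-linux   && ../../configure )   # path matches */Debug/*
make -C Release/i386-linux ; make -C Debug/i386-linux
```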

anyway, there are indeed multiple versions in different subdirectories
of the very same sources, so it may possibly give a few pointers. It
did take some time to figure out how to do it correctly, well, now it 
seems to be rock stable :-))

cheers,
-- guido http://PFE.sourceforge.net
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)+++ y++ 5++X-





crosscompiling dll linux-mingw32

2001-04-26 Thread Guido Draheim


hi everyone,

I am still trying to crosscompile a dll on linux with the new
autotools series. Currently I use cvs-autoconf, automake-1.4d
and libtool-1.4 on top of libsdl.org/Xmingw32 cross-tools.

I did just need to change a single line in ltmain.sh which 
enabled me afterwards to actually *build* a dll.

--- ltmain.sh.1.4   Wed Apr 25 01:35:10 2001
+++ ltmain.sh   Thu Apr 26 03:02:17 2001
@@ -2330,7 +2330,7 @@
if test $allow_undefined = yes; then
  if test $allow_undefined_flag = unsupported; then
$echo $modename: warning: undefined symbols not allowed in $host shared 
libraries 1>&2
-   build_libtool_libs=no
+   build_libtool_libs=yes
build_old_libs=yes
  fi
else

Also I noticed that in `configure` $lt_target is not set
anymore (I have a line to extra-delete -lm from $LIBS). With
these little changes, crosscompiling a dll actually works.

now for the more interesting stuff ;-)

There is also a little exe-program that uses that dll.
But make install will fail at that step. The package
is called pfe and I thought to create a pfe.exe that
uses a pfe.dll in the system. It turns out that things
are a bit different; anyway, here are the files being built:

pfe
.libs/pfe.exe
libpfe.la
.libs/libpfe-0-30-96.dll

The make install will make a copy of the dll including
the lib-prefix and version-number, no pfe.dll is created. 
Well, I am not sure whether that is the intended behaviour,
I'd expect something different. Anyway, the real bad day
comes from the fact, that the pfe.exe is *not* copied, and
looking into the Makefile(.in) showed, that there are *two*
lines of bin_PROGRAMS ... the last one said pfe$(EXEEXT).
The install-binPROGRAMS target will check for the existence
of the file to be installed, but pfe.exe does not exist in 
the base builddir, only in the .libs build subdir. Removing
the second bin_PROGRAMS line, a `libtool install pfe` is
entered, which will barf at me saying that there is no
.libs/pfe to be found. Oh no! Well then, I made a copy of
.libs/pfe.exe to a version without the .exe, and finally
the make install did complete, creating a bin/$host-pfe.exe

Can anyone explain this behaviour... and does anyone have an idea
how to fix it?

as a side-note: the .exe works fine, as long as the .dll
is copied over from ./lib to ./bin - the i*-pfe.exe seems to
be dependent on libpfe-0-30-96.dll. After resolving the
usual win problems with exported non-function items, I was
even able to build a libtool -module .dll, but I have not
come so far as to dynaload it. Actually, there is a good chance
that the new autotools series will be able to compile a
dll with a linux-based crosscompiler, or so it seems.

cheers,
-- guido http://guidod.4t.com
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)+++ y++




Re: crosscompiling dll linux-mingw32

2001-04-26 Thread Guido Draheim

Alexandre Oliva wrote:
 
 On Apr 26, 2001, Guido Draheim [EMAIL PROTECTED] wrote:
 
  I did just need to change a single line in ltmain.sh which
  enabled me afterwards to actually *build* a dll.
 
 Looks like you were not using -no-undefined when creating the
 library.  This is required to build a DLL on MS-Windows.
 
correct. (sorry, had problems in other tests before libtool-1.4)
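For reference, a sketch of where that -no-undefined flag goes - the library name is taken from the pfe example, the rest is illustrative:

```
## Makefile.am -- -no-undefined asserts the library has no undefined
## symbols, which libtool requires before it will build a win32 DLL
lib_LTLIBRARIES    = libpfe.la
libpfe_la_SOURCES  = ...
libpfe_la_LDFLAGS  = -no-undefined
```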

anyway - the interesting thing is about make install and
prog:.libs/prog.exe:bin_PROGRAMS:bin_PROGRAMS:install-binPROGRAMS

currently it seems that the automake rules assume the existence of
a prog.exe in the builddir, but it is not created by the current
libtoolized make processing. The current builddir/prog `head` says

# pfe - temporary wrapper script for .libs/pfe.exe
# Generated by ltmain.sh - GNU libtool 1.4 (1.920 2001/04/24 23:26:18)
#
# The pfe program cannot be directly executed until all the libtool
# libraries that it depends on are installed.

from that I'd say libtool knows that CC has created a pfe.exe but
the automake-rules/vardefs expect a builddir/pfe.exe too. A copy
of builddir's pfe to pfe.exe does indeed work. Who's to blame,
libtool or automake?

cheers,
-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)+++ y++ 5++X-




crosscompiled .exe (Re: crosscompiling dll linux-mingw32)

2001-04-26 Thread Guido Draheim

Guido Draheim wrote:
 
 from that I'd say libtool knows that CC has created a pfe.exe but
 the automake-rules/vardefs expect a builddir/pfe.exe too. A copy
 of builddir' pfe to pfe.exe does indeed work. Who's to blame,
 libtool or automake?
 

It is libtool's fault - even though the final link line really gets
called with a -o pfe.exe for the program pfe, libtool will create
builddir/pfe and builddir/.libs/pfe.exe. Does the natively built
libtool do sth. different here? I don't have a win32 environment
here at the moment, but I think not (but how do people make
dependent rules on these platforms?).

/bin/sh ./libtool --mode=link i386-mingw32msvc-gcc -D_export=__dllexport -Wall -W,-E -W,--warn-common -o pfe.exe -export-dynamic main-stdc.o libpfe.la -lkernel32 -L.libs -lpfe
generating import library for `libpfe-0-30-97.dll'
i386-mingw32msvc-dlltool --as=i386-mingw32msvc-as --dllname libpfe-0-30-97.dll --def .libs/libpfe-0-30-97.dll-def --output-lib .libs/libimp-pfe-0-30-97.a
i386-mingw32msvc-gcc -D_export=__dllexport -Wall -W,-E -W,--warn-common -o .libs/pfe.exe main-stdc.o -Wl,--export-dynamic -lkernel32 -L/usr/src/cvs/pfe-30/Release/i386-mingw32msvc/.libs .libs/libimp-pfe-0-30-97.a -Wl,--rpath -Wl,/programs/pfe
creating pfe.exe

Any argument why that is how it is?




Re: Auto-tools Win32 Borland C++ Builder

2001-05-23 Thread Guido Draheim

Martin Hollmichel wrote:
 
 Hi all,
 
 I think the great misunderstanding is that the autotools are
 not targeting real multiplatform development, but Unix centric
 distribution of (GNU) OpenSource Software.

well, they are not restricted to that, IMHO. They are
just not 100% ready-made; however, they can do quite some
multiplatform development even now - and indeed quite a bit more
than most (all?) other tools around.

 
 To do real multiplatform, multitools development the autotools
 are difficult to use (IHMO). Try to introduce other compilers then
 (GNU) C, (GNU) C++ Compilers (idl - Compiler, Javac, Resource Compiler,
 whatever compilers, other dependency generators and you
 going mad (in my experience).
 
 I was often asked why we (I'm responsible for the OpenOffice.org build
 environment on Unix, Windows and Mac platforms) don't use autotools.
 I say: it's right now not possible and doesn't make
 much sense for really big and multiplatform
 development. I would like to, but I can't, sorry.
 
 A few more examples:
 * changing a autotool file, then waiting for configure to write 1200
 makefiles.
 * Mixing up debug and non debug build, do both causes double compile
 time, double diskspace and x-time more RAM for the debugger. Imagine to
 need 10 GB for Openoffice debug build and more than 2GB RAM to start the
 result in a debugger.
 * try to build a four year old glibc on a two year old Linux system or
 vice versa. You have to begin to hack a configure.in.
 * using 30 year old preprocessor technology is not the most comfortable
 way of doing Software Configuration Management (SCM) and script
 development.

I support these arguments too. Probably others share them.

 
 Maybe I'm wrong but is there better bibliography than the info files and
 GNU Autoconf, Automake and Libtool book by Vaughan, Elliston, Tromey and
 Taylor?

ask the latter four ;-)

 
 Don't get me wrong, I think the autotools a great tools and I don't want
 to miss them, but for doing active software development it's not the
 optimal tool.

alternatives? I mean free, documented, mature, easy alternatives to be seen?

 
 Anybody who like to give hints to use autotools for OpenOffice.org ?
 
 Flame me,
 Martin Hollmichel

No way to do flames - you have spoken, and you know what
you are talking about. PAMHO. Well, it's just an
imperfect world, and things are left to be done and
discussed, may be just not on all autotools-groups. IMO,
the strongest argument goes about automake and the
interconnections in ac_output/subdirs-scheme. (doing
the reply to automake-list now)

Personally, the restriction of one Makefile.am for
every(!!) subdirectory has been given me quite a headache
in large projects too. At one point, I did start to
break up the makefile-system, subdirectories did contain
makefiles with the generic rules and they did include
a configured-makefile-snippet from somewhere near the
configure-script, somewhat like overriding just makevars.
reconfiguring is much faster, however dependencies were
not detected easily (still a subdirs scheme). Another
approach had been to use one makefile for multiple
subdirs, possibly a whole tree. automake however rejects
any source/target living in a subdir - sometimes I had
to force-feed it through an AC_SUBST that listed the
source files. IMO, automake is already quite a huge
tool in size, it's not that easy to get extended to
other compile-rules, and easily gets offended by
source/make-trees that are otherwise easy to handle
with plain autoconf-makefiles. One does just not get
the nice rule-definitions automatically copied by
automake.

Anyway, Axel/Michel, using autoconf and libtool is often
almost easy to get working inside the common older setups, 
however using automake in the tasks, well, it's probably 
not best suited, just as you stated, on the other hand, 
looking at the knowledge built into automake, then
I doubt we can see an automake-replacement soon. Who's
going to write it? There are some projects, e.g. about
genericmakefiles, but they have often other problems, 
lack a bit of development and the integration with
libtool developments. A search at http://freshmeat.net
with makefiles revealed about ten relevant projects,
e.g. prototypemakefiles, troll's tmake or rse's smake.

cheers, guido
  http://savannah.gnu.org/projects/ac-archive

 

Re: Auto-tools Win32 Borland C++ Builder

2001-05-23 Thread Guido Draheim


This is not restricted to borland compilers; there are quite
some C compilers on unix systems that quite some people would like
to see supported. However, most of the autotools developers
live in a quite gnu'ish/gcc'ish environment, and every now
and then a gmake'ish/gcc peculiarity slips in that will break
the next time another tool'ed environment is seen. I would
recommend using at least gmake and (ba)sh, which are both
available for win/dos, and having the complete gnu fileutils
is not a bad idea either. On this basis, look for any problems
related to assumptions about the gcc, and e-mail the resp.
people that are guilty ;-) Possibly, install a gcc-distribution
like cygwin32 or mingw32 (http://libsdl.org/Xmingw32), start
a normal gcc/automake setup, and run sh configure CC=bcc32.
I guess it will extremely stress your technical skills to
find the bits that assume CC=gcc and a unix filesystem, and 
furthermore I guess that such work will be very welcome as 
it will help supporting non-gcc compilers on other unix-platforms 
as well, minus the tricks about filepaths. Make sure to use
the newest autotools, the last months have seen quite some
improvements in supporting cygwin/mingw platforms which will
make things quite a bit easier for you.

sorry for the extreme addressing. However, maybe there
are some hints on whether people have already tried with
borland compilers. I did lately build an nmake makefile
for the free(!!) borland 5.5 compilers (which means commandline
only). However, for a pure C project, the compiler looks
a bit inferior w.r.t. gcc, so I'd switch to gcc anyway;
I don't know about C++... All that I've heard so far is that
people did stop at some point trying to get along with the
non-gcc compiler, since the gcc compiler suite is way good
enough for anything that is needed - which is the actual
reason why bcc support / msvc support is not answered in
an FAQ. Starting to use gcc on win/dos, well, again, this
is more a pedagogical endeavour...

Another scheme is of course the usage of the C++Builder
as a front-end, and use its project-files to generate 
a makefile(.am/.in) that can make it build in environments 
that don't have a borland compiler. Again, you'd have to
change away anything that is non-portable across compiler
sets, so one could start to use gcc's c++ anway, which
again does not need bcc support in the original setup. So
I guess, approaching autotools enthusiasts, it may come
out to the question why you're using borland compiler-chain
anyway as portability is best achieved with the gcc itself.
Again, partly a pedagogical endeavour (if not flames) that
should be limited to one mailing-list. Possibly libtool.

oops, got a bit long and winded. cheers, guido

Axel Thimm wrote:
 
 sorry for the excessive addressing, but this topic touches all auto-tools.
 
 I am in the process of convincing some people to move their Borland based
 source code development to proprietary free models. As you may guess, this is
 an extremely difficult task, requiring more pedagogical than technical skills
 (and unfortunately I myself am extremely Unix-centric, and I still have to learn
 about Borland's peculiarities).
 
 Nevertheless I want to give it a try. As a first step I'd have to move the
 configure/build/install infrastructure to auto-tools, then I'd attack the
 compiler non-portability (and by the end of this decade I might get a GNU
 compliant system ...).
 
 Searching the lists/net nothing helpful came up, but at least there also
 wasn't any evidence of any NoGo-theorems.
 
 Does anyone have already some experience in working with auto-tools and
 Borland from the command line? How do the maintainers/release managers here
 think about it? Would they be willing to accept patches for supporting
 commercial compilers ;)
 
 Regards, Axel.
 --
 [EMAIL PROTECTED]

-- guido Edel sei der Mensch, hilfreich und gut
31:GCS/E/S/P C++$ ULHS L++w- N++@  d(+-) s+a- h.r(*@)+++ y++ 5++X-




Re: Canonical way to conditionally add (a large amount of) sources to PROG_SOURCES?

2001-06-11 Thread Guido Draheim


Hi Tim,

you do confuse me a bit - it seems as if you just missed the
target by a micrometer; especially the middle one of your
examples looks very good:

 if MYCONDITIONAL
 OPTIONAL = lotsasource.c lotsayacc.y
 else
 OPTIONAL=
 endif
 
 foo_SOURCES = $(REGULAR) $(OPTIONAL)

The only thing that I can see is that you did not yet make
a call to AM_CONDITIONAL (??) - that is the one that sets the
two AC_SUBSTed vars called MYCONDITIONAL_FALSE and its sister
MYCONDITIONAL_TRUE, and these are probably the names that
are warned about later. Just a guess anyway.

So, indeed use the if...endif marks, and go along with a call
to AM_CONDITIONAL in the configure.in. It is documented in
the automake docs - look for a section called
`Conditionals`; to quote from there:

|   Automake supports a simple type of conditionals.
|
|   Before using a conditional, you must define it by using
|`AM_CONDITIONAL' in the `configure.in' file (*note Macros::.).  The
|`AM_CONDITIONAL' macro takes two arguments.
|
|   The first argument to `AM_CONDITIONAL' is the name of the
|conditional.  This should be a simple string starting with a letter and
|containing only letters, digits, and underscores.
|
|   The second argument to `AM_CONDITIONAL' is a shell condition,
|suitable for use in a shell `if' statement.  The condition is evaluated
|when `configure' is run.

and therefore, if you used AM_CONDITIONAL(USE_OPTIONAL,xxx), then use

if USE_OPTIONAL
OPTIONAL = lotsasource.c lotsayacc.y
endif

foo_SOURCES = $(REGULAR) $(OPTIONAL)

Apart from the fact that this *should* work (unless you do other
tricky things), I don't like this scheme, and the same holds
for lots of others IMO - it isn't used very often. Sadly, other
tricks are simply not possible with stock automake since automake
will see the list of source files and generate a list of object files
- and it won't do that at make-time/configure-time; it will be
done at autotools-time. I call this a misfeature, but for your
problem, the current AM_CONDITIONAL scheme should be working.
HTH, cheers, Guido
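Putting the two halves together - a minimal sketch, assuming the configure option is called --with-optional (Tim's mail mentions a --with option but not its name):

```
# configure.in
AC_ARG_WITH(optional,
  [  --with-optional         build the optional parser sources])
AM_CONDITIONAL(USE_OPTIONAL, test "x$with_optional" = xyes)

# Makefile.am
if USE_OPTIONAL
OPTIONAL = lotsasource.c lotsayacc.y
endif
foo_SOURCES = $(REGULAR) $(OPTIONAL)
```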


Tim Van Holder wrote:
 
 Hi,
 
 I need to conditionally (based on a --with configure option)
 add a fairly large number (~50) of sources to foo_SOURCES.
 
 First I tried
 
 OPTIONAL=
 if MYCONDITIONAL
 OPTIONAL = lotsasource.c lotsayacc.y
 endif
 
 foo_SOURCES = $(REGULAR) $(OPTIONAL)
 
 but that didn't work; automake complains that OPTIONAL is already
 set in TRUE, which implies MYCONDITIONAL_TRUE.
 
 Moving it to
 
 if MYCONDITIONAL
 OPTIONAL = lotsasource.c lotsayacc.y
 else
 OPTIONAL=
 endif
 
 foo_SOURCES = $(REGULAR) $(OPTIONAL)
 
 didn't help; now I get three separate errors about am_foo_OBJECTS
 already being defined in a condition implied by another one.
 
 So I tried
 
 OPTIONAL = lotsasource.c lotsayacc.y
 
 if MYCONDITIONAL
 foo_SOURCES = $(REGULAR) $(OPTIONAL)
 else
 foo_SOURCES = $(REGULAR)
 endif
 
 which didn't give any warnings, but am_foo_OBJECTS is empty :-(
 
 What is the proper way of handling such a situation?
 (If it's in the manual, please point me to the correct
 chapter; a cursory examination revealed nothing much).




Re: non-libtool, cross-platform libraries

2001-09-25 Thread Guido Draheim

 David Carter wrote:
 
 I need to build a non-libtool dynamically-loaded library, on windows and on HP/UX, from c++ sources.
 
 This library needs to be built as foo.dll on windows, foo.so on HP/UX.
 
 I don't think I can use libtool, since the resulting dll/so needs to be used by non-libtool-aware vendor software (iPlanet's webserver, to be precise).
 
 My maintainer platform is Cygwin, just to make things a little more interesting. However, the windows dll is built with -mno-cygwin, and linked against the mingw c++ std library.
 
 I've stumbled over the naming of the primary to use. I can't use:
 
 lib_LIBRARIES= foo.dll
 foo_dll_SOURCES= foo.cpp
 
 for example. And automake will not accept a variable in place of the dll, so I can't use @DL_EXT@ or similar.
 
 Any suggestions on how to do this within the automake/autoconf framework? I think it could be made to work using AM_CONDITIONAL, and repeating the entire lib_LIBRARIES for foo.dll & foo.so, but that doesn't feel right.
 

use automake and lib_LTLIBRARIES = foo.la,
then read the section about .la libraries - this is a linker script
understood by libtool, which in turn will instruct it to build a .dll
on windows and a .so on most unix platforms. Oh, and read the stuff
about -module -avoid-version for anything that should be installed as
a module. (damn, one day I have to write that dllmaker guide)
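A minimal sketch of that advice (the name foo is from David's mail; the flag meanings are noted inline):

```
## Makefile.am -- build foo.dll on windows, foo.so on HP/UX
lib_LTLIBRARIES = foo.la
foo_la_SOURCES  = foo.cpp
## -module:        a dlopen-able module, no `lib' prefix required
## -avoid-version: install plain foo.so/foo.dll, no version suffix
## -no-undefined:  declare no undefined symbols (needed for win32 DLLs)
foo_la_LDFLAGS  = -module -avoid-version -no-undefined
```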

HTH, and please don't confuse the libtool lib-linker tool with the
libltdl dl-loader library that is shipped and supported by libtool.
On win32 and hpux, libltdl would just wrap the native (!!) dl-load
facility that libtool builds the objects for - being .dll/.so files.

I wonder whether there haven't been automake examples for that webserver's
module stuff. Or didn't you look anyway?
cheers,
-- guido Edel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: Need help with problem...

2001-09-25 Thread Guido Draheim


Use the easy way.
The maintenance problems of the hard way have no real benefit over doing
some renamings.

And if there are other reasons that you are not supposed to
rename something, well, then you could use the rename trick -
that is, you add the following two pattern rules to your makefile:

%-coo.c : %.c
	cp $< $@

%-cpp.cpp : %.cpp
	cp $< $@

and make the link target dependent on the objects one-coo.o and one-cpp.o.

In automake-speak:

lib_LTLIBRARIES = libtest.la 
libtest_la_SOURCES = one-cpp.cpp one-coo.c

%-cpp.cpp : %.cpp
	cp $< $@

%-coo.c : %.c
	cp $< $@

... which will instantly compile with the following configure.in:
AC_INIT(Makefile.am)
AM_INIT_AUTOMAKE(test,1.0)
AC_LANG_C
AC_PROG_CC
AC_PROG_CXX
AM_PROG_LIBTOOL
AC_PROG_INSTALL
AC_OUTPUT(Makefile)


HTH, cheers, Guido

Eric Lemings wrote:
 
 Hello all,
 
 Suppose you have the following Makefile.am.
 
 lib_LTLIBRARIES = libfoo.la
 libfoo_la_SOURCES = \
 one.c \
 one.cpp \
 two.c \
 two.cpp
 
 I need the library to contain both the C object files and the C++ object files.  
There are two ways that I can think of to do
 this: the easy way and the hard way.  The easy way would be to rename either the C 
or the C++ source files so that they do
 not have the same basename and thus the object files would have different names.
 
 The hard way would be to add additional variables and rules to Makefile.am so that 
the object files use different suffixes.
 Suppose you use .co, .clo, .o, and .lo suffixes for the static C object file, shared 
C object file, static C++ object file,
 and shared C++ object file respectively.  Using the latest versions of Autoconf, 
Automake, and Libtool, you would first need
 implicit rules to create the .co and .clo files.
 
 .SUFFIXES: .c .clo .co .lo .o .obj
 
 .c.co:
 source='$<' object='$@' libtool=no \
 depfile='$(DEPDIR)/$*.Po' tmpdepfile='$(DEPDIR)/$*.TPo' \
 $(CCDEPMODE) $(depcomp) \
 $(COMPILE) -c `test -f $< || echo '$(srcdir)/'`$< -o $@
 
 .c.clo:
 source='$<' object='$@' libtool=yes \
 depfile='$(DEPDIR)/$*.Plo' tmpdepfile='$(DEPDIR)/$*.TPlo' \
 $(CCDEPMODE) $(depcomp) \
 $(LTCOMPILE) -c -o $@ `test -f $< || echo '$(srcdir)/'`$<
 
 Now you'd have to link the C object files with the new suffixes into the target 
shared library.
 
 libfoo_la_COBJECTS = $(libfoo_la_OBJECTS:.lo=.clo)
 
 libfoo.la: $(libfoo_la_COBJECTS) $(libfoo_la_OBJECTS) $(libfoo_la_DEPENDENCIES)
 $(LINK) $(libfoo_la_LDFLAGS) $(libfoo_la_COBJECTS) $(libfoo_la_OBJECTS) 
$(libfoo_la_LIBADD) $(LIBS)
 
 So how would the static library get built?  Did I get the shared library parts right?
 
 Thanks,
 Eric.

-- guido Edel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: Where to place include files

2001-09-28 Thread Guido Draheim


I never had problems with letting header and source files
be in the same subdir which has the name of the project itself.
Instead it gets easier to build different-named subprojects
together at the same time. W.r.t. a prodesk, I'd simply
bind all *.h and *.c into a subdir prodesk/prodesk; there
is just one SUBDIRS = prodesk in the toplevel prodesk/ dir,
and the subdir prodesk/ makefile contains just -I$(srcdir)/..

btw, that way it's damn easy to have a gdk/ and gtk/ subdir
side-by-side under the same toplevel makefile and get to the
headers from each subproject - there's  really no need to split 
them into different tarballs/rpms for distribution. But that
may be a different story...
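Spelled out for the prodesk example, the layout amounts to something like this (a sketch; INCLUDES is the automake spelling of the -I flag mentioned above, and the file names are illustrative):

```
# prodesk/Makefile.am          -- toplevel, just descends
SUBDIRS = prodesk

# prodesk/prodesk/Makefile.am  -- headers and sources side by side
INCLUDES = -I$(srcdir)/..
lib_LTLIBRARIES = libprodesk.la
libprodesk_la_SOURCES = string.c string.h
# install the headers under $(includedir)/prodesk so that
# `#include <prodesk/string.h>` works in-tree and when installed
prodeskdir = $(includedir)/prodesk
prodesk_HEADERS = string.h
```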



Julio Merino wrote:
 
 Hi all,
 
 I'm wondering about how to organize the header (.h) files on my project.
 The structure is like the following: the directory prodesk, which is
 the toplevel directory, contains a subdirectory named core, and another
 one named include, both hanging from the topdir.
 
 In my header files, I need to do some #include's of other .h files I've
 written. These files are all placed in prodesk/core subdirectory.
 I want to put these files when they are installed on
 /usr/include/prodesk. And here, the problem arises.
 
 I include headers from headers like: (for example)
 #include prodesk/string.h
 
 As you may notice, there is the prodesk/ directory preceding the header
 file, so this will work when it gets installed.
 
 But, to solve the include problems because missing files if the package
 is not yet installed I've done the following. I've created a subdirectory
 in prodesk/include, that is called prodesk itself. So, I try to shadow
 the real include's dir layout.
 
 Then, I've populated that directory with symlinks:
 (wherever I am)/prodesk/include/prodesk $ ln -s ../../core/*.h .
 
 This way, when building the library, it can include the files fine, and
 when is installed I guess that it will work fine.
 
 Do you think this is a logical way to do this? Or is this weird?
 
 Thanks! (And sorry for this so long message)
 
 --
 Make a better partition table: http://www.jmmv.f2s.com/ept
 
 Julio Merino [EMAIL PROTECTED] ICQ: 18961975
 
   


-- guido Edel sei der Mensch, hilfreich und gut
GCS/E/S/P C++$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: How to add a directory with datafiles

2002-02-22 Thread Guido Draheim

*RTFM*

xyzdir = ${prefix}/games/blabla/xyz
xyz_DATA = myfile.jpg yourfile.wav
EXTRA_DIST = ${xyz_DATA}

Sander van Geloven wrote:
 
 Hi,
 
 Can anyone give me a simple example of how to add a directory called xyz with
 data
 files to an RPM. I'm using automake and suppose that I have to alter
 Makefile.am, configure.in and blabla.spec.in
 
 It concerns a game (for Mandrake Linux) so i want the files in
 /usr/share/games/blabla/xyz
 
 But how do I do this exactly?
 
 Thanks,
 
 Sander

-- guido http://freespace.sf.net/guidod




Re: How to add a directory with datafiles

2002-02-22 Thread Guido Draheim


oh, and don't forget to list the directory in the .spec file;
otherwise, see you on news:alt.os.linux.mandrake ;-)

%files
   %_prefix/games/blabla/xyz/*

Guido Draheim wrote:
 
 *RTFM*
 
 xyzdir = ${prefix}/games/blabla/xyz
 xyz_DATA = myfile.jpg yourfile.wav
 EXTRA_DIST = ${xyz_DATA}
 
 Sander van Geloven wrote:
 
  Hi,
 
  Can anyone give me a simple example of how to add a directory called xyz with
  data
  files to an RPM. I'm using automake and suppose that I have to alter
  Makefile.am, configure.in and blabla.spec.in
 
  It concerns a game (for Mandrake Linux) so i want the files in
  /usr/share/games/blabla/xyz
 
  But how do I do this exactly?
 
  Thanks,
 
  Sander
 
 -- guido http://freespace.sf.net/guidod

-- guido http://freespace.sf.net/guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: RFC: ./configure or ./config.status --clean

2002-04-04 Thread Guido Draheim

Peter Eisentraut wrote:
 
 Akim Demaille writes:
 
  What I'm doing now is buying my freedom.  The freedom to extend
  Autoconf without 1. requiring from the rest of the world that they
  adjust their distclean rules, 2. requiring that Automake folks release
  a newer Automake etc., not to mention that it needs 1. writing
  documentation, 2. telling the people to read it.
 
 All of these are good goals, but there are at least three other ways to
 achieve them:
 
 1. _AC_EXTRA_CLEAN_FILES([configure.lineno autom4te.cache])
 
To be traced by automake.
 
 2. AC_SUBST(ac_extra_clean_files, [configure.lineno autom4te.cache])
 
To be added to DISTCLEANFILES in automake-generated makefiles.
 

AC_SUBST(ac_extra_clean_files, [configure.lineno])
AM_ADD(DISTCLEANFILES,ac_extra_clean_files)

Makefile.am:
   (nothing special)
Makefile.in:
   DISTCLEANFILES = @ac_extra_clean_files@

good??




Re: lex & yacc with C++ projects

2002-04-26 Thread Guido Draheim

Robert Collins wrote:
 
  -Original Message-
  From: Guido Draheim [mailto:[EMAIL PROTECTED]]
  Sent: Friday, April 26, 2002 8:57 PM
  To: Robert Collins; [EMAIL PROTECTED]
  Subject: Re: lex & yacc with C++ projects
 
 
  Robert Collins wrote:
  
   It would be nice to be able to tell automake that we want
   to compile the output of lex and yacc with g++, not gcc.
   (this is for C++ projects).
 
  rename mysource.l -> mysource.ll
  rename mysource.y -> mysource.yy
 
  and that's it. Were the manual pages confusing about that?
  Or is there another problem that is not obvious to me...
 
 Yes, the C++ output from the tools is incorrect for what we are doing.
 (It depends on libstdc++). So all we want is the C output, and then
 compile with C++.
 

hmmm. even though I need some enlightenment on what's wrong with a
libstdc++ dependency for a c++ compiled source - so your project
uses c++ files without libstdc++ and you want to include a parser,
but that would bring you an unwanted lib dependency?

anyway, since I'm thinking about it, I am wondering if the
following could work - what do you think about such rules...

.lo.tab.c :
	$(LTCXXCOMPILE) -c $<

.o.tab.c :
	$(CXXCOMPILE) -c $<

... or does that not work with `make`? hmmm, just wondering...




Re: installing config.h

2002-09-04 Thread Guido Draheim

Schleicher Ralph (LLI) wrote:
 
 Waldemar Rosenbach [EMAIL PROTECTED] writes:
 
  But what is the best way to deal with the platform depend stuff in the
  interface?
 
 A platform independent interface!
 

among the problems of ac_create_config_h:
* some ac-checks will check the compiler, not the system
   and some of these are dependent on cflags.
* some ac-checks only target the configuration at build-time
   after installing it, packaging it (into an rpm), and
   copying it to another system, the defs do not hold anymore
   due to missing extension packages.
* some ac-checks define missing things, most prominently the
   ones for off_t and const.

how to handle them:
* place an pkg-conf.h along with the generated pkg-config.h,
   which includes the latter, and all the other headers
   include pkg-conf.h. This pkg-conf.h can include a lot
   of things to get platform independence - yet it turns out
   that this pkg-conf.h is _magnitudes_ easier to write if you
   can start off with a generated pkg-config.h
* just be aware of build-time of the original package and the
   time that third-party software includes them. Perhaps use a 
   magic define for it. It is of _great_ help to have the 
   build-time config of your library automatically coded in an 
   installed header file, far better than a pkg-config-cflags 
   to push them into a third party compile. Verrry useful.
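A sketch of that wrapper-header scheme - pkg-config.h stands for the generated config header; every identifier here is illustrative:

```c
/* pkg-conf.h -- hand-written wrapper installed next to the
 * generated pkg-config.h; all other headers include this one */
#ifndef PKG_CONF_H
#define PKG_CONF_H

#include "pkg-config.h"  /* the generated build-time defines */

/* smooth over defines that only describe the build machine */
#ifndef HAVE_OFF_T
typedef long off_t;      /* substitute for a missing off_t */
#endif

/* magic define: third-party code can tell it is seeing the
 * installed build-time configuration of the library */
#define PKG_BUILDTIME_CONFIG 1

#endif /* PKG_CONF_H */
```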

what's not a problem of ac_create_config_h:
* installing for multiple builds, or the problem of some
   local build config.h
* lib dependencies needed to be pushed through to third party
   software, as per means of a pkg-config-libs.





Re: proposal to fork the build-tools projects

2002-10-13 Thread Guido Draheim

Tom Lord wrote:
 are already installed.  Call this set the bootstrap packages.
 
 Let's then formally identify a _subset_ of the bootstrap packages,
 which are those GNU tools that other (non-bootstrap) GNU packages are
 allowed to depend on for the build process.  For example, GNU Make
 would probably be in this subset, as would a GNU shell.  Call this set
 the build environment packages.
 

insert your favourite scripting language. The current automake does
contain perl code to scan makefile syntax. Just expand it - so it
does not generate code to be executed later but has it executed
immediately. That's enough to identify bootstrap code: perl.

As for gnu make - I know some build management tools that export
their own version of `make`. So, do not rely on gnu make. And anyway,
if you have gnu make then you don't need automake in most cases.

I guess this discussion is void - it is not about forking. It is
about creating your own set of buildtools and trying to reach a
level of maturity like the one we see in the autotools today. There are
other attempts, maybe you want to join them in their quest?






Re: Compiler flags

2002-11-10 Thread Guido Draheim
This is an autoconf.at.gnu.org  question...

Michael Lemke wrote:

Today I've been trying to learn automake & autoconf.  Something I can't
figure out although it is the main reason I am investigating automake:

How do I set compiler flags based on the compiler?  For example, g77 
requires -fdollar-ok but Sun Fortran won't like that.  How can I handle
that?  How much does automake know about such problems?

Thanks for any suggestions,
Michael



check at configure-time if the compiler supports the option.
Just try_compile and do not use the option if the test fails. That's it.
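The heart of such a check can be sketched in plain sh - this is the idea behind a try-compile macro, not the macro's actual text, and `try_flag` is an invented helper name:

```shell
# try_flag COMPILER FLAG - print "yes" if COMPILER accepts FLAG when
# compiling a trivial program, "no" otherwise
try_flag () {
  echo 'int main(void){return 0;}' > conftest.c
  if "$1" $2 -c conftest.c -o conftest.o 2>/dev/null
  then answer=yes
  else answer=no
  fi
  rm -f conftest.c conftest.o
  echo $answer
}
```

A configure check built on this would append the flag to FFLAGS (or CFLAGS) only when the answer is yes, so Sun Fortran simply never sees -fdollar-ok.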

http://ac-archive.sf.net/guidod/ac_check_cc_opt.html







Re: how to use multiple automake's at the same time?

2002-12-03 Thread Guido Draheim
Lukas Österreicher wrote:

Hi there.

I've been having trouble compiling software generated with automake 1.5 (or
whatever
except 1.4) on my redhat 7.2 system. I have tried upgrading to automake 1.5
but that doesn't work nicely - I have damaged my system that way, and I would
not be able to compile automake 1.4 sources then. I have read a thread about
installing automake 1.4 and 1.5 at the same time and letting it somehow choose by
the source which to use, and I think this is already available on redhat 8,
but I've had other troubles with that redhat version, so I prefer to use 7.2.

So what I ask is:
How to install automake 1.4 and 1.5 at the same time on a redhat 7.2 system
(quite a few up2date packages installed already) in the described manner so I
dont have to worry about version incompatibilities anymore? (If not else
possible, by choosing the correct version by setting the right PATH).



I've hacked a script that does the trick for me - as I want to use different
combinations of autoconf libtool and automake. The toplevel makefile has
a magic var-set line, so the start of my Makefile.am usually looks like:

AUTOMAKE_OPTIONS = foreign
AUTOTOOL_VERSION = autoconf-2.52 libtool-1.4.2 automake-1.5

Then I call a script autotools that looks into the Makefile.am and modifies
the $PATH accordingly, not much unlike the description about stow, the
combinations are compiled with separate $prefix paths. The helper script and
all the combinations are ready-made as rpm files, just pick them up and
install.
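The version-picking part of such a wrapper can be sketched like this (an illustration only, not the actual script; the /opt/<tool-version>/bin layout is an assumption):

```shell
# autotool_dirs MAKEFILE.AM - read the AUTOTOOL_VERSION line from the
# given Makefile.am and print one per-version bin directory per tool
autotool_dirs () {
  for v in $(sed -n 's/^AUTOTOOL_VERSION *= *//p' "$1"); do
    echo "/opt/$v/bin"
  done
}
```

The wrapper would then prepend these directories to $PATH before invoking aclocal, autoconf and automake.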

You can find everything in the download section of http://ac-archive.sf.net
in the autotools section - here's the direct link:
http://sourceforge.net/project/showfiles.php?group_id=32081

They are a bit dated, I did not have time to rebuild the rpms with the
newest breed of autoconf and automake generations. And actually, I'm
currently waiting for Mandrake to fix a severe bug in bash before doing
interesting things (sorry to alexandre, the bug quite definitely affects
the ac-archive rpm making, so no good time to roll a new release).

cheers, guido






Re: how to use multiple automake's at the same time?

2002-12-04 Thread Guido Draheim
Lukas Österreicher wrote:

One other thing I just found:
autotools-for-ac.2.13.lt.1.4.2.am.1.5-2sfnet.i586.rpm
depends on perl-base, which is a mandrake rpm.
This probably means that your tools are useless for
distributions other then mandrake - unless you know
a way how to fix that.



$ grep perl /usr/share/auto/autotools.spec.in
Requires: m4, perl

It seems the `rpm` did map `perl` into `perl-base`.
Actually, the mdk rpm'er does denote a lot of other
requirements as well, obviously some automatism
at work (e.g. libc.so.6 *aargh*)

Actually, if you want multiple version of autotools-for
then you need to --force the installation anyway, so
just do so. Remember, these are my personal scripts
and they have no relation to the autotools group other
than me being on these mailing lists.

keep me informed if it works anyway, okay ;-)
have fun, guido






Re: per-file compiler flags?

2002-12-23 Thread Guido Draheim
Markus Werle wrote:

Hi!

Consider the following case:

File VeryImportant.C contains code that really needs some
optimization available. But all other files could (at least during development)
be compiled without optimization, for the sake of not drinking too much coffee
due to long compilation times.

Is there a convenient way to achieve the following behaviour:

If (gcc is the compiler)
   Add to compilation of VeryImportant.C
   the optimizer flags: CPPFLAGS+=-O9 -finline -funroll-loops
else if(intel C++ is the compiler)
  Add the optimizer flags: CPPFLAGS+=-ip -unroll
endif



This is a split question.
(a) per file compile flags:
   - it's a FAQ, and later automake supports it natively by
 recognizing per-target _CFLAGS specifications; in earlier
 versions you would write an explicit rule
 file.o : file.c
$(COMPILE) .
(b) special optimization flags
   - this should be addressed in an autoconf check within
 your configure.ac - push it into AC_SUBST(MYFLAGS)
 and add it to rule (a) like
  $(COMPILE) @MYFLAGS@
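Put together in a Makefile.am, the per-target variant of (a) plus the substituted flags of (b) might look like this (a sketch - the program name is invented, and MYFLAGS is assumed to come from an AC_SUBST in a configure check as described):

```makefile
bin_PROGRAMS = mytool
mytool_SOURCES = main.C VeryImportant.C

# (b) optimizer flags chosen at configure time, left empty when the
#     compiler does not support them; applies to this target only
mytool_CXXFLAGS = $(AM_CXXFLAGS) @MYFLAGS@
```

Note that per-target flags apply to all sources of that target; truly per-file flags still need an explicit rule as in (a).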

Supporting special compile switches for gcc vs. non-gcc compilers
has become an interesting topic in the autoconf-archive, there
are now a few of them prepackaged available.

For there were different approaches available to check for the
different styles of compilers, I'm currently testing a method
derived from the variants I've seen. Perhaps you want to take
it as a guideline for your own ac-macro - and perhaps you like
to have it added to the ac-archive as well.

http://ac-archive.sf.net/guidod/ax_cflags_warn_all.html
http://ac-archive.sf.net/guidod/ax_cflags_no_writable_strings.html

-- cheers, guido






re: no-version / Re: automake not seeing AM_INIT_AUTOMAKE

2002-12-24 Thread Guido Draheim
jlm wrote:

I'm using automake version 1.5. I'm getting an error with automake, and
I can't figure out why. When I run automake, I get this error:
automake-1.5: configure.ac: `AM_INIT_AUTOMAKE' must be used

But I am using AM_INIT_AUTOMAKE, here's the top of my configure.ac file:
#Process this file with autoconf to produce a configure script.
AC_INIT(libgtkbonus, 0.1, [[EMAIL PROTECTED]])
AC_PREREQ([2.52])
AC_CONFIG_SRCDIR([src/trough.cpp])
AM_INIT_AUTOMAKE([no-define])
AM_DISABLE_STATIC
AM_PROG_LIBTOOL

If anyone can enlighten me to the true meaning of this erroneous error
message, I would be grateful. I need to ensure that automake does not
define VERSION, since it is used by a library that I use (hence the
no-define).



am_init_automake takes two arguments, doesn't it...

btw, what do you mean by "is used by the library"? maybe the true
solution would be on a slightly different track...








Re: non-installed shared libraries

2003-01-16 Thread Guido Draheim
Gav Wood schrieb:

hi,

sorry if this email is just me being ignorant but i can't find an easy way of 
doing what i need - perhaps you could help.

i'm using automake for a project that implements a plugin system. it has a 
small, non-installed cradle (example) program for testing the library that it 
builds. because it's a plugin system it needs two other small non-installed 
shared libraries be built, to act as plugin examples for the test program.

i can build the non-installed program well using rules:

noinst_PROGRAMS = example
example_SOURCES = example.cpp
example_LDADD = libsucplugin.la

i can then build the two plugins as non-installed static libraries using 
rules:

noinst_LIBRARIES = libselfcat.a libunderline.a

however, they need to be shared. if i try using libtool:

noinst_LTLIBRARIES = libselfcat.la libunderline.la

i end up with two .la files and no linkable shared libraries.

any ideas?


I never had any problems with making plugin modules, nor with
testing them before installation. First of all, I wonder why
you don't use the -module flag of libtool; second, I wonder
why noinst_; and third: what is in your .libs subdir, and
does the (text-type) .la file contain anything suspicious that
could point to a misconfiguration (like using AC_DISABLE_SHARED
in the configure.ac or some such - just doublechecking). And
last but not least, if this turns out to be a bug it would be
worth knowing the version numbers of the autotools.

HTH, cheers, guido







Re: CXXFLAGS and linking

2003-01-23 Thread Guido Draheim


William S Fulton schrieb:

Guido Draheim wrote:


William S Fulton schrieb:



I see that CXXFLAGS and AM_CXXFLAGS gets passed to the linker 
($CXXLINK). This doesn't seem correct as the C++ flags aren't 
necessarily appropriate for linking. This isn't consistent with the 
per-program _CXXFLAGS which do not get passed to the linker. Is 
this all as intended?


your version of autotools? (autoconf, automake, libtool)


Sorry, the latest... Automake 1.7.2, Autoconf 2.57, m4 1.4 (I'm not 
using libtool).

Perhaps to make it clearer, the following simple test case demonstrates 
the problem:

---
configure.in:
---
AC_INIT([test], [1.0])
AC_PREREQ(2.57)
AC_CONFIG_SRCDIR([foo.cxx])
AM_INIT_AUTOMAKE
AC_PROG_CXX
AC_CONFIG_FILES([ Makefile ])
AC_OUTPUT

---
Makefile.am:
---
AUTOMAKE_OPTIONS = foreign
AM_CXXFLAGS =-DSOMETHING
test_SOURCES = foo.cxx
bin_PROGRAMS = test


---
foo.cxx:
---
#include <iostream>
using namespace std;

int main()
{
cout << "hi!" << endl;
}


The linker output is then:

g++ -DSOMETHING -g -O2   -o test.exe  foo.o



hmmm, personally I'm using libtool support about everywhere,
but it is quite similar:

CCLD = $(CC)
LINK = $(LIBTOOL) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) \
$(AM_LDFLAGS) $(LDFLAGS) -o $@

however, I guess that libtool strips everything away that is
not understood by the actual linker as configured, right?

hth, cheers, guido






Re: CXXFLAGS and linking

2003-01-24 Thread Guido Draheim
Braden McDaniel schrieb:

Quoting Alexandre Duret-Lutz [EMAIL PROTECTED]:



Braden == Braden McDaniel [EMAIL PROTECTED] writes:



[...]

Braden I've reviewed 7.2 in the GNU Coding Standards, and it's
Braden not clear to me what you're referring to. Where exactly
Braden should I be looking?

The following sentence in the `Variables for Specifying Commands'
section (sorry I don't have section number in the info version).

| `CFLAGS' should be used in every invocation of the C compiler, both
| those which do compilation and those which do linking.



Ah, found it. Thanks.

I guess, then, that this is a bug in GNU make.



Perhaps not. You know, coding standards are not necessarily
correct forever, nor in sync with real-world evolution. The rule
might stem from the habit of using the (g)cc for linking in the gnu
world - but that is not overly general, and the default
gmake rules might point to that, as they were perhaps taken
from models like sun make. In fact, it might be that
one wants to use a _different_ tool for compiling than for
linking - picking both from the same toolchain happens to be
the normal case but not the general one - or am I wrong here?

Interesting to me: one might guess that libtool could get
away with using a different linker than the objects were
compiled with, interpreting arguments itself. And that
makes this problem to come up only in the presented
case, without a libtool wrapper at hand and neither using
gcc, right?

cheers, guido







Re: Unwanted autoheader call included in Makefile.in

2003-01-28 Thread Guido Draheim


Rafael Laboissiere schrieb:

When I use the macro AC_CONFIG_HEADERS, automake includes a rule in
Makefile.in to rebuild config.h.in through autoheader.  Now, I do not want
at all that this file gets touched by autoheader, even when I modifiy one of
its presumed dependencies aclocal.m4 and configure.ac.



I've been plagued with some implicit and unnecessary `rebuild`s in
some projects - and the solution is quite simple: in your configure.ac
script, reset the autotools variables to a colon - i.e.
   AUTOHEADER=:
or perhaps better yet, rewrite it as
   AUTOHEADER='echo autoheader ignored'
so you can still see when it would have been called.


Is there a way to disable this automake behavior, or am I asking for
something that is complete nonsense?

Thanks in advance for your help.



have fun, guido






aclocal: adding user-home parts to aclocal(/automake) ?

2003-02-15 Thread Guido Draheim
For one of my project there was a bug report regarding
the project.m4 file that is going to be installed. So
far I did just use $datadir/aclocal of course - as it
is uncertain whether some `aclocal --print-ac-dir`
can be written to. Using a prefix-related default
happens to be gnu-style anyway. And it works for a
/usr package just fine using the /usr-installed
automake package.

However, there is hardly a way for a user-home
installation to extend this scheme - in other words,
there is a sysadmin installation of automake/aclocal
in prefix=/usr and a homedir installation of an
add-on package in prefix=$HOME, leading to an m4-file
ending up in $HOME/aclocal/.

That m4 file will be invisible to aclocal, and there
is no way that a path can be added - the aclocal 1.7
does read $datadir/aclocal/dirlist but that one is
a sysadmin file as well, not a user-home related file.
So far, the aclocal tool does not read any $HOME/*rc,
and it does neither check any $ENV{ACLOCALDIR} from the
shell environment as that could extend the sysadmin's
dirlist with a userhome dirlist.

This is very inconvenient - personally, I do have a
large installation of packages in my $HOME in some
workareas around, which includes a lot of libraries
that build on each other. When doing a change with
one of these, one might want to reconf the aclocal
macros as well. That requires always running some
`aclocal -I $HOME/aclocal` simply because there is
no environment variable that would add this option
automatically, e.g. ACLOCAL_OPTIONS="-I $HOME/aclocal"

is this a bug or a feature?

It happens that there is no AUTOMAKE_OPTIONS environment
variable either to add default-options. In contrast, the
autoconf package knows a bunch of environment variables
to add/set directories that should be searched, and
the manual has an extra section about environment
variables.

-- guido  http://zziplib.sf.net
(btw, a $HOME/bin/aclocal script to shadow the sysadmin's
installation is the state-of-the-art AFAICS)
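Such a shadowing script can be as small as this (a sketch; REAL_ACLOCAL is an invented knob standing for wherever the sysadmin's aclocal actually lives):

```shell
# $HOME/bin/aclocal - wrap the real aclocal, always adding the
# user-home macro directory to its search path
aclocal_wrapper () {
  "${REAL_ACLOCAL:-/usr/bin/aclocal}" -I "$HOME/aclocal" "$@"
}
```

Installed as $HOME/bin/aclocal (with $HOME/bin early in $PATH), every invocation picks up the user-home macros automatically.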






Re: Patch to aclocal

2003-03-05 Thread Guido Draheim
Harlan Stenn schrieb:
I submitted a patch to handle this a while ago.  Don't know what happened to
it.
H

well, I think something like this is needed in many places around;
it doesn't quite matter how it is called or how it works. If
the current patch proposal works, along with a path-separator on
a $ACLOCALPATH, then it looks like a very good solution IYAM.
-- cheers, guido





Re: Generating man pages from other sources

2003-03-13 Thread Guido Draheim
Dalibor Topic schrieb:
Hi,

I'd like to have automake generate man pages from a
texinfo file (or DocBook, or some other format) using
a suitable tool. How could that be achieved?
I don't know what that question is asking for
actually - suffix rules are written just like
in plain make, nothing special for automake
(apart from registering the suffix, but that's
 documented in `pinfo automake`).
If you are unsure as to what master format would
be used, there are a set of options. For example,
look at the gnu server for help2man which is a
perl script that turns the `program --help`
screen into a man.1 page, this is quite common
with debian packages as their policy requires a
man page for each program.
Personally, I prefer a docbook reference page
(i.e. <!doctype reference>) and the xml
stylesheets from docbook.org along with
xmlto, which builds on top of libxml/xsltproc.
That tool can be fetched from redhat servers
and works fine with other distros as well.
The usage of xmlto hides much of the
complexity - just write a docbook reference
page and type
   xmlto man manpages.dbk
When it comes to API documentation (man 3 type)
then you could use any source extractor tool
that can generate reference-type docbook format
from source code comments. However, I'm generally
using just a few perl lines shipped along with
some of my projects to do it, it's not hard to
write such a thing (see http://zziplib.sf.net).
-- have fun, guido  http://freespace.sf.net/guidod





Re: Can I force 'make install' to depend on 'make test'?

2003-03-24 Thread Guido Draheim
install-data-local : check

That's the answer. In reality I have a last line in the configure.ac
that says echo '#' make && make check && make install - it does
remind a casual install-from-source guy to do a make check before
going to make install. It works pretty well, since many do a simple
mousing copy-n-paste of that line in the terminal. It also makes the
`make check` optional, possibly reducing time for your own
development cycles with the same source tree (but perhaps your current
`make check` is quick enough to be run always). Instead I have an
rpm spec along that uses `make check` always - the rpm build is also
a great way to check that off-develdir builds will succeed as well.
have fun, guido

Dr. David Kirkby schrieb:
Ronald Landheer-Cieslak wrote:

Though I really don't think it's a good idea, have you tried just adding
install : check to your Makefile.am?


No, I had not, I might try that - but see below.


Anyways, the reason I don't think it's a good idea is that it will break
cross-compiling, as your test programs will probably not run on the build
host in that case..


Can you suggest a better way? I'm open to suggestions, as I'm not convinced my current method is optimal at all. I had not even considered cross-compilation issues.

In fact, I would *much appreciate* any suggestions for a better method(s). I'm sure what I am trying to do is not the best way, but don't know of a better one. 

Basically I will have several source (.c) files that will create 10 platform dependent binary files (executables). All except one of these 10 binary files are designed to quickly produce bitmaps of simple shapes. (i.e. a circle inside a rectangle, a rectangle inside a circle ...)

Next I want to check the 9 binaries are indeed producing the correct bitmaps, so I check the md5 checksum of the bitmaps produced by the 9 binaries. So a test is basically like this (much simplified)

create_bitmap_for_rect_in_rect foo.bmp
MD5SUM=`md5 foo.bmp`
if [ "$MD5SUM" != "bdfb21053d9af7302bb587b1b89c113a" ] ; then
  exit 1
else
  exit 0
fi
If the md5 checksum of the bitmaps agree with what I expect, I can assume the 9 binaries are functioning properly.

After creating the bitmaps with these 9 executables, another program (the 10th binary, called 'atlc') will run a long cpu intensive numerical simulation based on the contents of each bitmap. The output of 'atlc' consists of 6 further bitmaps and a text file.

I was expecting for the output from 10th binary (atlc) to be useful it two ways.

1) I can check the checksum of the output files, confirming atlc works okay.
2) I can install some of the files produced by atlc, for the purpose of examples. 

Hence my dilemma. It seems sensible to me that 'make install' requires that 'make check' has already been run, but I'm open to suggestions of how best to structure this.

Should the tests just create files, check their checksum, then remove the files? Or is it better to leave the files around, so they can be installed as examples? Since I want to install these as examples and generating them is time consuming, it seems sensible to do it only once. 

Any help appreciated. 






Re: Problem with libtool

2003-06-18 Thread Guido Draheim


Marcus Bjurman schrieb:
Hi,

My project (http://www.nongnu.org/gcmd/) consists of two parts. One shared library and one ordinary executable linked against the library. The library gets built alright but when building the executable libtool doesn't respect the $prefix that was chosen at configure-time. What I mean is that the rpath that libtool uses at link-time always gets set to /usr/local/lib regardless of the $prefix. This obviously causes a problem after the install since the executable is linked against a library in /usr/local/lib that doesn't exist.

I checked the libtool script and saw that the variable used for rpath is libdir. Checking further I saw that all makefiles have libdir set up correctly to $(prefix)/lib.

So my question basicly is: where does libtool get the libdir variable from? And then what is the correct way of getting the correct value there?

BTW: I have checked other packages such as xmms that have the same basic layout with a library and an executable. They all work, so it's not a problem with the installation; it's rather a problem with me not being able to figure out why they work and my project doesn't ;)

Please CC replies to me

/Marcus

I did just see a similar problem, but it was related to re'configure'ing
the project sources with a different --prefix. It seems that the
libtool .la file is not updated with the new prefix; I had to `make clean`
before it got it right. Perhaps there's a similar thing about your stuff
here - that the linker script (.la) has an old prefix from an earlier build?
have fun,
-- guido  http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




OT: Re: please confirm / bitte bestaetigen

2003-07-04 Thread Guido Draheim


Thomas E. Dickey schrieb:
On Fri, 4 Jul 2003, Peter Simons' Anti-Spam-Tool wrote:


- English -

Because I receive several dozen spam messages each day, I installed a
small tool that will defer incoming mail message if it comes from an
address it sees for the first time. This is the case with the message
you sent me, I'm afraid.


perhaps someone should unsubscribe him

BTW, these tools don't help much anyway - it was fashionable in the last
months to use these tools and some spammers noticed that (hey, they get
replies!). Some do autoreplies on please confirm, some others choose
forged sender addresses for one-time spam mailings. At the moment it seems
there is only one good barrier, and spam assassin is even easier to install.
cheers,
-- guidohttp://ac-archive.sf.net
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X-




Re: Filenames containing blanks

2003-07-23 Thread Guido Draheim


Ralf Corsepius wrote:
Hi,

Simple question: Does automake support filenames containing blanks?

I fear, the answer is no:

# cat Makefile.am:
data_DATA = foo\ 1
# make DESTDIR=/tmp install
..
/bin/sh ./mkinstalldirs /tmp/usr/local/share
mkdir -p -- /tmp/usr/local/share
 /usr/bin/install -c -m 644 ./foo\ /tmp/usr/local/share/foo\
/usr/bin/install: cannot stat `./foo\\': No such file or directory
 /usr/bin/install -c -m 644 ./1 /tmp/usr/local/share/1
/usr/bin/install: cannot stat `./1': No such file or directory
make[1]: *** [install-dist_dataDATA] Error 1
..
Either I must be missing something (May-be I am quoting the blank
incorrectly?) or something is broken ...
# this is what we might expect...
$ for i in hello\ 1 2 ; do echo $i ; done
hello 1
2
# this is what is done for real...
v='hello\ 1 2' ; for i in $v ; do echo $i ; done
hello\
1
2
# changing any quotes does not help...
$ v='hello\ 1 2' ; for i in "$v" ; do echo "$i" ; done
hello\ 1 2
$ v="hello\ 1 2" ; for i in "$v" ; do echo "$i" ; done
hello\ 1 2
$ v="hello\ 1 2" ; for i in $v ; do echo $i ; done
hello\
1
2
the assignment to a variable before for-in might have
been introduced to avoid problems with an empty list,
which results in a syntax error on some platforms
(try `for i in; do`), but I've seen other solutions that
work just as well, like
for i in $(libs) : ; do test "$$i" != : || continue; ...
Ooops, I just had a look into am/data.am, and there is
the very comment which I've just guessed above:
## Funny invocation because Makefile variable can be empty, leading to
## a syntax error in sh.
@list='$(%DIR%_%PRIMARY%)'; for p in $$list; do \
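The sentinel variant mentioned above can be exercised in any POSIX shell; `join_list` is a toy helper invented for the demonstration:

```shell
# join_list WORDS... - concatenate the words; the trailing ":" keeps
# the for-in word list non-empty even when no words are given
join_list () {
  out=""
  for i in "$@" : ; do
    test "$i" != : || continue   # skip the sentinel
    out="$out$i"
  done
  echo "$out"
}
```

With no arguments the loop only ever sees the sentinel, so the body is skipped and no syntax error can occur.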
cheers,
-- guido  http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: Filenames containing blanks

2003-07-23 Thread Guido Draheim


Ralf Corsepius wrote:
On Wed, 2003-07-23 at 09:39, Guido Draheim wrote:

Ralf Corsepius wrote:

Hi,

Simple question: Does automake support filenames containing blanks?

I fear, the answer is no:

# cat Makefile.am:
data_DATA = foo\ 1
# make DESTDIR=/tmp install
..
/bin/sh ./mkinstalldirs /tmp/usr/local/share
mkdir -p -- /tmp/usr/local/share
/usr/bin/install -c -m 644 ./foo\ /tmp/usr/local/share/foo\
/usr/bin/install: cannot stat `./foo\\': No such file or directory
/usr/bin/install -c -m 644 ./1 /tmp/usr/local/share/1
/usr/bin/install: cannot stat `./1': No such file or directory
make[1]: *** [install-dist_dataDATA] Error 1
..
Either I must be missing something (May-be I am quoting the blank
incorrectly?) or something is broken ...
# this is what we might expect...
$ for i in hello\ 1 2 ; do echo $i ; done
hello 1
2


the assignment to a variable before for-in might have
been introduced to avoid problems with an empty list
which results in a syntax error on some platforms
(try `for i in; do`) but I've seen other solutions that
work just as well, like
for i in $(libs) : ; test $$i != : || continue;...


Or may-be ..

test -n '$(libs)' && for i in '$(libs)'; do ...
I like that one :-) it's about as short as the current way
to guard against empty lists, but I am sure you meant not
to use single-ticks around the for-in argument, did ye.. ;-)
or
if test -n '$(libs)'; then for ...
else true; fi
I might be wrong, but IIRC automake once had used something similar to
this.
You are wrong, AFAICS, I've looked into automake 1.4-p6 and it's
about the same as the latest incarnation:
## Funny invocation because Makefile variable can be empty, leading to
## a syntax error in sh.
@list='$(@[EMAIL PROTECTED])'; for p in $$list; do \


Ooops, I just had a look into am/data.am, and there is
the very comment which I've just guessed above:
## Funny invocation because Makefile variable can be empty, leading to
## a syntax error in sh.
@list='$(%DIR%_%PRIMARY%)'; for p in $$list; do \


Still, I'd vote for a change to pick up another syntax that
does use the resp. make-var as an unquoted item...
-- guido  http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: Filenames containing blanks

2003-07-23 Thread Guido Draheim


Alexandre Duret-Lutz wrote:
Ralf == Ralf Corsepius [EMAIL PROTECTED] writes:


 Ralf Hi,
 Ralf Simple question: Does automake support filenames containing blanks?
I guess nobody really bothered because Make itself uses blanks
as filename separators.  '\ ' seems to be a GNU extension, does
anybody knows of another implementation where it works?  (Do not
misread this as an objection, I'm just curious to know where it
can work.)
The '\ 's are not seen by make; they are interpreted by the shell
that is invoked for the make rule executions, right? And for data,
there is rarely an occasion where it occurs in a make rule header...
hmmm,
-- guido  http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: RFC: Building a Shared Library

2003-07-30 Thread Guido Draheim


Tim Van Holder wrote:
platform dependent parts present in the .libs subdirectory. 


(or _libs on platforms where filenames cannot start with a '.')

s|.libs|.libs/_libs|

:-)=)

btw, s|general concept|generalized concept| ?

-- guidohttp://AC-Archive.sf.net
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X-




Re: RFC: Building a Shared Library

2003-07-30 Thread Guido Draheim


Ralf Corsepius wrote:
On Wed, 2003-07-30 at 09:30, Guido Draheim wrote:

Just trying to get terminology to the point, note that developers
from other platforms will most probably have known the term
linker script, so let's expand on that knowledge without
driving away newbies.
FYI: I find the term linker script to be very confusing, because
gnu-ld uses the term linker script with a different meaning.
cf. info ld

I'd say it is similar in its meaning. Every linker uses another
linker script format; it is in no way standardized for a platform,
and it may even differ from one linker version to the next - unlike
object file formats or shared library binary formats. In its own
way, libtool represents a linker using its linker script to
write up a command that actually creates the final binary.
quoting from `info ld`:
 Every link is controlled by a linker script. This script is written
 in the linker['s] command language. [...]
 You may also use linker scripts implicitly by naming them as input
 files to the linker, as though they were files to be linked.
Of course, the actual information contained in these linker library
scripts is quite different, as they are different linker tools. 8-)
YMMV, ;-)
-- guido  http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ d(+-) s+a- r+@+++ y++ 5++X- (geekcode)




Re: vpath builds and include paths

2003-12-22 Thread Guido Draheim


Bob Friesenhahn wrote:
On Mon, 22 Dec 2003 [EMAIL PROTECTED] wrote:


However, if I want to build in a separate build tree (relying on
VPATH), then I try the following:
mkdir /build
cd /build
/test/configure
make
This attempts to build in this new /build directory, but during
compilation it cannot locate the header file, myproj.hpp, and the rest
of the build fails.  What do I need to do in order to tell automake
where this header is?  I've already tried using the absolute path
variables in the Makefile.am:
INCLUDE = -I$(abs_top_srcdir)/project2/sublevel/inc


Note that abs_top_srcdir calculation was broken in Autoconf 2.58.  It
is fixed in 2.59.
Rather than using INCLUDE you should use AM_CPPFLAGS.  For example

AM_CPPFLAGS = -I$(top_srcdir)/project2/sublevel/inc

*g* or override DEFS to let the local includes be found first always...
since the days when I had serious problems with includes I tend
to set up projects only ever with a prefix/my.h, which then needs only
-I$(top_srcdir) where top_srcdir/prefix/ exists. That needs to override
the default_includes as well, to cut away the -I. -I$(srcdir) settings.
Personally, I'd never use a plain myproj.h anymore, only ever myproj/inc.h now.
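In a Makefile.am, that convention might be sketched as follows (hedged: the directory name is invented, and emptying DEFAULT_INCLUDES relies on an automake-internal variable that may differ between versions):

```makefile
# headers live in $(top_srcdir)/myproj/, so sources say
#   #include "myproj/inc.h"
AM_CPPFLAGS = -I$(top_srcdir)
# cut away the implicit -I. -I$(srcdir) so only the prefixed form works
DEFAULT_INCLUDES =
```

This way an include can never accidentally pick up a same-named header from another directory.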
just my 2cent,
-- guidohttp://AC-Archive.sf.net
GCS/E/S/P C++/$ ULHS L++w- N++@ s+:a d(+-) r+@+++ y++ 5++X-




Re: Multiple definition of VERSION, PACKAGE and so on

2004-03-23 Thread Guido Draheim


Patrick Guio wrote:
On Mon, 22 Mar 2004, Guido Draheim wrote:


Patrick Guio wrote:

Dear all,
This is not really a bug but I wonder if you have any remedy to the
following problem. If you use autoconf/automake for several packages
which interact with each other and for each package you generate a
 ^^

configuration file config.h you might end up with redefinition warning
messages if config.h's are read together.
  ^^

... warning: PACKAGE_VERSION redefined
... warning: this is the location of the previous definition
... warning: VERSION redefined
... warning: this is the location of the previous definition
My concern is how to avoid this? Shouldn't these macro definitions be unique
for each package? Should there be a protection mechanism?
*How* do you let them interact with each other?
*Why* are config.h's read together?


What I mean by interact is that one package uses on another one :-)
I can give you an example. I have a package pack1 which I have encapsulated
in another one, pack2 (I mean that I have a subdirectory pack2 in pack1
and that configure.ac of pack2 has an AC_CONFIG_SUBDIRS(pack1) command).
I have chosen this architecture since pack2 (plasma simulator) builds on
top of pack1 (pde solver) and pack1 could be used on its own or for other
projects.
Until recently, I didn't use the command AC_CONFIG_HEADERS and I didn't
have any problem, since the macro definitions -DPACKAGE -DVERSION were
command-line options and put just once per compiler command.
Now I wanted to start using AC_CONFIG_HEADERS because I have too many
defs, so I have just a AC_CONFIG_HEADERS([pack1-config.h])
and AC_CONFIG_HEADERS([pack2-config.h]) for each package.
The pack1-config.h is included only if HAVE_CONFIG_H is defined and is
included in my definition file for the package (C++ typedefs for template
arrays (double/float), constant defs, dimensionality of the problem, MPI
use, FFTW use, HDF use, etc...). Each class declaration and definition
needs this file. The same is done in pack2, but some classes of pack2 use
public methods of pack1 classes. Therefore, in some class definitions of
pack2 I have to include the class declarations of some particular classes from
pack1. The result is that pack1-config.h and pack2-config.h are both included.
Now I don't really see how I can avoid that. But you may have suggestions?

As said before - it is a TOTAL ERROR to include config.h (here: pack1-config.h)
into header files made publicly available to outside projects (here: pack2).
As a solution, instead of the current usage, modify pack1/configure.ac to use
  AC_CONFIG_HEADER
  AX_PREFIX_CONFIG_HEADER([pack1-config.h])
Have a look at the resulting `pack1-config.h` - the resulting #define's
are now named slightly differently (the prefix, ye know). Wherever you have
been using autoconfigured #ifdef MYDEFs in your header files, change those
to #ifdef PKG_MYDEFs, i.e. the new (prefixed) define names. And all ye problems
will vanish. *plop*
That's it. Good luck, guido http://ac-archive.sf.net



Sincerely
Patrick




To answer a frequently asked question right away: do not install
config.h and do not #include it in public headers of a (sub)project.
Use ax_prefix_config_h instead to avoid problems with ac_define's
occurring in two autoconfigured header files.
btw, - autoconf people, maybe we should move the prefix-config
macro into main autoconf and/or reference it in the main documentation?



--
-- guido  http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ s+:a d(+-) r+@+++ y++ 5++X- (geekcode)




Re: Multiple definition of VERSION, PACKAGE and so on

2004-03-24 Thread Guido Draheim


Patrick Guio wrote:
On Tue, 23 Mar 2004, Guido Draheim wrote:


What I mean by interact is that one package uses on another one :-)
I can give you an example. I have a package pack1 which I have encapsulated
in another one, pack2 (I mean that I have a subdirectory pack2 in pack1
and that configure.ac of pack2 has an AC_CONFIG_SUBDIRS(pack1) command).
I have chosen this architecture since pack2 (plasma simulator) builds on
top of pack1 (pde solver) and pack1 could be used on its own or for other
projects.
Until recently, I didn't use the command AC_CONFIG_HEADERS and I didn't
have any problem, since the macro definitions -DPACKAGE -DVERSION were
command-line options and put just once per compiler command.
Now I wanted to start using AC_CONFIG_HEADERS because I have too many
defs, so I have just a AC_CONFIG_HEADERS([pack1-config.h])
and AC_CONFIG_HEADERS([pack2-config.h]) for each package.
The pack1-config.h is included only if HAVE_CONFIG_H is defined and is
included in my definition file for the package (C++ typedefs for template
arrays (double/float), constant defs, dimensionality of the problem, MPI
use, FFTW use, HDF use, etc...). Each class declaration and definition
needs this file. The same is done in pack2, but some classes of pack2 use
public methods of pack1 classes. Therefore, in some class definitions of
pack2 I have to include the class declarations of some particular classes from
pack1. The result is that pack1-config.h and pack2-config.h are both included.
Now I don't really see how I can avoid that. But you may have suggestions?

As said before - it is a TOTAL ERROR to include config.h (here: pack1-config.h)
into header files made publicly available to outside projects (here: pack2).


I would definitely like to hear how you handle the following situation
if you mean it is a TOTAL ERROR to include config.h into header files
that are made publicly available to outside projects.
As of now I have a pack1-defs.h header file containing definitions like:
#if defined(HAVE_CONFIG_H)
#include "pack1-config.h"
#endif
#if defined(FLOAT_FIELD)
typedef float pack1Real;
#elif defined(DOUBLE_FIELD)
typedef double pack1Real;
#else
#error macro FLOAT_FIELD or DOUBLE_FIELD must be defined
#endif
typedef blitz::Array<pack1Real,1> Array1dr;
typedef blitz::Array<pack1Real,2> Array2dr;
typedef blitz::Array<pack1Real,3> Array3dr;
typedef blitz::Array<pack1Real,DIM> Field;

I need to make this file publicly available to outside projects if I
don't want to redefine type Field for example. But to do so I have to
include pack1-config.h as well since definition FLOAT_FIELD or DOUBLE_FIELD
needs to be defined.
That's the way we handle definitions in the Blitz++ project as well so if
you have a better solution let me know!


as a solution, instead of current usage, modify pack1/configure.ac to use
  AC_CONFIG_HEADER
  AX_PREFIX_CONFIG_HEADER([pack1-config.h])


This does not work with autoconf-2.59/automake-1.8.3.

% autoreconf -vif

autoreconf: Entering directory `mudfas'
autoreconf: running: libtoolize --copy --force
Putting files in AC_CONFIG_AUX_DIR, `config'.
autoreconf: running: /usr/local/gnu/bin/autoconf --force
autoreconf: running: /usr/local/gnu/bin/autoheader --force
autoheader: error: AC_CONFIG_HEADERS not found in configure.ac
autoreconf: running: automake --add-missing --copy --force-missing
configure.ac:10: not enough arguments for AC_CONFIG_HEADERS
Still a new configure is generated. But then when running configure

./configure: line 1580: syntax error near unexpected token
`src/mudfas-config.h'
./configure: line 1580: `AX_PREFIX_CONFIG_HEADER(src/mudfas-config.h)'
It looks like AC_CONFIG_HEADER is enough (without the S). So you don't need
to have the whole keyword, but a long enough one to be unambiguous?
nope, they are different - I didn't know the *S word had been made to
actually require an argument. Originally, we had only the version without
an S, and only one config header file was expected to be used.
autoconf-2.13]$ grep AC_CONFIG_HEADER *.m4
acgeneral.m4:dnl AC_CONFIG_HEADER(HEADER-TO-CREATE ...)
acgeneral.m4:AC_DEFUN(AC_CONFIG_HEADER,
acgeneral.m4:dnl Support passing AC_CONFIG_HEADER a value containing shell variables.
autoheader.m4:define([AC_CONFIG_HEADER], [#

I have another remark. Is the use of AC_CONFIG_HEADER expected to have the
same behaviour as passing an argument on the line? (Because it is not
necessarily the same behaviour, as said in an earlier email)
using a macro name without ()-parentheses or with empty ()-parentheses
is equivalent. I do not know if that was the question, but let that be an
answer right here ;-)
* now, going back to earlier items in the mail
 ./configure: line 1580: syntax error near unexpected token
 `src/mudfas-config.h'
 ./configure: line 1580: `AX_PREFIX_CONFIG_HEADER(src/mudfas-config.h)'
you need to download the ax_prefix_config_header.m4 file and put the
content in your acinclude.m4 - the macro is not (yet) shipped with
core autoconf. The AX* hints for macros from

Re: How to setup an example programs subdirectory?

2005-01-02 Thread Guido Draheim
DIST_SUBDIRS = lib bin examples
SUBDIRS = lib bin @EXAMPLES@
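
The @EXAMPLES@ substitution above could be wired up in configure.ac roughly
like this (a sketch; the --enable-examples option name is an assumption):

```m4
AC_ARG_ENABLE([examples],
  [AS_HELP_STRING([--enable-examples], [build the example programs])])
AS_IF([test "x$enable_examples" = xyes],
      [EXAMPLES=examples],
      [EXAMPLES=])
AC_SUBST([EXAMPLES])
```

A plain make then skips examples/, while make dist still packages the
directory via DIST_SUBDIRS.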

Simon Perreault wrote:
 Hi,
 
 I have a question for which I haven't been able to find an answer on my own, 
 using the usual resources (manual, google, etc).
 
 My project uses automake and I want to have a directory containing example 
 programs. These programs should not be built on a simple make, but could be 
 built on a make examples directive. How can I handle that?
 
 Thanks!
 




Re: ax_prog_csc / Re: C# support for automake

2005-12-21 Thread Guido Draheim
Ouch, just saw that you wrote the original csharpcomp.sh. Well,
just consider it as an update from a current project ;-)

Guido Draheim wrote:
 
 Bruno Haible wrote:
 Guido Draheim wrote:
 create .NET wrappers for the linux dvb kernel api. It does
 work - getting libtool to compile a native shared library
 being called from a managed dll that imports symbols from it.
 Which are the command lines that you use for doing this? I'd like to
 understand which tools are used for which step, before thinking about
 Makefile variables and automake.

 
 There is absolutely no need to use autotools to create a unixish
 csharp wrapper pair of *.so/*.dll libraries. It just makes it so
 easy - with a few lines of autoconf/automake code we get a slew of
 portability tests to create the native library part via a
 configure-generated libtool.sh shell script. Just as well my
 autoconf macro creates a configure-generated shell script named
 csharpcomp.sh but that is just converting options for different
 csharp compilers that may be used (Microsoft, portableNET, Mono).
 
 If you do have your own build system then it is fairly easy to
 pick up the command line conversion in those shell scripts. The
 gcc usually just requires -shared and -l imports
gcc -shared *.c -lzip -o libproject.so
 and the mono compiler wants -target:Library with -reference:s
gmcs -target:Library -reference:Mono.Posix *.cs -out:Project.dll
 That's pretty much about it. If you do compile an example project
 then you will see the shell scripts (libtool / csharpcomp) logging
 the converted command lines to the terminal. Just check it out.
 
 have fun,
 (attaching my current csharpcomp.sh)
 -- guido    http://google.de/search?q=guidod
 GCS/E/S/P C++/$ ULHS L++w- N++@ s+:a d-++ r+@+++ y++ (geekcode)

-- 
-- guido    http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ s+:a d-++ r+@+++ y++ (geekcode)




Re: ax_prog_csc / Re: C# support for automake

2005-12-21 Thread Guido Draheim


Bruno Haible wrote:
 Guido Draheim wrote:
 create .NET wrappers for the linux dvb kernel api. It does
 work - getting libtool to compile a native shared library
 being called from a managed dll that imports symbols from it.
 
 Which are the command lines that you use for doing this? I'd like to
 understand which tools are used for which step, before thinking about
 Makefile variables and automake.
 

There is absolutely no need to use autotools to create a unixish
csharp wrapper pair of *.so/*.dll libraries. It just makes it so
easy - with a few lines of autoconf/automake code we get a slew of
portability tests to create the native library part via a
configure-generated libtool.sh shell script. Just as well my
autoconf macro creates a configure-generated shell script named
csharpcomp.sh but that is just converting options for different
csharp compilers that may be used (Microsoft, portableNET, Mono).

If you do have your own build system then it is fairly easy to
pick up the command line conversion in those shell scripts. The
gcc usually just requires -shared and -l imports
   gcc -shared *.c -lzip -o libproject.so
and the mono compiler wants -target:Library with -reference:s
   gmcs -target:Library -reference:Mono.Posix *.cs -out:Project.dll
That's pretty much about it. If you do compile an example project
then you will see the shell scripts (libtool / csharpcomp) logging
the converted command lines to the terminal. Just check it out.

have fun,
(attaching my current csharpcomp.sh)
-- guido    http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ s+:a d-++ r+@+++ y++ (geekcode)


csharpcomp.sh
Description: application/shellscript


/my/bin/aclocaldir

2006-05-23 Thread Guido Draheim
I did once use an 'acinclude' tool to help copying macros from my
personal ac-archive to the various projects. The name stems from
writing out to acinclude.m4. It was based on an old version of
'aclocal'. It had diverged too much to be worth updating.

In the meantime we find that 'aclocal' is happy to m4_include
macros from a local subdirectory m4/. So it is a better way
to populate this subdirectory from the ac-archive. This time
I am making it up as a patch over standard 'aclocal' - it just
adds a few options that imitate automake options -a and -c.
As a nicety, a tool named with a trailing 'dir' (like 'aclocaldir') will imply -a.

It works unexpectedly well for the small size of the patch.
I can add -I /my/ac-archive to get all the ax_* macros and if
I do forget about the path (or work on a system without one),
it will only update the automake macros being used, leaving
the extra ax-macros as they had been distributed. The standard
aclocal.m4 shrinks to a few lines.
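
The shrunken aclocal.m4 then boils down to little more than include lines
pointing into m4/ (an illustration; the macro file names are assumptions):

```m4
m4_include([m4/ax_prefix_config_h.m4])
m4_include([m4/ax_prog_csc.m4])
```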

have fun,
-- guido    http://google.de/search?q=guidod
GCS/E/S/P C++/$ ULHS L++w- N++@ s+:a d-++ r+@+++ y++ (geekcode)
--- /usr/bin/aclocal	2006-04-23 02:59:36.0 +0200
+++ /my/bin/aclocaldir	2006-05-24 03:02:21.0 +0200
@@ -59,7 +59,18 @@
 my $configure_ac;
 
 # Output file name.
-$output_file = 'aclocal.m4';
+my $output_file = 'aclocal.m4';
+
+# Output directory name.
+my $output_dir = 'm4';
+
+# TRUE if missing macro files should be installed.
+my $add_missing = 0;
+
+# TRUE if we should copy missing files; otherwise symlink if possible.
+my $copy_missing = 0;
+
+$add_missing = 1 if $0 =~ /dir$/;
 
 # Modification time of the youngest dependency.
 $greatest_mtime = 0;
@@ -455,6 +466,7 @@
 {
   my ($output_file, @macros) = @_;
   my $output = '';
+  my @outputfiles = ();
 
   my %files = ();
   # Get the list of files containing definitions for the macros used.
@@ -490,12 +502,47 @@
 	  # Otherwise, simply include the file.
	  $output .= "m4_include([$file])\n";
 	}
+  push @outputfiles, $file;
 }
 
   # Nothing to output?!
   # FIXME: Shouldn't we diagnose this?
   return if ! length ($output);
 
+  if ($add_missing || $copy_missing) 
+{
+  print STDERR "aclocal: writing to $output_dir\n" if $verbose;
+  for my $file (@outputfiles)
+	{
+	  my $fullfile = File::Spec->canonpath ("$output_dir/"
+		. basename ($file));
+	  # Install the missing file.  Symlink if we
+	  # can, copy if we must.  Note: delete the file
+	  # first, in case it is a dangling symlink.
+	  $message = "installing `$fullfile'";
+	  # Windows Perl will hang if we try to delete a
+	  # file that doesn't exist.
+	  unlink ($fullfile) if -f $fullfile;
+	  if (! $copy_missing)
+	{
+	  print STDERR "aclocal: symlink '$fullfile' -> '$file'\n"
+		if $verbose;
+	  if (! symlink ($file, $fullfile))
+		{
+		  print $message, "; error while making link: $!";
+		}
+	}
+	  else
+	{
+	  print STDERR "aclocal: create '$fullfile' <- '$file'\n"
+		if $verbose;
+	  my $out = new Automake::XFile "> $fullfile";
+	  print $out $file_contents{$file};
+	}
+	}
+  return;
+}
+
   # We used to print `# $output_file generated automatically etc.'  But
   # this creates spurious differences when using autoreconf.  Autoreconf
   # creates aclocal.m4t and then rename it to aclocal.m4, but the
@@ -555,10 +602,13 @@
 Generate `aclocal.m4' by scanning `configure.ac' or `configure.in'
 
  --acdir=DIR       directory holding config files
+  -a, --add-missing add missing macro files to package
+  -c, --copy        copy missing files (default is symlink)
  --help            print this help, then exit
  -I DIR            add directory to search list for .m4 files
  --force           always update output file
  --output=FILE     put output in FILE (default aclocal.m4)
+  --output-dir=DIR  put output in DIR (default m4/)
  --print-ac-dir    print name of directory holding m4 files
  --verbose         don't be silent
  --version         print version number, then exit
@@ -585,6 +635,18 @@
 	{
 	  $output_file = $1;
 	}
+  elsif ($arglist[0] =~/^--output-dir=(.+)$/)
+	{
+	  $output_dir = $1;
+	}
+  elsif ($arglist[0] eq '--add-missing' || $arglist[0] eq '-a')
+	{
+	  $add_missing = 1;
+	}
+  elsif ($arglist[0] eq '--copy' || $arglist[0] eq '-c')
+	{
+	  $copy_missing = 1;
+	}
   elsif ($arglist[0] eq '-I')
 	{
 	  shift (@arglist);


Re: /my/bin/aclocaldir

2006-05-24 Thread Guido Draheim


Stepan Kasal wrote:
 Hello,
 
 On Wed, May 24, 2006 at 03:17:23AM +0200, Guido Draheim wrote:
 [...] I am making it up as a patch over standard 'aclocal' - it just
 adds a few options that imitate automake options -a and -c.
 
 the CVS version of aclocal has an option --install which is very
 similar to this.  Does it meet your needs?
 

Almost.
(a) Can't go and install cvs version everywhere.
Some patch over current distributed automake is a nice thing.
(b) Cvs version does a plain copy of the file - a symlink does
tell me which version was picked / where it is / and makes
faster turnarounds when making up ac macros.
(c) Looking at the cvs version I doubt that it will work as
there is a missing m4_include in the $install else path.

have fun,
-- guido    http://google.de/search?q=guidod




Re: RFE: allow for computed version number

2009-05-24 Thread Guido Draheim
Hint: ALL my packages derive the version number at
configure time by reading an external file, e.g.
VERSION, xxxrpm.spec or the version repository. Everything
else means hacking around whatever new peculiarities
some autoconf/automake release may have. Whatever lifts
the requirement for hacking is a good approach.
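
A minimal sketch of such a configure-time version read (assuming a plain
VERSION file in the source tree and autoconf's m4_esyscmd; the package
name is made up):

```m4
AC_INIT([mypackage],
        [m4_esyscmd([tr -d '\n' < VERSION])])
```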

Good luck, Guido


Bruno Haible schrieb:
 Hi all,
 
 It has been mentioned in a discussion [1][2][3]
   In the medium to long run, Autoconf should be changed to not depend
at autoconf run time upon a volatile version string.
 and
   the goal is that the version string should _not_ appear in
config.h, so there should be _no_ configure output that changes in content
due to a version string change.
 
 As a first step in this direction, I'd like to propose a change to
 AM_INIT_AUTOMAKE that makes it possible to explore various approaches
 to this problem.
 
 Table of contents:
 1) Why some people want computed version numbers
 2) AC_INIT, AM_INIT_AUTOMAKE and the version number
 3) Why AC_INIT(PACKAGE, VERSION) is bad
 4) Temporary hack #1
 5) Temporary hack #2
 6) Proposal for AM_INIT_AUTOMAKE
 
 
 Why some people want computed version numbers
  =============================================
 
 At the time autoconf was designed, network bandwidth was limited, so
 package maintainers created new releases once in ca. 6 months on average.
 Using autoconf to propagate the version number into every Makefile seemed
 adequate, since the version number did not change often.
 
 Nowadays, it is customary to make prereleases on a weekly or more frequent
 basis. So the version number changes more often.
 
 For users of a distributed VCS, such as 'git', a new release is effectively
 published each time the developer does a git commit; git push. Thus the
 version number can easily change 20 times a day.
 
 Now, if the propagation chain of the version number into the compiled programs
 goes like this:
 
 configure.ac or version.sh or .git/*
  |
  | aclocal, autoconf
  V
 configure
  |
  | ./configure or ./config.status --recheck
  V
 config.status
  |
  | ./config.status
  V
 Makefile, config.h
  |
  | make
  V
 program.o
  |
  | make
  V
 program
 
 a change in the version number will cause a modification to 'configure',
 then a modification to 'config.status',
 then a modification to 'config.h',
 then a modification to all object files,
 then a relinking of all programs.
 
 In other words, the entire package is rebuilt. This is acceptable once in 6
 months, but not 20 times a day. It just gets in the way of the developer.
 
 So people want to store the version number in a file other than configure.ac,
 so that they control the dependencies.
 
 configure.ac                                version.sh or .git/*
  |                                           |
  | aclocal, autoconf                         |
  V                                           |
 configure                                    |
  |                                           |
  | ./configure or ./config.status --recheck  |
  V                                           |
 config.status                                |
  |                                           |
  | ./config.status                           |
  V                                           |
 Makefile, config.h                           |
  |                                           |
  | make                                      |
  V                                           |
 program.o                                   version.o
  |                                           |
  | make                                      |
  V                                           V
 program                                     program
 
 In this scenario, when the version number changes, a single file version.c
 is recompiled, all programs are relinked, and the documentation is rebuilt.
 
 Developers who don't care about the output of program --version on their
 own system may reduce this propagation even more: They will typically only
 make the 'info' formatted documentation depend on the version file (since
 these are the only generated files that contain a version number and which
 are distributed by make distrib).
 
 So, through cutting these 

Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Guido Draheim
 
 
 [About GNU make]
 
  GNU make is a basic component of the GNU system, actively maintained and
  developed, well documented, and required by other very important projects
  (Linux Kernel and Git DVCS, for example).
 
  GNU make is very portable, easy to compile, and fully bootstrapping (its
  build system does not require a pre-existing make installation).
  
  GNU make is the default make program on GNU/Linux (okay, we're in
  full platitude land here, but I value completeness in this issue).
  
  GNU make is readily available in the FreeBSD port collection (and it's
  required by many other ports to compile, see
http://www.freebsd.org/doc/en/books/developers-handbook/tools-make.html
  for more info).
 
  GNU make is available as a NetBSD package, for many architectures
  and versions; for example:
   
 ftp://ftp.netbsd.org/pub/pkgsrc/packages/NetBSD/powerpc/4.0/devel/gmake-3.81.tgz
   
 ftp://ftp.netbsd.org/pub/pkgsrc/packages/NetBSD/i386/5.0/devel/gmake-3.82nb1.tgz
   
 ftp://ftp.netbsd.org/pub/pkgsrc/packages/NetBSD/sparc64/5.1/devel/gmake-3.81.tgz
  
 
  GNU make should also be available as a Solaris precompiled package:
http://www.sun.com/software/solaris/freeware/
  or as an unofficial pre-compiled binary:
http://www.sunfreeware.com/programlistintel10.html.
 
  In conclusion, it's not unreasonable to expect that a user ready to
  compile a package from sources will also be ready and capable to obtain,
  compile and install a non-ancient version of GNU make.
 

Hi folks,

I haven't seen a reference yet to makefile engines that cannot simply be replaced.

In reality all the previous makefile-systems (shipping along Solaris,
AIX, etc) have been subverted by GNU-Make - in almost any company that
I came across they had installed gmake in parallel with the native
makefile-system. And in the OpenSource world one can see many BSD packages
that have a build dependency on gmake these days.

However, there are systems where gmake cannot be run as a substitute;
this is at least true for
* CmSynergy - ships objectmake (omake) integrated with the build system.
* ClearCase - ships clearmake integrated with the build systems.
and there are sure some other build systems that have a frontend to
the user allowing for makefiles as the backend.

These alternate makefile-systems had been modelled after standard
make plus some extensions which are usually compatible with GNU
Make. GNU Make has set the standard but GNU Make is NOT FREE to be
used as the backend of these non-GPL makefile-systems. Cough up a
free libgmake and you can assume its standard will be available
everywhere in a short time. But that's not how it is.

cheers, Guido




Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Guido Draheim
Am 12.01.2011 19:01, schrieb Stefano Lattarini:
 
  GNU make is a basic component of the GNU system, actively maintained and
  GNU make is very portable, easy to compile, and fully bootstrapping
  GNU make is the default make program on GNU/Linux (okay, we're in
  GNU make is readily available in the FreeBSD port collection (and it's
  GNU make is available as a NetBSD package, for many architectures
  GNU make should also be available as a Solaris precompiled package:
 
  In conclusion, it's not unreasonable to expect that a user ready to
  compile a package from sources will also be ready and capable to obtain,
  compile and install a non-ancient version of GNU make.
 
  I don't think that requiring GNU make (not a specific version, just
  a less-than-fifteen-years old one) is gonna be a harsh or unreasonable
  requirement.  And the gains in terms of maintainability, testability,
  and possibility of optimization are obvious.
 
 

Sure GNU Make has subverted most other makefile-systems but there are places
where it CAN NOT be used as a drop-in replacement. The paramount examples are
in larger build-systems with an integrated makefile-system - some IDEs for
embedded targets may have it but I do more think of

* CmSynergy - ships with omake (object make) integrated with version control
* ClearCase - ships with clearmake integrated with its version control

That's more than a third of the version control systems used in the
corporate world. So unless automake would officially dump the support
for the corporate world (which some GNU roughnecks might argue for) it
would be good to NOT allow GNUMake features all around.

Note however that both the systems above do have some extensions over
standard make that are always compatible with GNU Make - simply because
GNU Make has set the de facto standard. So it would be very fair to
assume a specific set of features on top of standard make that are
prevalent in modern makefile-system all around.

But GNU Make on its own has a very large feature set that is strong enough to
be a build system on its own. You don't need much of autoconf/automake anymore!!
And this is not just theory, because I have already created such a system for
a company - you can run configure-tests with make rules in GNU make. That's
because you can include make-generated makefile snippets at runtime, which is
impossible for most other makefile systems. (Originally that feature was meant
to allow 'make deps' generating *.dep's and compiling the final target in one
make call. Here you can use it to print defines to a makefile snippet and load
it right away.)
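
The include-a-generated-snippet trick can be sketched as follows (file and
variable names are assumptions): GNU make rebuilds config.mk via its rule
and then re-executes itself with the fresh definitions loaded.

```make
# GNUmakefile sketch: a configure-style test run by make itself
-include config.mk

all:
	@echo "HAVE_CC=$(HAVE_CC)"

# the test: does $(CC) compile a trivial program?  the result is
# written to config.mk, which make then includes and re-executes on
config.mk:
	@if echo 'int main(void){return 0;}' | \
	    $(CC) -x c - -o /dev/null 2>/dev/null; \
	then echo 'HAVE_CC = yes' > $@; \
	else echo 'HAVE_CC = no'  > $@; fi
```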

cheers, Guido



Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Guido Draheim
Am 13.01.2011 19:50, schrieb Bob Friesenhahn:
 On Thu, 13 Jan 2011, Guido Draheim wrote:
 * ClearCase - ships clearmake integrated with the build systems.
 and there are sure some other build systems that have a frontend to
 the user allowing for makefiles as the backend.
 
 FYI, ClearCase's clearmake is actually based on GNU make source code. At
 least it was back when I was using it.

Well, I had years to study the differences, and there are many. What you
are thinking of is '-C gnu', which disables any GNU-incompatible features,
defines some GNU-Make-compatible default rules, and understands some more
GNU-Make-like options. But I wouldn't think clearmake is GNU-make based.
To be sure, however, I do not know how the gnu emulation is implemented :-/