Re: Problem with circular dependencies
Jules Colding cold...@venalicium.dk writes: Anyways, thanks for the advice. I'll read the paper on evil recursiveness and ponder what to do. Non-recursive make is indeed a nice paradigm for various reasons, but if you've read earlier threads on this list, actually _implementing_ it is not quite that trivial. You're essentially using a global namespace for everything, and you've got to write your makefiles with that in mind, and that can cause some wrinkles and inconveniences. There are tradeoffs... -miles -- Cannon, n. An instrument employed in the rectification of national boundaries.
Re: Issues with subdir-objects and differing versions of automake
So what do you think? Is my characterization reasonable/unreasonable, am I missing something, ... or ...? Any other opinions? -miles -- `Suppose Korea goes to the World Cup final against Japan and wins,' Moon said. `All the past could be forgiven.' [NYT]
Re: Issues with subdir-objects and differing versions of automake
Stefano Lattarini stefano.lattar...@gmail.com writes: It seems to me that the situation there has improved a lot in the recent years, to the point that recursive and non-recursive build support is almost on-par (and the non-recursive option is the recommended one). Or are you referring to documentation issues rather than the coding/design ones? Well I suppose it could entirely be a documentation issue (or an "I'm just blind/stupid" issue :) ... even the best support in the world can't help you if you don't know about it ... :] I have searched a little bit for people's automake-based non-recursive solutions, and the impression I've gotten is that while it's certainly doable, it's more fiddly and less straightforward than the traditional recursive method. [I'm also using Debian unstable's automake (v 1.13.3), so I can't easily use features that are only in very recent automake versions, e.g. %reldir%.] In part this is because automake makes recursive makefiles _so_ easy and clean -- all you have to do is provide SUBDIRS, and then you get a per-directory local space to play in, and can just write your Makefile.am locally without worrying about other directories except when you want to. With non-recursive Makefiles, on the other hand, you have a single global namespace, even if you split the actual rules into included per-subdirectory Makefile.am fragments. Like all global namespaces, this means you need to worry about conflicts and interactions etc, and need to write everything from a global perspective. In particular: 1. Even when make-fragments are physically located in subdirectories, their rules still need to state everything globally, e.g., one must write libblargh_a_SOURCES = blargh/file1.c blargh/file2.c ... Not a huge deal for very small libraries, but decidedly bloated if there are 50 files in a library. This holds for pretty much every rule... [If one has access to newer versions of automake, one can write e.g. 
%D%/file1.c instead, but this is still pretty cumbersome and ugly.] 2. You can't just change AM_CFLAGS to add special changes appropriate for a subdirectory; you need to use per-library CFLAGS, which comes with its own annoyances. In particular, it results in ugly prefixes being added to the names of .o files even when not necessary... my .o file names end up being like 25 characters long as a result! None of this is _fatal_ -- it's still obviously possible to just bite the bullet, jump through the hoops, and get a working non-recursive build -- but neither is it particularly nice... One of automake's great strengths, traditionally, is the ability it gives one to write straightforward and highly readable/maintainable Makefile.ams, without all the boilerplate and nonsense required for traditional Makefiles. However, using the non-recursive style with current automake seems in some ways a step backwards, requiring more boilerplate and more fiddliness, ceding some of automake's traditional advantages... When I recently completely revamped my project's source tree, I opted for subdirectories with traditional recursive automake -- it was just so _easy_, whereas non-recursive automake, well, that seemed to require thought and care. I hope I can switch to non-recursive style in the future when I have some time to play with it, but it seems less straightforward and more work. As for how to _address_ these issues, I dunno... ideally, I'd like to be able to think locally when writing a subdirectory Makefile.am, even if the final build mechanism ends up being a single global Makefile. Maybe it's too hard to fully automate that, but still, better tools for writing Makefile.am fragments would be useful. E.g., automatically add %RELDIR% to all relative filenames in this [Makefile.am fragment] file, and maybe cleaning up library-CFLAGS to get rid of the need for prefixes (if I'm using subdir-objects anyway, why should I need prefixes?!). -miles -- 永日の 澄んだ紺から 永遠へ
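To make the global-namespace point concrete, here's a minimal sketch of the non-recursive style being described; the library and file names are hypothetical, and it assumes the 'subdir-objects' automake option is enabled:

```makefile
# Top-level Makefile.am: pull each subdirectory's rules in as a fragment.
include blargh/local.mk

# blargh/local.mk: even though this fragment lives in blargh/, every
# path and variable must be written from the top-level, global view.
noinst_LIBRARIES += blargh/libblargh.a
blargh_libblargh_a_SOURCES = blargh/file1.c blargh/file2.c blargh/blargh.h
# Per-library flags like this are what trigger the prefixed object
# names (blargh_libblargh_a-file1.o) complained about above.
blargh_libblargh_a_CFLAGS = $(AM_CFLAGS) -I$(srcdir)/blargh
```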
Re: [OMPI devel] GNU Automake 1.14 released
Jeff Squyres (jsquyres) jsquy...@cisco.com writes: We've been using sym links in the OMPI project for years in order to compile a series of .c files in 2 different ways. It's portable to all the places that we need/want it. Hmm, how about just cp ...? :] -miles -- 80% of success is just showing up. --Woody Allen
Re: Issues with subdir-objects and differing versions of automake
Diego Elio Pettenò flamee...@flameeyes.eu writes: I would also argue for just using non-recursive automake, but it might be the least of your problems for now. "Just" is probably not the right term, as it generally seems to require more work to make a good non-recursive build setup with automake, even if it's functionally superior in the long run. Automake's support for the recursive style is pretty good, whereas its support for non-recursive subdirs is, at best, kind of sketchy... -miles -- Quotation, n. The act of repeating erroneously the words of another. The words erroneously repeated.
Re: Micro releases and testsuite work
Peter Rosin p...@lysator.liu.se writes: no one has bothered to check, a user on that platform is going to be less than impressed and would probably not trust the new micro version and would thus be inconvenienced indeed. I think 99.99% of users who download and build packages don't run the test-suite... :] [Big users like distro maintainers probably do, but they also are more likely to follow up on problems (report the bug, debug it, etc), and realize that it's just a test-suite issue.] -miles -- One of the lessons of history is that nothing is often a good thing to do, and always a clever thing to say. -- Will Durant
Re: Where Do All These Macros Come From?
Paul Smith psm...@gnu.org writes: Other macros might have been created specifically for a given project; they will be contained in files in that project directory. Other macros might be defined by other third-party software you are trying to work with (texinfo, various libraries, etc.) Those will be defined by those packages. ... a widely-used example of this is the PKG_CHECK_MODULES macro, which is defined by the pkg-config package. -miles -- Scriptures, n. The sacred books of our holy religion, as distinguished from the false and profane writings on which all other faiths are based.
Re: What gets distributed in presence of conditional statements?
Andrey Borzenkov arvidj...@gmail.com writes: Alternative is to place EXTRA_DIST outside of conditionals, but this is more intrusive probably. Why do you think it would be more intrusive? The only function of EXTRA_DIST is to specify files that should be put in the tarball, for files which aren't automatically included... From your description, it sounds like you want these files to always be distributed, so unconditionally adding them to EXTRA_DIST sounds perfect. -miles -- Neighbor, n. One whom we are commanded to love as ourselves, and who does all he knows how to make us disobedient.
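A minimal sketch of what's being suggested (conditional and file names hypothetical): build conditionally, distribute unconditionally.

```makefile
# Makefile.am: the program is only built when the conditional is true...
if BUILD_FOO
bin_PROGRAMS = foo
foo_SOURCES = foo.c
endif
# ...but EXTRA_DIST sits outside the conditional, so foo.c and its
# helper script always end up in the distribution tarball, regardless
# of how the tree happened to be configured when 'make dist' ran.
EXTRA_DIST = foo.c foo-helper.sh
```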
Re: Creating plain Makefiles with automake
Marko Kreen mark...@gmail.com writes: My experience adding custom make rules to Makefile.am (which AFAICT, mostly just passes them through) is that I typically don't need to use obscure features, mostly it's just bog-standard make rules with some shell-scripting in the build recipe. Well, a user might want a few of the following features for their own rules: - out-of-tree builds - cross-compilation - nice output - automatic deps In many cases, automake features work fine even for custom rules... :] E.g., you're free to use $(srcdir), $(AM_V_CC), etc, in your rules. -miles -- values of β will give rise to dom!
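For instance, a hand-written rule in a Makefile.am can pick up automake's out-of-tree and silent-rules support just by using the standard variables; the generator script and file names here are hypothetical:

```makefile
# $(srcdir) keeps VPATH / out-of-tree builds working; $(AM_V_GEN)
# prints a terse 'GEN opcodes.h' line under silent rules.
BUILT_SOURCES = opcodes.h
EXTRA_DIST = opcodes.def gen-opcodes.sh
CLEANFILES = opcodes.h

opcodes.h: $(srcdir)/opcodes.def $(srcdir)/gen-opcodes.sh
	$(AM_V_GEN)$(SHELL) $(srcdir)/gen-opcodes.sh \
	  $(srcdir)/opcodes.def > $@
```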
Re: Creating plain Makefiles with automake
Marko Kreen mark...@gmail.com writes: IMHO, building via portable makefiles is a bad idea, for a quite simple reason: user rules. Yes, you can create some default targets that are generated for the user, portably. But as soon as the user needs to write their own rules, or even write logic in the makefile, then what? It depends. automake passes through user rules, but automake takes care of 99% of the really hairy stuff which might ordinarily tempt one to use non-portable constructs. My experience adding custom make rules to Makefile.am (which AFAICT, mostly just passes them through) is that I typically don't need to use obscure features, mostly it's just bog-standard make rules with some shell-scripting in the build recipe. -miles -- We have met the enemy, and he is us. -- Pogo
Re: Creating plain Makefiles with automake
The current _user-interface_ (./configure ...ARGS...; make) also has the advantage of making it pretty clear where one specifies configuration options, and retaining those options during normal rebuilds. With a make-only approach, where do I specify configuration options? make OPT1=yes ...? If so, do I need to re-specify those arguments every time I invoke make? If not (if they're somehow magically recorded somewhere), do options specified to different make invocations accumulate? If so, how do I reset them? ...etc etc... The configure/build split actually seems pretty useful... [Having a way of making a default configure happen automatically if one just does make without configure would be useful, I suppose, but that probably just needs a trivial little Makefile which gets overwritten by configure.] -miles -- Omochiroi!
bug#13578: [IMPORTANT] Savannah issues
Stefano Lattarini stefano.lattar...@gmail.com writes: So we should maybe go (after the next major release) with this naming scheme for the branches? * maint - for next micro version * stable - for next minor version * master - for next major version That seems to match common practice, insofar as I understand it... [Another consideration is whether you have a single named branch for maintenance (e.g. maint, and stable), or just use version-named branches (and thus can maintain multiple versions simultaneously).] -miles -- Future, n. That period of time in which our affairs prosper, our friends are true and our happiness is assured.
bug#13578: [IMPORTANT] Savannah issues
Stefano Lattarini stefano.lattar...@gmail.com writes: And while you *might* have changed my mind before (because you have valid points, and maybe it would have been better to err on the side of safety), I have now already rewritten maint, so rather than messing up by rewriting it again (to its old value, granted, but a rewrite nonetheless) and reverting an already made decision (and made after considerable discussion and not negligible efforts), I'd rather stick with the current minor mess. Rewriting to the old value makes a _huge_ difference (at least with git), because people that haven't done a pull or whatever of the new value will then have no problem at all. So whether another rename causes more or less pain depends on what proportion of people that have a pointer to your repository do frequent updates. As automake seems to be the sort of project that has mostly casual contributors, I'd wager many people _haven't_ pulled the changed version, and so would be _helped_ by a rename-to-the-old-value (and hurt by not doing so). -miles -- Arrest, v. Formally to detain one accused of unusualness.
Re: bug#13578: [IMPORTANT] Savannah issues
Stefano Lattarini stefano.lattar...@gmail.com writes: You might have good points, and possibly even be completely right... But I must ask, why didn't you step up during the lengthy discussion about this change, nor objected during the delay (almost a week) that was deliberately let pass between the decision and the implementation -- precisely to let this kind of late objections to come out? I just didn't notice the name change... -miles -- Egotist, n. A person of low taste, more interested in himself than in me.
bug#13578: [IMPORTANT] Savannah issues
Just that by far the most common branch setup in git repos seems to be using master as the dev trunk, with releases, release candidates (etc) on special branches. There are often additional feature branches for even more speculative changes, but master is generally not really safe, even if it's not the most dangerous branch. So master tends to be a sort of middle state (between release/release-candidate branches and speculative feature branches), stuff that is slated for the next release, and has received review -- but may still have some bugs to be shaken out. For complicated long-term changes, people often do development on special feature branches, but smaller and more straight-forward changes generally get put into master directly. Master branches break with some regularity. If you're familiar with gcc's subversion repo setup, it's pretty similar to this (with the subversion trunk being master). Git's own repo actually does this too. Thanks, -miles -- Cat is power. Cat is peace.
Re: [IMPORTANT] Savannah issues
Stefano Lattarini stefano.lattar...@gmail.com writes: * maint - master * master - next Damn, not really. For some questionable reason, Savannah is rejecting my non-fast-forward push to master even if I specify '--force', and I cannot use the usual "delete the remote branch, then push the local one to it" trick that I typically use to work around this problem, since 'master' is the current branch of the remote repository, and that cannot be deleted to avoid confusing git clone. So *THE AUTOMAKE GIT REPOSITORY ON SAVANNAH IS CURRENTLY IN AN INCONSISTENT STATE* (not broken, mind you, merely inconsistent with our new declared policies), and should not be used until this issue is resolved. I don't have time to look into this presently, I had time today, so I submitted a Task in the Savannah interface: https://savannah.gnu.org/task/index.php?12497 What's the point of this renaming, anyway? It doesn't seem to make any functional difference what the names of the branches you use for dev sources and releases are -- and besides being a practical problem, the scheme you've chosen doesn't follow common git practice, so will be surprising/confusing to people... -miles -- You can hack anything you want, with TECO and DDT.
bug#13578: [IMPORTANT] A new versioning scheme for automake releases, and a new branching scheme for the Git repository
2013/2/12 Stefano Lattarini stefano.lattar...@gmail.com: Mostly fair points; but the biggest issue with this proposal (not sure why I didn't think of it before, sorry) is that it is not at all clear that a version like 1.13.0.1 is supposed to be a beta release. People will easily mistake it for a stable release. How about this: pick whatever scheme you like for other reasons, and then add -beta to those version numbers. In other words, a purely informational suffix, which is not actually necessary for version sorting... (note that the a, b, etc, suffixes have the same issue) -miles -- Cat is power. Cat is peace.
Re: bug#13578: [IMPORTANT] A new versioning scheme for automake releases, and a new branching scheme for the Git repository
2013/2/12 Stefano Lattarini stefano.lattar...@gmail.com: But what if we want to have multiple betas for, say, Automake 1.14? Today, we can just have 1.13b, 1.13d, 1.13f, ...; how can we do so with the scheme you are proposing? There's always 1.14.0.1, ... Yuck; the new versioning scheme is done exactly to avoid that kind of overly long version numbers Well, I agree in general that too many components is yucky, but keep in mind that these _aren't releases_, so assigning them awkward version numbers doesn't really seem all that annoying. These really aren't part of the historical record. The existing naming scheme for betas does the same thing (uses weird version numbers), but is problematic because it's not mechanically consistent with ordinary version numbers (and so screws up cases such as packaging software). I do agree that removing the leading 1. might be a good idea if it's meaningless in practice. Linux's similar action was good. -miles -- Cat is power. Cat is peace.
Re: bug#13578: [IMPORTANT] A new versioning scheme for automake releases, and a new branching scheme for the Git repository
Nate Bargmann n...@n0nb.us writes: I was advised by a Debian maintainer to use tilde '~' as the separator as any text following it will be considered older. For example, in our project 'Hamlib-3.0~git' is older than 'Hamlib-3.0' will be once released. A hyphen or underscore trips this logic up, as I understand it, for both .deb and .rpm formats. This is a Debian-specific syntax, for use in Debian package version numbers. It's a handy way for the Debian maintainer to directly represent various existing package version naming schemes with only mechanical changes (typically, replacing a _ or - with ~), but I don't think there's any intent that upstreams should adopt this syntax directly (though I suppose it doesn't particularly hurt if they do...). -miles -- Abstainer, n. A weak person who yields to the temptation of denying himself a pleasure. A total abstainer is one who abstains from everything but abstention, and especially from inactivity in the affairs of others.
Re: bug#13578: [IMPORTANT] A new versioning scheme for automake releases, and a new branching scheme for the Git repository
Stefano Lattarini stefano.lattar...@gmail.com writes: But what if we want to have multiple betas for, say, Automake 1.14? Today, we can just have 1.13b, 1.13d, 1.13f, ...; how can we do so with the scheme you are proposing? There's always 1.14.0.1, ... Or the 1.13.99 scheme widely used in FOSS... [sometimes they start at 90, to leave room for updates, but I suppose you could always just use .99.1, .99.2, ...] -miles -- We are all lying in the gutter, but some of us are looking at the stars. -Oscar Wilde
bug#13524: Improving user experience for non-recursive builds
... and canon_reldir means the same thing, except canonicalized? Yes, canonicalized in a sense quite specific to Automake: http://www.gnu.org/software/automake/manual/automake.html#Canonicalization So, for example, if %reldir% expands to 'foo/bar-baz.d', '%canon-reldir%' will expand to 'foo_bar_baz_d'. Hmm, if that's the case, then I think canon is the wrong term to use, as it typically implies that the result is still in the same domain as the input. This operation seems to be more what one might call sanitizing... [Even if automake uses this term internally, I still think it would be confusing to expose such unusual usage to the user.] -miles -- Cat is power. Cat is peace.
bug#13524: Improving user experience for non-recursive builds
Hmm, if that's the case, then I think canon is the wrong term to use, as it typically implies that the result is still in the same domain as the input. Suggestions for a better name then? Dunno... something like RELDIR_SYM would make sense ... it's a symbol corresponding to RELDIR... -miles -- Cat is power. Cat is peace.
bug#13524: Improving user experience for non-recursive builds
%...% seems nice to me. I don't think typability should be a prime factor in deciding, especially for such trivial issues as shifted characters (like 75% of punctuation in Makefiles is shifted on most keyboards); readability is _much_ more important (and readability in many cases means not too long, especially for something which is likely to appear multiple times in file lists etc...). I don't like the AM_ variants: automake input files are obviously in the automake language, so it seems silly and redundant to include AM_. Incidentally, given the name, I assume the name reldir always refers to a relative path? What is it relative to again? If I want to refer to a source file, do I write $(srcdir)/%reldir%/filename (as opposed to e.g. $(top_srcdir)/%reldir%/filename)? ... and canon_reldir means the same thing, except canonicalized? [In other words, still always relative, e.g. by converting to an absolute canonical name using some sort of truename function, and then removing the source-directory prefix.] Thanks, -miles -- Cat is power. Cat is peace.
bug#13524: Improving user experience for non-recursive builds
Stefano Lattarini stefano.lattar...@gmail.com writes: E.g., if I have a directory foo that has sources etc, and builds some specific targets, then I can isolate the automake stuff for foo by using an include file foo/Makefile.am.inc or something, and then putting an appropriate include in the top-level Makefile.am. But it's a bit annoying, in that AFAICT, all filenames, etc, in foo's Makefile fragment must explicitly include the directory name. Yes, and this issue has come up several times already. Nobody has been bothered enough to attempt a patch, though, at least so far. E.g., if it builds a library, foo/Makefile.am.inc might look like: libfoo_a_SOURCES = foo/oink.c foo/barf.c foo/barf.h ... For longish directory names, this can really bloat things up... Someone (probably Eric Blake, but I'm not 100% sure) once noted that this issue could be mitigated with simple indirections with usual make macros: d1 = wow/a/very/very/insanely/long/directory/name wow_a_very_very_insanely_long_directory_name_prog_SOURCES = \ $(d1)/a.c $(d1)/b.c ... $(d1)/z.c Yes, that's the method I currently use, but it's pretty ugly, and kind of fiddly -- you need to use a unique macro name for every subdir, and practicality means probably using some variant of the directory name for that... meaning you probably have longish variable names in practice. Ugliness is better than broken but frankly I don't want it; one of automake's big attractions for me is that it allows one to write highly readable makefiles that are more easily maintainable because of their relative simplicity and lack of boilerplate (which hinders readability by obscuring significant content). Stuff like this feels very similar to typical Makefile boilerplate, and I think makes for more fragile, less maintainable Makefiles. 
This is probably too automatic; but Bob Friesenhahn suggested Automake could recognize special substitutions, like %CURDIR% and %XCURDIR%, so that you could simply write %XCURDIR%_prog_SOURCES = %CURDIR%/a.c %CURDIR%/b.c ... %CURDIR%/z.c This is less fragile, but still pretty grotty; that's going to result in makefile[-fragment]s that feel like they're full of boilerplate. Really, I'm thinking about something _more_ magic. I want to write filenames as I think about them: ..something...: a.c b.c ... z.c If something that does automagic munging of all filenames in a makefile fragment is too magic, at least maybe this could be done with some sort of simple rewrite mechanism to at least automate the common factor. E.g.: libfoo_SOURCES[libfoo/]: a.c b.c ... z.c This sort of thing would have the advantage of being relatively stupid (a simple explicit string rewrite), while still being able to eliminate much of the redundancy... Thanks, -miles -- 「寒いね」と話しかければ「寒いね」と答える人のいるあったかさ [俵万智]
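For comparison, this is roughly what the %reldir% / %canon_reldir% substitutions that shipped in automake 1.14 give you in an included fragment (directory and library names hypothetical):

```makefile
# foo/local.mk, included from the top-level Makefile.am; automake
# expands %reldir% to 'foo', and %canon_reldir% to 'foo' with any
# problematic characters (/, -, .) rewritten to '_'.
noinst_LIBRARIES += %reldir%/libfoo.a
%canon_reldir%_libfoo_a_SOURCES = \
	%reldir%/oink.c \
	%reldir%/barf.c \
	%reldir%/barf.h
```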
Re: combined report for make check?
Stefano Lattarini stefano.lattar...@gmail.com writes: The best solution is on the user-side IMHO: fix the build system to use less (ideally none) make recursion. Both the parallel and serial testsuite harness should support that setup OOTB. It would be nice if automake had some more features for that... E.g., if I have a directory foo that has sources etc, and builds some specific targets, then I can isolate the automake stuff for foo by using an include file foo/Makefile.am.inc or something, and then putting an appropriate include in the top-level Makefile.am. But it's a bit annoying, in that AFAICT, all filenames, etc, in foo's Makefile fragment must explicitly include the directory name. E.g., if it builds a library, foo/Makefile.am.inc might look like: libfoo_a_SOURCES = foo/oink.c foo/barf.c foo/barf.h ... For longish directory names, this can really bloat things up... It would be really cool if there was some way of telling automake hey, for every filename mentioned in this file, try to use a prefix of ... I dunno whether that would be associated with the include directive, with the makefile fragment, or what, but... anyway. Does automake have some feature like this that I've missed? Or has anybody thought about it? Thanks, -miles -- I'd rather be consing.
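Lacking such a feature, the usual workaround is a per-fragment prefix variable, e.g. a foo/Makefile.am.inc along these lines (names hypothetical):

```makefile
# Included from the top-level Makefile.am with:
#   include foo/Makefile.am.inc
# The $(foo_dir) variable at least avoids spelling the prefix out
# on every single filename.
foo_dir = foo
noinst_LIBRARIES += libfoo.a
libfoo_a_SOURCES = \
	$(foo_dir)/oink.c \
	$(foo_dir)/barf.c \
	$(foo_dir)/barf.h
```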
Re: looking for a good example of non-recursive Make using project
NightStrike nightstr...@gmail.com writes: If you include src/more/Makefile.am into src/Makefile.am (a perfectly valid thing to do), you will be unpleasantly surprised that src/more/Makefile.am has to actually know where it is in the source tree. It needs lines like this: prog_SOURCES += more/file3.c more/file4.c and **NOT** this: prog_SOURCES += file3.c file4.c It's really annoying. It means that renaming a directory is really hard. Yeah, it would be nice if automake had some sort of mechanism to allow more natural names in files included from subdirs... What that mechanism would be, though, I have no idea. Typically when I've done this, I've used the variable idea mentioned by Eric to at least make it less annoying, but it's still ugly and kind of a pain... [e.g. s = my/dir blah_OINK = ${s}/file1.cc ${s}/file2.cc ... ] -miles -- Selfish, adj. Devoid of consideration for the selfishness of others.
Re: naming convention for object files.
Nicolas Bock nicolasb...@gmail.com writes: libsomething_la_CPPFLAGS = -I../../ ... the naming changed from a.F90 -> libsomething_la-a.lo to a.F90 -> a.lo. Very strange. It's an annoying, but documented, effect of using per-library CFLAGS: when you do that, automake decides that it must generate unique object files for that library, and so uses the library name as a prefix for the object file. This is presumably because it's _possible_ -- although I think very rare -- to use the same source file in multiple libraries, and with per-library flags, the resulting object files may actually differ (whereas without per-library flags, they'd be the same). [I wish automake would only use the prefixes in cases where a source file is _actually_ used in multiple libraries; since I think it almost never happens that people actually do this, that would mean almost no object files would use the prefix. Maybe that's annoying to implement though...] You _can_ change the name of the prefix used, by setting the SHORTNAME attribute, e.g. libbozo_la_SHORTNAME = bozo -miles -- Once, adj. Enough.
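A sketch of the SHORTNAME workaround mentioned above (library name and include path hypothetical):

```makefile
noinst_LTLIBRARIES = libsomething.la
libsomething_la_SOURCES = a.F90
# Per-target flags are what trigger the object-name prefix...
libsomething_la_CPPFLAGS = -I$(top_srcdir)/include
# ...and SHORTNAME shrinks it: objects become sth-a.lo rather
# than libsomething_la-a.lo.
libsomething_la_SHORTNAME = sth
```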
Re: Dynamic package version numbers with Autoconf and Automake
Stefano Lattarini stefano.lattar...@gmail.com writes: Actually, it depends. Where and why do you use such dynamically-computed version number in exactly? That seems the real question. My own method is to have: (1) The primary version number is based on VCS info (this is obviously unavailable for source trees not based on a VCS checkout). (2) The autoconf version number (in AC_INIT) is used as a backup/default only when VCS info is unavailable. This number is relatively static, and typically only updated after a release. (3) The final version info is updated (using VCS info and/or autoconf version info) at make time using a script, and when it changes, only causes a source file (e.g., version.c) to change. This means that although some things are rebuilt after a commit (version.o, and relinking of any binaries that use it), the amount of rebuilding is relatively minor while still yielding accurate info. -miles -- Non-combatant, n. A dead Quaker.
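A rough sketch of that scheme as a make-time rule in Makefile.am (details assumed; the exact git invocation and file layout vary per project, and $(PACKAGE_VERSION) is automake's copy of the AC_INIT version):

```makefile
# Recompute the version on every make run, but only rewrite version.c
# (and hence rebuild/relink its dependents) when the string changes.
BUILT_SOURCES = version.c
CLEANFILES = version.c

version.c: FORCE
	$(AM_V_GEN)v=`git describe --tags --dirty 2>/dev/null \
	    || echo $(PACKAGE_VERSION)`; \
	  echo "const char *package_version = \"$$v\";" > version.tmp; \
	  cmp -s version.tmp $@ || mv -f version.tmp $@; \
	  rm -f version.tmp
FORCE:
```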
Re: distinguish automake 1.11 from 1.12+ at autoconf time
Ralf Corsepius ralf.corsep...@googlemail.com writes: My issue is ...rantrantblatherblather... Please start a new thread when your message has bugger all to do with the previous message. Thanks, -miles -- The car has become... an article of dress without which we feel uncertain, unclad, and incomplete. [Marshall McLuhan, Understanding Media, 1964]
Re: distinguish automake 1.11 from 1.12+ at autoconf time
Ralf Corsepius ralf.corsep...@googlemail.com writes: Pardon, may-be I am missing something, but in my understanding I am having the same issue as the OP: No, you were just looking for an excuse to start ranting... -miles -- Yorton, Wressle, and Gospel Oak, the richness of your heritage is ended. We shall not stop at you again; for Dr Beeching stops at nothing.
Re: GNU Automake 1.12.1 released
[following up my previous message] This is an earlier post I made, which analyzes the Lua string library w/r/t UTF-8: http://lua-users.org/lists/lua-l/2012-02/msg00241.html -Miles -- Selfish, adj. Devoid of consideration for the selfishness of others.
Re: compiling different files with different C compilers
NightStrike nightstr...@gmail.com writes: suppose that my program is composed of 2 files f1.c and f2.c. f1.c is written in C89 and f2.c in C99. I would like that f1.c is compiled with a C89 compiler and f2.c is compiled with a C99 compiler. How can I achieve that in Makefile.am ? Also, in configure.ac, which macro should I use ? AC_PROG_CC ? AC_PROG_CC_C99 ? Can you just pass different CFLAGS, -std=c89, -std=c99? Of course, that sort of just shifts the focus; one still needs to figure out: (1) How to ensure the compiler accepts those particular flags; writing an autoconf macro to test whether the compiler _accepts_ the options is easy enough, but what does one do when it doesn't...? Give up and tell the user to figure it out himself? (2) How to specify file-specific compiler flags -- something which automake is not so good at... the easiest way is to put files needing special treatment in their own library, but of course that's a bit clumsy (and even then, there are slightly annoying artifacts, like the library-object-file prefix automake adds when using library-specific flags). -miles -- Suburbia: where they tear out the trees and then name streets after them.
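For approach (2), a sketch of the "put it in its own library" trick (all names here are hypothetical, and I'm assuming the -std flags have already been verified, e.g. with AX_CHECK_COMPILE_FLAG from the autoconf archive):

```
## Makefile.am (sketch): isolate the C99 file in a convenience library
## so it can get its own flags.
noinst_LTLIBRARIES = libc99parts.la
libc99parts_la_SOURCES = f2.c
libc99parts_la_CFLAGS = $(AM_CFLAGS) -std=c99

bin_PROGRAMS = prog
prog_SOURCES = f1.c
prog_CFLAGS = $(AM_CFLAGS) -std=c89
prog_LDADD = libc99parts.la
```

Note that setting libc99parts_la_CFLAGS is exactly what triggers the library-object-file prefix mentioned above (f2.c compiles to libc99parts_la-f2.lo).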
Re: generated documentation
Werner LEMBERG w...@gnu.org writes: (1) If the user unpacks the tarball, the rules are not executed. (2) If the user does a bootstrap from the repository to do `make dist', they are needed. (3) If the user changes a doc source file in the tarball and wants to regenerate the documentation, the rules are needed too. ... (b) If the tests for pandoc and pdftex fail, disable the build rules or provide no-ops. This is OK for (1) and (3), but not satisfactory for (2). Why is (b) unsatisfactory for (2)? I'm presuming that when pandoc/pdftex are not found, not only are the standard makefile rules using them disabled, but alternative rules enabled which just say something like:

if have_pandoc
foo.html: foo.tex
	$(PANDOC) blah blah blah
else
foo.html: foo.tex
	echo 1>&2 "OMGOMG can't generate $@"
	echo 1>&2 "please install pandoc and reconfigure"
endif

-miles -- `Suppose Korea goes to the World Cup final against Japan and wins,' Moon said. `All the past could be forgiven.' [NYT]
Re: bug#11323: automake-1.11.4 regression
Ralf Corsepius ralf.corsep...@rtems.org writes: -EFAILMAINTAINER No need to be obnoxious. a) This kind of changes is inappropriate within a release series. b) You don't seem to be aware about the harmfulness of this change of yours. Please comprend that this change was a mistake and revert it! I think some actual data might be in order... _which_ packages break? In what way? [I mean: what's the actual effect on the package's use? In some cases a failure to create an empty directory might be a change in behavior, but not particularly critical...] Thanks, -Miles -- Bore, n. A person who talks when you wish him to listen.
Re: automake distcheck question
songbird songb...@anthive.com writes: That extra stuff is what will allow your package to work on the machines of your users without requiring them to have all the developer's tools you're using (autoconf, automake, aclocal, bison, flex). ok. i'm one of those people who hates extra files being left behind when they are autogenerated Hmm, be wary of gnulib then ... it's even _more_ sloppy than the autotools! :] -miles -- Idiot, n. A member of a large and powerful tribe whose influence in human affairs has always been dominant and controlling.
Re: bug#11034: Binutils, GDB, GCC and Automake's 'cygnus' option
Pedro Alves pal...@redhat.com writes: OK, you've all made clear you have your sensible reasons to have the '.info' ... it available only through the new, undocumented option named (literally) hack!info-in-builddir. I hope this is acceptable to you. ... *undocumented* option '!hack!info-in-builddir' (whose name should make it clear that it is not meant for public consumption). So will this be called a hack forever, or will the naming be revisited before a release? IMO, either the feature is sensible, and there doesn't seem to be a good reason other users couldn't also use it, and hence it should get a non-hackish name and be documented; or it isn't sensible, and then it shouldn't exist. Why the second-class treatment? I suspect there are better, cleaner, ways to accomplish the underlying goal, but I suppose the gcc maintainers don't want to spend the time fiddling around with their build infrastructure for such a minor issue... -miles -- Alone, adj. In bad company.
Re: dealing with executable shell scripts
Bob Friesenhahn bfrie...@simple.dallas.tx.us writes: This is fine and good for scripts which are formally installed, but while they (originals) are in the source tree, there are definite benefits for scripts to have a useful extension. This is particularly true if people build in the source tree so that scripts and binaries may be intermingled. Exactly. -miles -- Yo mama's so fat when she gets on an elevator it HAS to go down.
Re: dealing with executable shell scripts
Russ Allbery r...@stanford.edu writes: [Relying on source-code execute bits always being correctly maintained is one of those things that ... well... doesn't really feel very robust. I dunno, maybe it's just me...] Doesn't every package with a configure script rely on this? I suppose that people could chmod +x the configure script before running it, but I've never had to do that. You can just do sh configure... (and I think autoconf/automake are careful to never rely on the execute bits of helper scripts being set). Anyway, it's not really the same issue. configure is either part of an official distribution tarball (which is a relatively controlled environment) or explicitly built (where the build process can arrange for the execute bit to be set if appropriate). The shell-scripts in question, however, are source files, and so come directly via whatever mechanism you use to get source files -- tar, cp, random-vcs-xyz, etc. In many cases such mechanisms can preserve execute bits, but ... it doesn't feel quite right to rely on that always being the case. -miles -- Barometer, n. An ingenious instrument which indicates what kind of weather we are having.
Re: dealing with executable shell scripts
NightStrike nightstr...@gmail.com writes: The shell-scripts in question, however, are source files, and so come directly via whatever mechanism you use to get source files -- tar, cp, random-vcs-xyz, etc. In many cases such mechanisms can preserve execute bits, but ... it doesn't feel quite right to rely on that always being the case You aren't preserving anything in the example in your original post. You are explicitly setting the execute bits for the installed script. Right, that's exactly the point of the example. Putting anything in xxx_SCRIPTS does the same thing -- automake will ensure that they are set execute with $(INSTALL). Sure, that works for the final installed script, but right now the discussion was about the in-tree executable script. You are now saying that you want the source distribution to contain files with execute permissions, but you still haven't explained how you were doing that in your original example. No I'm not. I'm saying I want there to be an executable version of the _destination script_ in my build [often the same as the source] tree, so the developer can say something like ./myscript in the build tree to test it. In other words, I'd like my program to be usable in uninstalled mode, kinda like Emacs is. For normal executables, this is more or less the default state (if your program handles finding its data files etc, appropriately), and libtool goes to quite a bit of effort to make it work for dynamically linked executables; I'd like the same thing for script files. My original example gave one way of doing this: copy some source file to the destination executable script name, and make the latter executable. Other people have been sort of saying that maybe I should instead just make the source file itself executable, but I'm disagreeing. Your original example was copying myprog.sh to myprog and setting the execute bits for myprog. That does nothing to guarantee that myprog.sh also has execute bits in your source distribution. Correct. 
That's because my goal is not the latter. -Miles -- Twice, adv. Once too often.
Re: dealing with executable shell scripts
2012年3月21日13:13 NightStrike nightstr...@gmail.com: Here's a better question. How do you insure that your current file is executable? Do it the same way. Er...

cp $< $@
chmod +x $@

... :] [Relying on source-code execute bits always being correctly maintained is one of those things that ... well... doesn't really feel very robust. I dunno, maybe it's just me...] -miles -- Cat is power. Cat is peace.
Re: dealing with executable shell scripts
2012年3月21日8:33 NightStrike nightstr...@gmail.com: bin_SCRIPTS doesn't actually seem to do much of anything -- you still have to add your own rules to handle all the actual work, need to fiddle with EXTRA_DIST and CLEANFILES, etc. Indeed, doing what I You can avoid hacking EXTRA_DIST if you change your primary from: bin_SCRIPTS = aaa to: dist_bin_SCRIPTS = aaa That's going to distribute aaa, though, right, not the actual source e.g. aaa.sh? -miles -- Cat is power. Cat is peace.
Re: dealing with executable shell scripts
2012年3月21日9:32 NightStrike nightstr...@gmail.com: dist_bin_SCRIPTS = aaa That's going to distribute aaa, though, right, not the actual source e.g. aaa.sh? Yes. There's an earlier email in this thread from somebody illustrating that you don't need to morph from source to script if the file doesn't actually get changed. Is there a way to ensure that the raw aaa is executable in the source directory (I'd like everything to be usable whether installed or not). thanks, -Miles -- Cat is power. Cat is peace.
dealing with executable shell scripts
Is there a recommended way for dealing with binaries that are simple shell scripts in automake? I currently use something like the following:

bin_PROGRAMS = myprog
myprog_SOURCES = myprog.sh

myprog: myprog.sh
%: %.sh
	$(shbin_verbose)cp $< $@; chmod +x $@

shbin_verbose = $(shbin_verbose_$(V))
shbin_verbose_ = $(shbin_verbose_$(AM_DEFAULT_VERBOSITY))
shbin_verbose_0 = @echo SHBIN $@;

But the fact that I need to explicitly state the myprog: myprog.sh dependency makes it feel a bit hackish, and I'm wondering if there's some more proper method... Thanks, -miles -- 古池や 蛙飛び込む 水の音 [松尾芭蕉]
Re: dealing with executable shell scripts
Harlan Stenn st...@ntp.org writes: What's the problem with bin_SCRIPTS? Hmm, I didn't know about it, but ... reading the documentation, bin_SCRIPTS doesn't actually seem to do much of anything -- you still have to add your own rules to handle all the actual work, need to fiddle with EXTRA_DIST and CLEANFILES, etc. Indeed, doing what I did (putting the script in bin_PROGRAMS and adding a rule to build it) actually seems _easier_, as automake will do more automatically, and there's less to keep track of for the programmer -Miles -- In New York, most people don't have cars, so if you want to kill a person, you have to take the subway to their house. And sometimes on the way, the train is delayed and you get impatient, so you have to kill someone on the subway. [George Carlin]
Re: pkglibdir, pkgdatadir and program_transform_name
Stefano Lattarini stefano.lattar...@gmail.com writes: Any transformation of a $(pkg*dir) by $(program_transform_name) would be a bug; if you encounter such an issue, I'd be grateful if you report it. But I'm pretty sure the inconsistency you are seeing here is due to another reason (maybe some Makefile.am or configure.ac settings you're missing?) When I was googling earlier (due to this same thread on the autoconf mailing list), I found patches to automake to _implement_ such a transformation posted to the grub mailing list... so maybe it's a modified version of automake. -Miles -- Immortality, n. A toy which people cry for, And on their knees apply for, Dispute, contend and lie for, And if allowed Would be right proud Eternally to die for.
Re: pkglibdir, pkgdatadir and program_transform_name
Eric Blake ebl...@redhat.com writes: I think it's worth pursuing a patch to the GNU Coding Standards that allows a standardized configure option that allows one to specify an alternate package name, so that things like $(pkglibdir) become $(libdir)/$(alternate_package_name) A standardized option (described by the GNU Coding Standards) would not be a bad idea, but of course there's no need to wait for that to add the feature to autoconf... just having it there will help projects like grub. :] -Miles -- We live, as we dream -- alone
Wait, isn't rpath supposed to be set automagically?
I thought that as long as one used .la libraries, automake+libtool was supposed to handle all the grotty stuff like rpath automatically, adding -rpath $(libdir) if you depend on libraries installed to libdir and libdir isn't on the system library search path. [Yeah, I also know some people hate rpath, but ...] But ... it doesn't seem to. Is something broken, is there an option I should set... or? [I guess I can add -rpath blahblaha somewhere in Makefile.am, but I don't really want to add system-dependent stuff that libtool's supposed to be handling; isn't that why libtool exists in the first place?] automake version 1.11.3, libtool version 2.4.2 Thanks, -Miles

Example:

configure.ac:

AC_INIT([blah], [0.1], [bob])
AM_INIT_AUTOMAKE([foreign])
LT_INIT
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

Makefile.am:

bin_PROGRAMS = blah
lib_LTLIBRARIES = liboink.la
blah_SOURCES = blah.c
blah_LDADD = liboink.la
liboink_la_SOURCES = oink.c

blah.c:

extern void oink ();
int main () { oink (); }

oink.c:

void oink () { }

Commands:

$ autoreconf --install
...
$ ./configure
...
$ make
...
$ sudo make install
...
$ /usr/local/bin/blah
/usr/local/bin/blah: error while loading shared libraries: liboink.so.0: cannot open shared object file: No such file or directory
$ LD_LIBRARY_PATH=/usr/local/lib /usr/local/bin/blah
$ ldd /usr/local/bin/blah
	linux-vdso.so.1 => (0x7fff247c5000)
	liboink.so.0 => not found
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x7f5ceeb26000)
	/lib64/ld-linux-x86-64.so.2 (0x7f5ceeed1000)
$ readelf -d /usr/local/bin/blah | grep RPATH
$

-- Brain, n. An apparatus with which we think we think.
Re: Wait, isn't rpath supposed to be set automagically?
2012年2月27日0:58 Peter Johansson troj...@gmail.com: On which system do you experience this? I've seen this problem on Fedora and the problem was that the linker search path and the dynamic loader search path were different. IIUC libtool sets -rpath if a used library is outside linker path. Debian sid. I'm not sure how libtool determines whether or not the libdir is outside the linker path (as it obviously is, since execution fails... :). If it searches /etc/ld.so.conf, it might get confused because here it just says include /etc/ld.so.conf.d/*.conf -- but if it gets confused, I'd think it would default to assuming it was outside the search path, not to assuming it was in the search path. Of course it could be some debian patch that is causing the problem, but it sounds like not. [I've tried a little to examine libtool to see what it's doing, but it's ... very ... hard to read.] The solution for me was to include '/usr/local/lib' in '/etc/ld.so.conf' as suggested here http://tldp.org/HOWTO/Program-Library-HOWTO/shared-libraries.html. That's not generally a solution though, because it requires modifying /etc. I think it's desirable that it just work wherever it gets installed, and no matter who installs it (e.g. prefix=$HOME should work, and shouldn't require setting LD_LIBRARY_PATH). -miles -- Cat is power. Cat is peace.
Re: Wait, isn't rpath supposed to be set automagically?
2012年2月27日1:46 Peter Johansson troj...@gmail.com: I think it's desirable that it just work wherever it gets installed, and no matter who installs it (e.g. prefix=$HOME should work, and shouldn't require setting LD_LIBRARY_PATH). In my case it did work with prefix=$HOME because in that case -rpath was set. Have you tried with ./configure --prefix=$HOME? Hmm, I tried: configure --prefix=/tmp/oinker ... and it worked then! So why is /usr/local different? Hmm... How odd:

$ cat /etc/ld.so.conf
include /etc/ld.so.conf.d/*.conf
$ cat /etc/ld.so.conf.d/*.conf
/usr/lib/atlas
# libc default configuration
/usr/local/lib
...

So it looks like /usr/local/lib is _supposed_ to be searched. But hmm... grovel, grovel... Ok, the real problem seems to be that the system maintains a static cache of _library files_ (not directories) for ld.so to use, and simply adding a library doesn't update this cache. Just using the command: sudo ldconfig after installing my package makes everything work! Arg! I'll refrain from commenting on the people who designed this, but maybe a note in the automake manual saying "on linux systems, you may have to run the ldconfig program to make the system aware of newly installed dynamic libraries in system directories" would be a good thing... Thanks, and sorry for the noise! -miles -- Cat is power. Cat is peace.
Re: Wait, isn't rpath supposed to be set automagically?
2012年2月27日9:41 Bob Friesenhahn bfrie...@simple.dallas.tx.us: Just using the command: sudo ldconfig after installing my package makes everything work! This is a function that libtool normally performs if it is used properly. I did: sudo make install Is that not using it properly? -miles -- Cat is power. Cat is peace.
Re: Wait, isn't rpath supposed to be set automagically?
2012年2月27日16:21 Russ Allbery r...@stanford.edu: This is a function that libtool normally performs if it is used properly. I did: sudo make install Something needs to run libtool --mode=finish. I thought Automake normally arranged to run that at the end of make install. Is that not happening for some reason? I don't see anything like that in the Makefile.in generated by automake (using the example project from my initial message in this thread)... $ grep finish Makefile.in $ hmm -miles -- Cat is power. Cat is peace.
Re: allowing users to add source files without rerunning the autotools?
Stefano Lattarini stefano.lattar...@gmail.com writes: Still, things are not as easy as it would appear from your sample project. For example, with this rule: $(CC) -c $(CFLAGS) -DIM_STUUPD=1 $< you are losing some important features offered by automake -- most notably, the automatic dependency tracking and the configurable verbosity specification (silent-rules). Which might be OK in some circumstances, but unacceptable in others. Actually, although I provided my own compilation rule, the default rule works too, if the user considers that acceptable (compiler flags the same etc). You need at least one automake-controlled source-file of the same type (.c etc) for automake to generate it, but that's probably a safe bet for most projects. :) Since the default rule generates the dependency information into $(DEPDIR)/*.Po as Makefile fragments, one need only figure out some appropriate way to include fragments into the Makefile for the extra source files. What automake does for source files it knows about is just include $(DEPDIR)/srcfile.Po (apparently include is considered portable make?). The include directive apparently understands wildcards, but is a little tricky to use in the same way, because it fails for non-existent files (automake arranges to make sure the .Po files always exist by just sticking in code to explicitly create them all over the place, which is probably too annoying for this sort of use case). If GNU-make-specific features are OK, then one could just use -include, e.g. in my example -include $(DEPDIR)/stuupd-*.Po. Alternatively, a dummy .Po file could be created by configure that matches the above include pattern (and so prevents an error until some real .Po files exist to match it). Thanks, -Miles -- Carefully crafted initial estimates reward you not only with reduced computational effort, but also with understanding and increased self-esteem. -- Numerical methods in C, Chapter 9. Root Finding and Nonlinear Sets of Equations
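The GNU-make variant I have in mind looks roughly like this (a sketch only; the stuupd- naming follows my earlier example, the prog target is hypothetical, and an in-tree build is assumed so objects land next to their sources):

```
## Makefile.am fragment (sketch, GNU-make-specific): user-added source
## files are discovered at make time via wildcard, and their generated
## dependency fragments pulled in with -include, which -- unlike plain
## include -- tolerates nonexistent files.
stuupd_sources = $(wildcard $(srcdir)/stuupd-*.c)
prog_LDADD = $(stuupd_sources:.c=.o)
-include $(DEPDIR)/stuupd-*.Po
```

The default automake compile rule then builds the extra objects and writes their .Po fragments, which get picked up on the next run.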
Re: allowing users to add source files without rerunning the autotools?
Nick Bowler nbow...@elliptictech.com writes: Interestingly, if you actually stick a line exactly like the above into your Makefile.am, Automake will actually do The Right Thing™ and create the .Po stub as if you had actually specified the source file normally. Presumably you'd be relying on totally unsupported internal behaviour of Automake in this case, though. :) Heh. :) Another thing is that if the project is happy with discovering the extra source files at configure-time (as opposed to make-time like my example did), things probably become even easier, as the configure file can do more stuff. One interesting thing would be to add some simple automake feature to allow specifying this stuff explicitly via some interface (e.g. AM_EXTRA_SOURCE_FILES([blah.c barf.h]) in configure, or something like that). That might make it possible to really do this stuff correctly without too much work for the developer, by letting him take advantage of some of the lower-level automake machinery. Thanks, -Miles -- 1971 pickup truck; will trade for guns
Re: allowing users to add source files without rerunning the autotools?
2012/1/20 Bob Friesenhahn bfrie...@simple.dallas.tx.us: One interesting thing would be to add some simple automake feature to allow specifying this stuff explicitly via some interface (e.g. AM_EXTRA_SOURCE_FILES([blah.c barf.h]) in configure, or something like that). That might make it possible to really do this stuff correctly without too much work for the developer, by letting him take advantage of some of the lower-level automake machinery. This sort of thing is already well supported by Automake via Makefile includes. But it does require that automake be executed again. Right, my intent is something that _doesn't_ require that, which can be handled entirely by the configure script. [That's more or less the whole point of this thread, right?] -miles -- Cat is power. Cat is peace.
Re: allowing users to add source files without rerunning the autotools?
Bill Sacks sa...@ucar.edu writes: The dependency issue that Jeff raised is not a problem for us, since we have a script to determine Fortran 90 dependencies. I'm not sure that it will work to have a separate library of the user-added code, since we don't know ahead of time what dependencies there will be between existing code and the user-added code. Your other suggestions are helpful. We'll give this some more thought. I think the main thing to remember is that most traditional make facilities are still available, should automake's higher-level constructs not suffice, and the two can usually co-exist reasonably well. So whatever you did before for user-contributed stuff may well work perfectly fine in Makefile.am (maybe with a few tweaks). -miles -- Cabbage, n. A familiar kitchen-garden vegetable about as large and wise as a man's head.
Re: allowing users to add source files without rerunning the autotools?
Daily, Jeff A jeff.da...@pnnl.gov writes: Yes, all sources must be listed, AFAIK. You could write your own build rules and targets for these files, which would be copied by automake into your eventual Makefile, but that somewhat defeats the purpose of using a build tool if you're writing many of the rules yourself. Er, what? I think automake actually does a pretty good job of making custom rules painless and useful... Even if you use some custom rules for a few files in a Makefile.am, automake can still do about 99% of the heavy-lifting for a project. Typically my approach is to put all the funny stuff in its own library, and have the normal automake targets (99% of them) depend on that, with a few custom rules to dispatch the building of libfunny.a to its own Makefile or something (which can be generated separately by configure if necessary). -Miles -- /\ /\ (^.^) ()) *This is the cute kitty virus, please copy this into your sig so it can spread.
Re: Automake 1.11.2 released
Antonio Diaz Diaz ant_d...@teleline.es writes: Three years and three stable releases have passed since a patch (by Jan Engelhardt) adding dist-lzip support to Automake was sent to this mailing list[1], but Automake 1.11.2 doesn't yet mention lzip anywhere. By contrast xz support was added when xz was at an early beta stage. What's the difference between xz and lzip anyway...? I've never even heard of lzip, but the debian package description makes it sound very similar to xz... -Miles -- Americans are broad-minded people. They'll accept the fact that a person can be an alcoholic, a dope fiend, a wife beater, and even a newspaperman, but if a man doesn't drive, there is something wrong with him. -- Art Buchwald
Re: PCH support
Dave Hart daveh...@gmail.com writes: True, but most C/C++ #includes orders of magnitudes more lines than they contain themselves, so assuming the source code is rearranged to have a precomp.h containing the bulk of #includes, the compile will be notably faster. Faster enough to be worth the annoyance for the developer of twisting his source code to fit the pch style (which seems notably uglier)? Machines are very fast these days, and even using tons of big headers, C++ parsing doesn't seem such a big deal as it maybe used to. [AFAICT, slow compilation seems far more often connected with optimization than parsing.] -Miles -- Barometer, n. An ingenious instrument which indicates what kind of weather we are having.
Re: PCH support
2011/12/26 Olaf van der Spek m...@vdspek.org: Faster enough to be worth the annoyance for the developer of twisting his source code to fit the pch style (which seems notably uglier)? Yes. I'm not sure what twisting you're referring to, though. Another comment noted that PCH was often ineffective or even counter-productive unless the bulk of your includes are precisely the same between compilation units, and that in practice systems like VS try to get the user to define a single include everything header file (presumably instead of the normal practice of include the stuff you use). Sounds pretty darn ugly (and I expect makes compile times far worse if you _can't_ use PCH in some case)... -Miles -- Cat is power. Cat is peace.
Re: PCH support
2011/12/26 Olaf van der Spek m...@vdspek.org: On Sun, Dec 25, 2011 at 5:31 PM, Miles Bader mi...@gnu.org wrote: 2011/12/26 Olaf van der Spek m...@vdspek.org: Faster enough to be worth the annoyance for the developer of twisting his source code to fit the pch style (which seems notably uglier)? Yes. I'm not sure what twisting you're referring to, though. Another comment noted that PCH was often ineffective or even counter-productive unless the bulk of your includes are precisely the same between compilation units, and that in practice systems like VS try to get the user to define a single include everything header file (presumably instead of the normal practice of include the stuff you use). Sounds pretty darn ugly (and I expect makes compile times far worse if you _can't_ use PCH in some case)... Is someone forcing you to use PCH? I'm not sure what your point is. Er, of course not (where on earth did that come from)? My initial question was essentially is PCH still a good idea for the average developer? That basically involves examining the details of the tradeoff between benefits (increased compile speed; how much?) and drawbacks (awkward constraints on source style / organization; exactly what is needed to make PCH effective?). This is relevant to automake because the general utility of specialized PCH support in automake has to be weighed against the cost of that support (of course, maybe it's super trivial, I dunno). -miles -- Cat is power. Cat is peace.
Re: [gnu-prog-discuss] Could automake-generated Makefiles required GNU make?
Dave Hart davehart_gmail_exchange_...@davehart.net writes: If anyone knows of examples of non-recursive Makefile implementations that manage to preserve the recursive make property of being able to make in a subdir to make a subset, please share so we can learn from their pioneering. Could you just put a (probably constant) stub Makefile in each subdirectory, which simply invokes the top-level Makefile with an appropriate parameter to indicate which subdirectory it came from...? -Miles -- Religion, n. A daughter of Hope and Fear, explaining to Ignorance the nature of the Unknowable.
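One possible shape for such a stub, assuming GNU make and a top-level non-recursive Makefile whose per-directory targets are named with their subdirectory prefix (both of these are assumptions, not something automake provides):

```
## src/Makefile (sketch): a constant stub that forwards every goal to
## the top-level build, qualified with this subdirectory's name.
all:
	$(MAKE) -C .. src/all

## catch-all: `make clean' here becomes `make -C .. src/clean'
.DEFAULT:
	$(MAKE) -C .. src/$@
```

Since the stub never changes, it could be installed into each subdirectory once and forgotten.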
Re: Manual merges.
On my own real projects (the ones with real users), I view the version control logs as messages for active developers and ChangeLog as messages for users and occasional developers. So git sees small one-off messages on a regular basis, and the ChangeLog is updated when big user-visible changes are made. Then I review the git logs before each release candidate and edit the ChangeLog to make sure it has a good summary of all user-visible changes. Note that in the traditional GNU source structure, that role (what you're using ChangeLog for) is played by NEWS. -Miles -- (\(\ (^.^) ()) *This is the cute bunny virus, please copy this into your sig so it can spread.
Re: GSoC project idea: non-recursive automake project
Harlan Stenn st...@ntp.org writes: Larry McVoy once said something like In theory, theory and practice are the same. But in practice, they are not. Maybe he did say that at some point, but it's a hoary old quote (attributed to Yogi Berra, among others), and certainly didn't originate with Larry... -Miles -- Love is a snowmobile racing across the tundra. Suddenly it flips over, pinning you underneath. At night the ice weasels come. --Nietzsche
Re: PKG_CHECK_MODULES on system without pkg-config installed?
Jef Driesen jefdrie...@hotmail.com writes: I'm aware of the advantage of using pkg-config. I even supply the *.pc files for my own projects. But the point is that if I want to support systems that don't have pkg-config installed (like Mac OS X in my case), I have to provide a fallback with manual detection anyway. So why not skip pkg-config entirely? I don't even try to fully support systems without pkg-config; I basically just want a fallback so I can get some sort of build on them, maybe with some features disabled. So what I do is use PKG_CHECK_MODULES for normally pkg-config'd packages. It's very concise and easy to use (one line per library, normally). On systems without pkg-config, this will just fail as if the package wasn't installed at all. For a few of the most critical libraries, I use both PKG_CHECK_MODULES (it yields better results when available, is almost trivial, and the support infrastructure is present for other packages anyway), _and_ more traditional probing code as a backup (but see below why it's annoying). And I could simply replace the pkg-config based check with something like: Even that -- rather verbose -- code often isn't enough, because install locations can vary between systems, in a manner that varies from package to package. In such cases, you can either require the user to specify the location with a configure option (and I want to avoid requiring user input if at all possible), or you can add a loop to guess various common configurations (which makes the configure code somewhat more complex as well as more likely to yield incorrect results). [Note, all of this results from trying to compile on systems where something didn't work!] -Miles -- Faith, n. Belief without evidence in what is told by one who speaks without knowledge, of things without parallel.
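A minimal sketch of the "both methods" approach for a single library; the names libfoo, foo.h, and foo_init are placeholders I've made up, not anything from the original message:

```m4
dnl Prefer pkg-config; if it fails (including when pkg-config itself
dnl is absent, so PKG_CHECK_MODULES reports "no"), fall back to
dnl traditional header/library probing.
PKG_CHECK_MODULES([LIBFOO], [libfoo >= 1.0],
  [have_libfoo=yes],
  [AC_CHECK_HEADER([foo.h],
     [AC_CHECK_LIB([foo], [foo_init],
        [have_libfoo=yes
         LIBFOO_LIBS=-lfoo],
        [have_libfoo=no])],
     [have_libfoo=no])])
AM_CONDITIONAL([HAVE_LIBFOO], [test "$have_libfoo" = yes])
```

The pkg-config branch costs one macro call; all the bloat lives in the fallback, which is exactly the point made above.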
Re: PKG_CHECK_MODULES on system without pkg-config installed?
Miles Bader mi...@gnu.org writes: I don't even try to fully support systems without pkg-config, I basically just want a fallback so I can get some sort of build on them, maybe with some features disabled. I should note that although this is my tactic, it's not actually a very important point. For the rest of the reasons mentioned, _even if_ one wants to fully support systems without pkg-config (so one has to add the manual probing code), it's still a good idea to just support both methods: supporting pkg-config doesn't add much complexity to the configure.ac file (the bloat is the manual probing code), and for systems that _can_ benefit from it, it yields superior results. -Miles -- Ocean, n. A body of water covering seven-tenths of a world designed for Man - who has no gills.
Re: PKG_CHECK_MODULES on system without pkg-config installed?
Roger Leigh rle...@codelibre.net writes: This is not meant to sound like a troll, but: is anyone really *really* using static linking in 2011? Sure; it's very useful for specialized libraries that won't be widely used enough to merit the effort to build and install as dynamic libraries. Using/creating static libraries is, after all, _very_ simple (dynamic libraries, well ... not so much). -Miles -- Quotation, n. The act of repeating erroneously the words of another. The words erroneously repeated.
Re: PKG_CHECK_MODULES on system without pkg-config installed?
Jef Driesen jefdrie...@hotmail.com writes: Isn't it easier to just check for the presence of the header file and/or the library file, avoiding pkg-config entirely? Well, I'd prefer not to guess when possible, and not using pkg-config for a package that wants you to use it means you end up guessing at the proper locations/settings. Maybe this often works, but why risk it when it's not necessary? -miles -- Selfish, adj. Devoid of consideration for the selfishness of others.
Re: PKG_CHECK_MODULES on system without pkg-config installed?
Jef Driesen jefdrie...@hotmail.com writes: is pkg.m4 in /usr/share/aclocal ? No. I suppose that file is only present if pkg-config is installed? I'm trying to build on Mac OS X in case that would matter. I use some hacks to make the resulting configure script work even if autoconf can't find pkg.m4. Basically I just use m4_define_default to define stub versions of the PKG_... macros:

  # Check for pkg-config program, used for configuring some libraries.
  #
  m4_define_default([PKG_PROG_PKG_CONFIG],
    [AC_MSG_CHECKING([pkg-config])
     AC_MSG_RESULT([no])])
  PKG_PROG_PKG_CONFIG

  # If the pkg-config autoconf support isn't installed, define its
  # autoconf macro to disable any packages depending on it.
  #
  m4_define_default([PKG_CHECK_MODULES],
    [AC_MSG_CHECKING([$1])
     AC_MSG_RESULT([no])
     $4])

  ... etc

  PKG_CHECK_MODULES([libpng], [libpng], [have_libpng=yes], [:])

  ... etc

-Miles -- Accordion, n. An instrument in harmony with the sentiments of an assassin.
Re: debbugs, and a FAQ, for Autotools
Ralf Hemmecke hemme...@gmail.com writes: Is there actually a good reason, why the autotools are distributed as separate packages (autoconf, automake, libtool, m4)? (Maybe even pkg-config, but I still don't yet know exactly whether it is good for me.) Hmm, why not? Isn't it good general practice to split up packages where the coupling is fairly loose? Many people use autoconf without automake, and the latter has some significant extra dependencies. Also, of course, libtool is extremely optional (and for those that do use it, I imagine it would work well enough without the rest of the autotools). -miles -- Egotist, n. A person of low taste, more interested in himself than in me.
Re: debbugs, and a FAQ, for Autotools
Ralf Hemmecke hemme...@gmail.com writes: Sure. But it is also relevant if one developer adds a macro which is only available in some recent version of automake, say. Another developer might not yet have that automake version. It doesn't really seem any worse than _any_ potential tool incompatibility problem -- compiler version, library version, etc -- though... Usually those issues aren't such a huge deal, because most project try to be relatively portable, and when version dependencies do crop up, they can be dealt with relatively well using simple checks in the configure script. Isn't that what people usually do about autoconf versions too (declare a minimum version in configure.ac)? -Miles -- Rational, adj. Devoid of all delusions save those of observation, experience and reflection.
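For reference, the usual way to declare those minimum versions in configure.ac; the version numbers here are arbitrary examples, not recommendations:

```m4
AC_PREREQ([2.69])                 dnl abort early if autoconf is too old
AM_INIT_AUTOMAKE([1.11 foreign])  dnl the first argument may include a
                                  dnl minimum automake version along
                                  dnl with other options
```

With these in place, a developer running an older tool gets an immediate, explicit error instead of a mysterious failure later in the build.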
Re: [CRAZY PROPOSAL] Automake should support only GNU make
hmm, is the tupfile syntax really as horrible as it looks? [ from the examples page: : hello.c | gcc -Wall -c hello.c -o hello.o | hello.o ] -miles -- Ich bin ein Virus. Mach' mit und kopiere mich in Deine .signature.
Re: [CRAZY PROPOSAL] Automake should support only GNU make
Xochitl Lunde xochitl_lu...@tripplite.com writes: I see that I can get the source, but I don't want to have to compile this thing if it's not purely script based. AFAIK, quagmire requires nothing other than GNU make (that was, I guess, the point of it)... Also when I go to quagmire-discuss, there are a bunch of nasty topics that are over 1 month old; nobody's moderating. It's not in active development, but I think it showed some promise when it was, and might be a good base or source of ideas for future work if somebody has the interest. -Miles -- The trouble with most people is that they think with their hopes or fears or wishes rather than with their minds. -- Will Durant
Re: High-Precision NFS Timestamps
Bob Friesenhahn bfrie...@simple.dallas.tx.us writes: Usually the solution to this is to install and run ntp (Network Time Protocol, as offered by 'xntp') on the machines on your network. Is that really a solution? Running ntp makes it a lot more _likely_ that machines will appear to be synchronized to a high degree of precision, but doesn't seem a guarantee... -Miles -- Accord, n. Harmony.
Re: Any way to get rid of -MP parameter to gcc for dependency creation?
On Fri, Jan 7, 2011 at 8:18 AM, Xan Lopez x...@gnome.org wrote: I haven't tested it personally, but I can ask. What I know is that Chromium uses gyp, which on Linux generates Makefiles, and they claim their null-build time is pretty much zero (not sure on which machine, though, so perhaps that's only the case for some huge iron). So it would seem that whatever our problems GNU make shouldn't be one of them. What I can tell you is that our autotools setup is known for being the slowest of the lot :/ (vs Xcode, VisualStudio, CMake, Gyp, ...) Do they actually do the same thing? -miles -- Cat is power. Cat is peace.
Re: Any way to get rid of -MP parameter to gcc for dependency creation?
On Fri, Jan 7, 2011 at 8:34 AM, Xan Lopez x...@gnome.org wrote: Do they actually do the same thing? Yes, we all build WebKit + some testing tools. The set of files we build is not identical, since it changes by port, but the difference is negligible since most of the files are platform-independent. Of course perhaps we have written our stuff extremely poorly and they haven't, but since I have written part of ours I'm not the best person to comment on that (from the kind of things we need to do to improve it, it does not seem to be the case, though). What I meant was, do they all do the same thing _in detail_ -- for instance, if one tracks system header dependencies and the other doesn't, then the latter will most likely be faster, but will have reduced functionality. [Your investigation into the effects of -MP points out one such detail area where build tools may well differ.] If one tool is faster than another, one should of course weigh that against the differences in functionality. Ralf mentioned that some of the inefficiency came from build rules intended to do automatic Makefile regeneration; do the other tools do that? Also, some of the inefficiencies in automake-generated Makefiles come from the attempt to be very portable, both in the set of tools required to do a build (only make, sed, sh, etc., vs. the requirement for specific specialized tools to be installed just for building), and in the versions of those tools (e.g. any vaguely standard make vs. GNU make only). If another tool has additional requirements for building, that also is a factor to be weighed against speed. -Miles -- Cat is power. Cat is peace.
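For readers wondering what the -MP detail refers to: gcc's -MP option adds an empty rule for every header mentioned in the generated dependency file, so that make doesn't fail when a header is deleted or renamed. A sketch of the resulting .d fragment, with made-up file names:

```makefile
# Roughly what "gcc -MMD -MP -c foo.c" leaves in foo.d:
foo.o: foo.c foo.h bar.h

# The extra empty rules added by -MP.  They keep make from dying with
# "No rule to make target 'foo.h'" if a header is later removed -- at
# the cost of one extra rule per header for make to parse, which adds
# up across a large tree.
foo.h:
bar.h:
```

Dropping -MP shrinks the dependency files considerably, but reintroduces the stale-header failure mode the option was designed to avoid.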
Re: Any way to get rid of -MP parameter to gcc for dependency creation?
On Tue, Jan 4, 2011 at 10:22 AM, Xan Lopez x...@gnome.org wrote: Do you know _what_ is taking so long? I mean, disk I/O (stats on a cold disk cache), user CPU time (inefficient algos in make), system CPU time (stats on a warm disk cache)...? Are you talking about the time to process those empty rules that are gone if you get rid of -MP or of the remaining 20 seconds? Both, I suppose. I imagine both have the same inefficiency (just more in the -MP case)... -miles -- Cat is power. Cat is peace.
Re: reword documentation about symbol stripping
John Calcote john.calc...@gmail.com writes: You need to remember the original target audience of GNU software was a group of people that wanted to share free software. Most of them were students or researchers that generally built software distributed in source form. ... That being the case, users were programmers, and programmers are indeed helpless without debug symbols during a crash - that is, unless you're one of those rare types that loves to dig into a good assembly debug session. I think the basic goal is still quite valid though -- the point is not to _require_ that users be programmers, or even to _expect_ them to be, but to _enable_ them to go as far as they want to go. So in cases where some extra power can be granted to the user at little cost, it's good to do so, even if many users never use that power. The notion of system utilities/libraries/etc. as magic you're not supposed to touch or look at, even if you want to, has been a problem for decades, though the lines have shifted. -Miles -- Politeness, n. The most acceptable hypocrisy.
Re: reword documentation about symbol stripping
k...@freefriends.org (Karl Berry) writes: I personally would not have written it that way in the first place, but given that it is there now, I don't want to simply replace it with bland text, or occupy rms's time with it, either. Yeah, I think there's nothing particularly offensive about that text, but maybe it could be more explanatory -- it addresses an issue that some people may not know about, so maybe it would be good to briefly explain further? -Miles -- Suppose He doesn't give a shit? Suppose there is a God but He just doesn't give a shit? [George Carlin]
Re: default -g ??!?
MK halfcountp...@intergate.com writes: Ah, it's because of GNU make: No it's not. By default, the Make rules should compile and link with -g, so that executable programs have debugging symbols. Users who don't mind being helpless can strip the executables later if they wish. Nice, flexible software it ain't. That isn't anything GNU make does; it's a _recommendation_ for Makefile writers. Automake accordingly follows that recommendation, since it's a higher-level tool than make, and tries to provide sensible defaults (whereas GNU make has no default compiler options). -Miles -- Christian, n. One who follows the teachings of Christ so long as they are not inconsistent with a life of sin.
Re: default -g ??!?
MK halfcountp...@intergate.com writes: If you say so, then I guess I am imagining things ;) I have never given the issue much thought until now, I suppose I need to do a bit more research on the issue. Indeed, it's often a good idea to do the research _before_ posting flames and rants... -miles -- ((lambda (x) (list x x)) (lambda (x) (list x x)))
Re: AM_V_GEN - better docs
Patrick Rutkowski rutsk...@gmail.com writes: I don't get from that page how to apply to all my $(CC) build commands, and I really want to quiet down this very messy make output I now have. When silent-rules is enabled, you don't need to do anything special in Makefile.am for normal commands (those commands which get invoked via standard make rules), only those commands you invoke explicitly. For commands which you invoke explicitly, you can just add $(AM_V_GEN) before the command name, e.g., change:

  foo.barf: foo.blarghhh
	barfify -o $@ $<

to:

  foo.barf: foo.blarghhh
	$(AM_V_GEN)barfify -o $@ $<

That just prints GEN foo.barf when building foo.barf; to print something more target-specific, you can follow the pkg_... example in the doc, e.g.:

  barf_verbose = $(barf_verbose_$(V))
  barf_verbose_ = $(barf_verbose_$(AM_DEFAULT_VERBOSITY))
  barf_verbose_0 = @echo BARF $@;

  foo.barf: foo.blarghhh
	$(barf_verbose)barfify -o $@ $<

The doc page there doesn't even suggest where to look to learn how to apply it to your build. It does, kinda, but maybe it doesn't go far enough; right now it says: To enable less verbose build rules, both the developer and the user of the package have to take a number of steps. The developer needs to do either of the following: + Add the silent-rules option as argument to AM_INIT_AUTOMAKE. + Call the AM_SILENT_RULES macro from within the configure.ac file. ... but even after you do that, silent-rules mode defaults to off; the user may then turn it on by using the --enable-silent-rules option to configure. Maybe it would be good to specify what should be done to make it default to _on_, e.g.: + To enable silent-rules by default, specify an argument of yes to AM_SILENT_RULES, i.e., add AM_SILENT_RULES([yes]) to the configure.ac file. [I know that because somebody else posted it on this list recently :] -miles -- 1971 pickup truck; will trade for guns
Re: AM_V_GEN - better docs
On Sat, Nov 13, 2010 at 5:19 AM, Stefano Lattarini stefano.lattar...@gmail.com wrote: It might be time to reconsider this decision. If we document it, we should at least advise against it IMHO. ... or just give the pros and cons. That seems likely to be more convincing and less annoying... -miles -- Cat is power. Cat is peace.
Re: Using -MMD instead of -MD for depndency generation
Paul Smith psm...@gnu.org writes: As for why 3.82 is slower, unfortunately I'm having problems figuring it out. I compiled with gprof but the cumulative profiled code in GNU make only took 6 seconds or so, so I suppose the other 24 seconds must be in libc or someplace... but trying to install a profile-enabled version of libc on my system did not succeed (or rather the package install worked but linking with the profiled libc did not work). The perf program distributed with recent linux kernels is extremely useful too -- it's much easier and less intrusive to use than gprof in many cases. [The annoyance is that it's still not packaged in debian for whatever reason... you can fetch the linux sources and just build in the perf directory, but you need to make sure the source versions match the kernel you're using, since the kernel interfaces have changed around at various points.] -Miles -- Suppose He doesn't give a shit? Suppose there is a God but He just doesn't give a shit? [George Carlin]