Re: Building prog first

2010-03-27 Thread Ralf Wildenhues
A late hello,

* Robert J. Hansen wrote on Tue, Mar 23, 2010 at 12:12:46AM CET:
 On 3/22/10 6:50 PM, John Calcote wrote:
  Reuben, you've just hit upon one of the two most significant problems
  with Javadoc and the like (including doxygen, man pages, and info pages):
 
 Agreed -- which is why I think it would be wonderful if there was strong
 Autotools support for literate programming.  Unfortunately, my m4-fu is
 not strong enough to figure out the appropriate macros to discover
 ctangle, cweave, etc.

One of the main issues with ctangle, cweave etc. is that they tend to
have an unpredictable set of output files.  Even just having more than
one output file by default is a bit difficult to map to 'make' semantics
(but it's possible; see 'info Automake "Multiple Outputs"'), but only
knowing the set of output files from looking at the *contents* of the
input files, and possibly the command line options of the rule to
generate the output files, works very much against 'make' and Automake.
I wish Knuth had considered that part a bit more.
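The manual's workaround for the multiple-output problem is the stamp-file
('witness') idiom, so that several outputs hang off one target that 'make'
can track.  A minimal sketch with hypothetical file names, assuming ctangle
writes foo.c and foo.h from foo.w (the full idiom in the manual also adds
recovery for the case where one output was deleted by hand):

```make
# Stamp-file sketch: both outputs depend on one witness, so the tool
# runs only once.  File names are hypothetical; see
# 'info Automake "Multiple Outputs"' for the complete idiom.
foo.c foo.h: foo.stamp
foo.stamp: foo.w
	ctangle foo.w
	touch foo.stamp
```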

That said, if somebody can come up with a decent scheme for how to
process things, that would be idiomatic for typical users of these
file types, I'm open to including rules for these in Automake.  If you
don't know how to translate this into m4 macros or a patch against
automake.in, describe things in a way that lets us understand exactly what
automake is supposed to do.  Typically, that already helps a lot for the
actual implementation.

Cheers,
Ralf




Re: Building prog first

2010-03-24 Thread Steffen Dettmer
On Tue, Mar 23, 2010, Reuben Thomas r...@sc3d.org wrote:
 On 23 March 2010 10:15, Steffen Dettmer wrote:
  * On Mon, Mar 22, 2010, Reuben Thomas r...@sc3d.org wrote:
   * 2010/3/22 Russell Shaw rjs...@netspace.net.au:
[on this indent level, see at the end]
   poor support for installing interpreted languages,
   and also conversely for build-time compiled programs.
 
   Yes, also for coffee-cooking there is poor support only. :-)

 Sure, but autotools is for building programs, not for making coffee.

Yes, but in the same way someone could argue that it is for compiling
or cross-compiling packages, not for
cross-compiling-and-creating-tools-on-the-fly.
You can create tools by putting them in their own package (which IMHO
is the common case; usually you do not include compiler or bison
sources etc. in the package).

What I wanted to say was that there is a way in which autoconf
supports that (having a separate package for the needed tools), so I
would not like to pay the additional complexity for a `shorter' way
(which to me even has a bit of the taste of a hack...).

  I don't think build-time compiled C programs should be
  supported while cross-compiling. I think it is already complex
  enough.  Otherwise you would have to do all checks twice and
  end up with many variables with confusing names, and those who
  are not cross-compiling would probably mix them up by accident.

 On the contrary, this is a very useful feature (why should one
 not be able to build host programs when cross-compiling?)

Yes, coffee-cooking also would be a very useful feature (why
should one not be able to have coffee while waiting for the
cross-compilation process?) :-)
SCNR.

Autoconf supports that. Just make a package for the tool and
install it. I know this is inconvenient in your special case.
Also I don't like overly big package dependencies (it's a pain if
someone must install heaps of packages to get something compiled - if
anyone here disagrees, try an experiment: install a ten-year-old
Linux and install a recent 3D game on it or KDE5 or so :-)).

 for which support in autoconf would simplify developers' life
 (even the ad-hoc support in binutils that I mentioned is pretty
 easy to use).

Yes, I see your point.
But it's complex... How do users specify to use a non-standard
compiler with special flags to compile your helper tool?

I thought of perl, but (A), i don't like slow tools,
 
  (I think Perl is fast)

 Me too; the above assertion was not written by me! You missed
 the author line at the top from the original author of these
 double-quoted comments.

Yes, I know, and the indent level is correct; sorry for not
including the poster's name (I fixed it this time, hopefully
correctly; gmail threading is not that good and in my mutt box I
have already deleted the older messages).
  (I didn't write to you but to the list, and I never ever wanted
  to blame or criticise anyone!)

oki,

Steffen




Re: Building prog first

2010-03-23 Thread Ralf Wildenhues
Hello Reuben,

* Reuben Thomas wrote on Mon, Mar 22, 2010 at 04:44:17PM CET:
 2010/3/22 Russell Shaw:
  Steffen Dettmer wrote:
  * On Sun, Mar 21, 2010 at 10:27 AM, Ralf Wildenhues wrote:
  BTW, execution of built programs like this makes your package unsuitable
  for cross-compilation.  Just so you're aware of that.
 
 Not true.

Huh?  On the level of implementation Russell is working on, that is very
much true.  Please read the whole thread; I didn't say this is
impossible.

 automake does not have explicit support for building
 programs with the host compiler when cross-compiling, but I have done
 this successfully in the past when I needed precisely to build a
 program on the host when cross compiling, using binutils's
 BFD_CC_FOR_BUILD macro. It's a pity some version of this macro isn't
 in autoconf, or even autoconf-archive; I shall do the latter.

Why not go for Autoconf?  That is where it belongs, and yes, it would
be very nice to have better support for this situation integrated.

 This illustrates a weirdness of autotools: poor support for installing
 interpreted languages, and also conversely for build-time compiled
 programs.

Hey, as they say, patches welcome!

Cheers,
Ralf




Re: Building prog first

2010-03-23 Thread Steffen Dettmer
On Mon, Mar 22, 2010 at 4:44 PM, Reuben Thomas r...@sc3d.org wrote:
 Not true. automake does not have explicit support for building
 programs with the host compiler when cross-compiling, but I
 have done this successfully in the past when I needed precisely
 to build a program on the host when cross compiling, using
 binutils's BFD_CC_FOR_BUILD macro. It's a pity some version of
 this macro isn't in autoconf, or even autoconf-archive; I shall
 do the latter.

I guess this is a hack and a burden to maintain.

When cross-compiling, why compile a local tool?
Isn't the correct way to compile the local tool natively,
then use it to cross-compile the package?

 This illustrates a weirdness of autotools: poor support for
 installing interpreted languages, and also conversely for
 build-time compiled programs.

Yes, also for coffee-cooking there is poor support only. :-)

I don't think build-time compiled C programs should be supported
while cross-compiling. I think it is already complex enough.
Otherwise you would have to do all checks twice and end up with
many variables with confusing names, and those who are not
cross-compiling would probably mix them up by accident.

  I thought of perl, but (A), i don't like slow tools,

(I think Perl is fast)

  (C), i find making build-programs
  in C much more concise than scripting and i can debug it in ddd/gdb.

You can debug Perl in DDD.

 This is interesting, as it doesn't match mine or
 commonly-reported experience (translating my build-time
 programs from C to Perl made them shorter, easier to read and
 fix, and no slower to run, although I wasn't doing more than
 grepping 15k lines of C and writing some of it back out again).


$ time perl -e \
  'for($n=0;$n<450000;$n++) { printf "%08d %60s EOL\n", $n, ""; }' > x

real    0m0.713s

$ cat x.c
#include <stdio.h>
int main(void)
{
   int n;
   for(n=0; n<450000; n++) {
      printf("%08d %60s EOL\n", n, "");
   }
   return 0;
}

$ time make x
gcc -Wall -Wmissing-prototypes -fstrict-aliasing -D_GNU_SOURCE -ansi
-ggdb  -ggdb  x.c   -o x
real    0m0.076s

$ time ./x > x2
real    0m0.301s


so 713 ms vs. 377 ms.

Interesting that up to around 100k Perl is even faster:

$ time perl \
  -e 'for($n=0;$n<100000;$n++) { printf "%08d %60s EOL\n", $n, ""; }' > x
real    0m0.167s


$ time make x
real    0m0.081s
$ time ./x > x2
real    0m0.079s


(of course those figures are far away from being exact; they just prove
how fast perl is: same magnitude as C)


:-)

SCNR.


oki,

Steffen




Re: Building prog first

2010-03-23 Thread Russell Shaw

Steffen Dettmer wrote:

On Mon, Mar 22, 2010 at 4:44 PM, Reuben Thomas r...@sc3d.org wrote:

Not true. automake does not have explicit support for building
programs with the host compiler when cross-compiling, but I
have done this successfully in the past when I needed precisely
to build a program on the host when cross compiling, using
binutils's BFD_CC_FOR_BUILD macro. It's a pity some version of
this macro isn't in autoconf, or even autoconf-archive; I shall
do the latter.


I guess this is a hack and a burden to maintain.

When cross-compiling, why compile a local tool?
Isn't the correct way to compile the local tool natively,
then use it to cross-compile the package?


This illustrates a weirdness of autotools: poor support for
installing interpreted languages, and also conversely for
build-time compiled programs.


Yes, also for coffee-cooking there is poor support only. :-)

I don't think build-time compiled C programs should be supported
while cross-compiling. I think it is already complex enough.
Otherwise you would have to do all checks twice and end up with
many variables with confusing names, and those who are not
cross-compiling would probably mix them up by accident.


I thought of perl, but (A), i don't like slow tools,


(I think Perl is fast)


(C), i find making build-programs
in C much more concise than scripting and i can debug it in ddd/gdb.


You can debug Perl in DDD.


This is interesting, as it doesn't match mine or
commonly-reported experience (translating my build-time
programs from C to Perl made them shorter, easier to read and
fix, and no slower to run, although I wasn't doing more than
grepping 15k lines of C and writing some of it back out again).



$ time perl -e \
  'for($n=0;$n<450000;$n++) { printf "%08d %60s EOL\n", $n, ""; }' > x

real    0m0.713s

$ cat x.c
#include <stdio.h>
int main(void)
{
   int n;
   for(n=0; n<450000; n++) {
      printf("%08d %60s EOL\n", n, "");
   }
   return 0;
}

$ time make x
gcc -Wall -Wmissing-prototypes -fstrict-aliasing -D_GNU_SOURCE -ansi
-ggdb  -ggdb  x.c   -o x
real    0m0.076s

$ time ./x > x2
real    0m0.301s


so 713 ms vs. 377 ms.

Interesting that up to around 100k Perl is even faster:

$ time perl \
  -e 'for($n=0;$n<100000;$n++) { printf "%08d %60s EOL\n", $n, ""; }' > x
real    0m0.167s


$ time make x
real    0m0.081s
$ time ./x > x2
real    0m0.079s


(of course those figures are far away from being exact; they just prove
how fast perl is: same magnitude as C)


Hi,
I'd think that printf() in perl would be mapped to the same printf()
in C lib stdio, and because this is the dominant code, the times are
similar.

What i had in mind is the efficiency of regular expression execution
in perl, compared to hard-coded parsing in C.

I will try perl in ddd/gdb some time.




Re: Building prog first

2010-03-23 Thread Steffen Dettmer
(OT)

On Mon, Mar 22, 2010 at 11:50 PM, John Calcote john.calc...@gmail.com wrote:
 Reuben, you've just hit upon one of the two most significant
 problems with Javadoc and the like (including doxygen, man
 pages, and info pages):

sorry, I cannot leave this, because this would be an excuse for
people `but we have to use Javadoc, so we cannot document well',
which is not true (you said this in your point 2, but I have to
stress it :-)).

It is not a problem of the tools, but of the documentation.
When the main pages of Javadoc or Doxygen documentation are well
written, give a good introduction, include examples, and reference
important functions, which in turn include example code (often
telling more than 1000 words :-)) and again reference functions
often needed in that context, this can really help a lot.

I think:

1) Someone has to know (learn) the API before starting to use it.
   (read documentation, make examples) If there is no good
   documentation and/or no good examples, it would be great to
   write and contribute :-)


2) Documentation should be written with the target audience in mind.
   Like other software, it must be well structured and easy to read,
   understand, and maintain. Usually it must evolve; the first
   version is always rough.
   Also, it should be tested (e.g. reviewed).

I think the problem that often leads to just having documentation like

/**
 * Uses the passed wizard, which must be a Mage, to do the magic.
 */
doMagic(Mage wizard);

is that people agree that documentation is important but haven't
considered well how to do it best. I'm afraid documentation is often
treated as something `that also has to be done', quickly on the
side, instead of being considered one of the most important parts
of the software (it's easy to fix a bug when the documentation makes
clear how things should be, but it's hard to fix documentation when
the code behaves oddly).

Well, you all know this but I could not resist writing it anyway :)

oki,

Steffen




Re: Building prog first

2010-03-23 Thread Reuben Thomas
On 23 March 2010 06:03, Ralf Wildenhues ralf.wildenh...@gmx.de wrote:
 Hello Reuben,

 * Reuben Thomas wrote on Mon, Mar 22, 2010 at 04:44:17PM CET:
 2010/3/22 Russell Shaw:
  Steffen Dettmer wrote:
  * On Sun, Mar 21, 2010 at 10:27 AM, Ralf Wildenhues wrote:
  BTW, execution of built programs like this makes your package unsuitable
  for cross-compilation.  Just so you're aware of that.

 Not true.

 Huh?  On the level of implementation Russell is working on, that is very
 much true.  Please read the whole thread; I didn't say this is
 impossible.

I guess it comes down to what you meant by `like this'. I re-read the
thread and it seems that Russell is trying to do a very similar thing
to what I used to do in GNU Zile, so I'm not sure what about it makes
cross-compilation fail, except some small details that he could fix.

 Why not go for Autoconf?  That is where it should belong, and yes,
 better support for this situation would be very nice to have integrated
 better.

Mostly lack of motivation: this is not a feature I use at the moment.
I will do the cleanup required to get the code into autoconf-archive,
and someone with more motivation will hopefully promote it later.

 Hey, as they say, patches welcome!

When I have finished rewriting GNU Zile in Lua, I suspect I will want
to help with this.

-- 
http://rrt.sc3d.org




Re: Building prog first

2010-03-23 Thread Reuben Thomas
On 23 March 2010 10:15, Steffen Dettmer steffen.dett...@googlemail.com wrote:
 This illustrates a weirdness of autotools: poor support for
 installing interpreted languages, and also conversely for
 build-time compiled programs.

 Yes, also for coffee-cooking there is poor support only. :-)

Sure, but autotools is for building programs, not for making coffee.

 I don't think build-time compiled C programs should be supported
 while cross-compiling. I think it is already complex enough.
 Otherwise you would have to do all checks twice and end up with
 many variables with confusing names, and those who are not
 cross-compiling would probably mix them up by accident.

On the contrary, this is a very useful feature (why should one not be
able to build host programs when cross-compiling?) for which support
in autoconf would simplify developers' life (even the ad-hoc support
in binutils that I mentioned is pretty easy to use).

  I thought of perl, but (A), i don't like slow tools,

 (I think Perl is fast)

Me too; the above assertion was not written by me! You missed the
author line at the top from the original author of these double-quoted
comments.

-- 
http://rrt.sc3d.org




Re: Building prog first

2010-03-23 Thread Alfred M. Szmidt

   2010/3/22 Alfred M. Szmidt a...@gnu.org:
If searching is the problem

   *Web* searching is the answer, not the problem.

It isn't when you are not connected to a network.

how do the indices not fix the problem?

   I rarely find anything useful in the indices other than particular
   functions or variables. Rarely, in GNU manuals, concepts, but that is
   because they do not, on the whole, have good general indices.

Do you have a list of such manuals? Would you like to report this to
the relevant maintainers?  One cannot improve what one does not know
about.

What about using an info browser to search through the manual?

   I often do that. The trouble is that often what I want to know has to
   be deduced from the manual, which is natural enough, because the
   manual tends to be structured according to the structure of the
   program it documents, rather than of the problems the user is trying
   to solve. By using web searches I can often find people asking and
   answering precisely the problem I'm trying to solve.

Would you like to suggest a better structuring for some manuals?




Re: Building prog first

2010-03-23 Thread Reuben Thomas
On 23 March 2010 17:15, Alfred M. Szmidt a...@gnu.org wrote:

   2010/3/22 Alfred M. Szmidt a...@gnu.org:
    If searching is the problem

   *Web* searching is the answer, not the problem.

 It isn't when you are not connected to a network.

I usually wait until I am; it often takes me rather longer to answer
questions by simply reading the manuals.

   I rarely find anything useful in the indices other than particular
   functions or variables. Rarely, in GNU manuals, concepts, but that is
   because they do not, on the whole, have good general indices.

 Do you have a list of such manuals?

No, but it's most of the manuals I've looked at.

 Would you like to report this to the relevant maintainers?

No, for several reasons:

1. It's fairly obvious that the indices are in general poor (in
common, I should add, with those of most books ever printed).

2. Such a general feature request (it's not really a bug report) is
not the sort of thing I usually find useful as a maintainer. It's more
useful to notice that I see the matter discussed several times.

3. In practice, I'm really not sure that it's the best use of
maintainers' time: as I say, I can generally solve these problems by
doing web searches, or if not, then posting a question to a mailing
list which hopefully generates a good answer that then becomes
searchable. I think spreading internet access to those who don't have
it is a much more important goal than writing manuals that answer
every question, and further, having better indices would only help
slightly.

 One cannot improve what one does not know about.

True. But this problem is an endemic one, so efforts like Project
Mallard, which aims to improve all GNOME programs, at least, are more
useful than bug reports to specific projects.

 Would you like to suggest a better structuring for some manuals?

No: as I've already indicated, I simply don't know enough. In particular:

1. I don't know how to improve the structuring of manuals to answer
these questions better.

2. It's unclear to me that changing the structure of manuals would help much.

3. It is almost certain that changing the structure of manuals would
make them less useful for other sorts of use, for example, for users
wishing to learn about a system comprehensively, or those who wish for
a technical reference.

In conclusion:

a. With web search, this problem is not so bad currently.

b. To improve the way documentation is written will require a great
deal of research and experimentation. While individual GNU maintainers
who feel strongly about that may wish to do this for their particular
packages, it seems unwise to me to encourage all maintainers to do
`it' when it is unclear what `it' is. Until there is a sense of
emerging consensus and best practice, sticking with the status quo
seems far better to me: GNU manuals are frequently high-quality
manuals of what one might call the `classic' kind, and by imitating
the best of them one will do far better than by trying to guess what
something `better but different' might be like.

It is possible that I gave the wrong impression about how
serious the problem is (even for those without internet access,
careful reading and searching of an Info manual will usually find one
the answer eventually), and/or that I gave the impression that I know
how to fix the problem (I have only the vaguest idea). I'm sorry in
either/both cases.

-- 
http://rrt.sc3d.org




Re: Building prog first

2010-03-23 Thread Alfred M. Szmidt
You say that the manuals are poor and that it is obvious, but I cannot
figure out from your explanation how they are poor.  I've looked at a
few manuals, glibc, emacs, coreutils, autoconf, and m4, and all of
them have good indices, are organised cleanly, etc.

Can you mention one or two manuals, and which part of those manuals
you find to be inadequate?

You mention that web access improves the manuals; how does it do that
exactly?  If you do a web search, then you will invariably end up at
the manual, no?

If our manuals are not read and users think that reporting bugs and
improving them is a waste of our time, then it would be better that
we remove them, since keeping them updated takes a lot of time, more
so than actually improving our programs.  But users clearly need
manuals, as your experience shows, and a bad manual is just as much a
bug as anything else in our programs.

Please don't think that improving a manual is any less of an
improvement than adding a very useful feature.




Re: Building prog first

2010-03-23 Thread Reuben Thomas
On 23 March 2010 18:12, Alfred M. Szmidt a...@gnu.org wrote:
 You say that the manuals are poor

I said that the indices are poor, specifically at indexing concepts
rather than just keywords, function names &c., in general. I also said
that the manuals in general are excellent.

 and that it is obvious, but I cannot
 figure out from your explanation how they are poor.  I've looked at a
 few manuals, glibc, emacs, coreutils, autoconf, and m4, and all of
 them have good indices, are organised cleanly, etc.

To understand what I mean by a good index, have a look at a book on
indexing, or for a more personal take, along with an exemplar, Douglas
Hofstadter's Gödel, Escher, Bach.

 Can you mention one or two manuals, and which part of those manuals
 you find to be inadequate?

The parts I find inadequate are the indices (as I have said repeatedly).

I have already cited the indices of the autotools manuals, e.g. those
of the autoconf and automake manuals. I've just had another look at
them: they have lists of functions, environment variables &c., and each
has a `general' or `concept' index, which lists the above plus, as far
as I can see, a mixture of section headings and the sort of entries
that one might put into a glossary, and not the sort of headings that
bring out the structure of the subject of the manuals. I also
mentioned Emacs's manual, but I see on further investigation that it
doesn't (at least in my version, 23.1) have an index.

 You mention that web access improves the manuals, how do they do that
 exactly?

They take me to answers to specific questions.

  If you do a web search, then you will invariably end up at
 the manual, no?

No, normally I end up on a web page or in a mailing list message.

 If our manuals are not read and users think that reporting bugs,
 improving them, is a waste of our time, then it would be better that
 we remove them, since keeping them updated takes a lot of time, more so
 than actually improving our programs.  But users clearly need manuals,
 as from your experience, and a bad manual is just as much a bug as
 anything else in our programs.

I think we're in furious agreement here.

 Please don't think that improving a manual is any less of an
 improvement than adding a very useful feature.

And again!

-- 
http://rrt.sc3d.org




Re: Building prog first

2010-03-22 Thread Steffen Dettmer
* On Sun, Mar 21, 2010 at 10:27 AM, Ralf Wildenhues wrote:
  noinst_PROGRAMS = unimain
  unimain_SOURCES = unimain.c
 
  unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
./unimain$(EXEEXT) $< > $@

 BTW, execution of built programs like this makes your package unsuitable
 for cross-compilation.  Just so you're aware of that.

Assuming unidata.tab.c is a C-code table containing the
information from UnicodeData.txt, I think it could be better to
generate it by some shell code (maybe inside the Makefile.am,
saving a tool) or to use perl (for the price of adding perl to
the build dependencies) or, if UnicodeData rarely changes, add
unidata.tab.c to the package and have some `maintainer only'
helper target to build it (with unidata.tab.c as distributed
source file). People who don't care about unidata.tab.c can build
the package even without UnicodeData.txt (if this makes any
sense, I don't know what this is for of course :))
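The shell-code route could look roughly like this awk sketch; the input
field layout and the shape of the generated table are assumptions, since
the real unidata.tab.c format isn't shown in the thread:

```shell
# Hypothetical sample in the UnicodeData.txt layout (code point;name;category).
printf '0041;LATIN CAPITAL LETTER A;Lu\n0061;LATIN SMALL LETTER A;Ll\n' \
  > UnicodeData.sample

# Emit a C table; struct and array names are made up for illustration.
awk -F';' '
  BEGIN { print "/* generated - do not edit */"
          print "struct unichar { unsigned cp; const char *name; };"
          print "const struct unichar unidata[] = {" }
        { printf "  { 0x%s, \"%s\" },\n", $1, $2 }
  END   { print "};" }' UnicodeData.sample > unidata.tab.c

cat unidata.tab.c
```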

oki,

Steffen




Re: Building prog first

2010-03-22 Thread Russell Shaw

Steffen Dettmer wrote:

* On Sun, Mar 21, 2010 at 10:27 AM, Ralf Wildenhues wrote:

noinst_PROGRAMS = unimain
unimain_SOURCES = unimain.c

unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
  ./unimain$(EXEEXT) $< > $@

BTW, execution of built programs like this makes your package unsuitable
for cross-compilation.  Just so you're aware of that.


Assuming unidata.tab.c is a C-code table containing the
information from UnicodeData.txt, I think it could be better to
generate it by some shell code (maybe inside the Makefile.am,
saving a tool) or to use perl (for the price of adding perl to
the build dependencies) or, if UnicodeData rarely changes, add
unidata.tab.c to the package and have some `maintainer only'
helper target to build it (with unidata.tab.c as distributed
source file). People who don't care about unidata.tab.c can build
the package even without UnicodeData.txt (if this makes any
sense, I don't know what this is for of course :))


I thought of perl, but (A), i don't like slow tools, (B), unidata.tab.c
is 5.6MBytes and 450k lines long, (C), i find making build-programs
in C much more concise than scripting and i can debug it in ddd/gdb.

The size of unidata.tab.c precludes distributing it. I could do
more work on compacting it, but i have already done that to a degree.




Re: Building prog first

2010-03-22 Thread Reuben Thomas
2010/3/22 Russell Shaw rjs...@netspace.net.au:
 Steffen Dettmer wrote:

 * On Sun, Mar 21, 2010 at 10:27 AM, Ralf Wildenhues wrote:
 BTW, execution of built programs like this makes your package unsuitable
 for cross-compilation.  Just so you're aware of that.

Not true. automake does not have explicit support for building
programs with the host compiler when cross-compiling, but I have done
this successfully in the past when I needed precisely to build a
program on the host when cross compiling, using binutils's
BFD_CC_FOR_BUILD macro. It's a pity some version of this macro isn't
in autoconf, or even autoconf-archive; I shall do the latter.
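The heart of that macro is small; a hedged configure.ac sketch of the
same idea (the real BFD_CC_FOR_BUILD does more, e.g. it also works out
an executable suffix for the build machine):

```m4
# Choose a C compiler that runs on the build machine.  Users can
# override it on the configure command line: CC_FOR_BUILD=...
AC_ARG_VAR([CC_FOR_BUILD], [C compiler for programs run on the build system])
if test -z "$CC_FOR_BUILD"; then
  if test "$cross_compiling" = yes; then
    CC_FOR_BUILD=cc
  else
    CC_FOR_BUILD=$CC
  fi
fi
```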

This illustrates a weirdness of autotools: poor support for installing
interpreted languages, and also conversely for build-time compiled
programs.


 I though of perl, but (A), i don't like slow tools,

Purlease...

 (B), unidata.tab.c
 is 5.6MBytes and 450k lines long,

and?

 (C), i find making build-programs
 in C much more concise than scripting and i can debug it in ddd/gdb.

This is interesting, as it doesn't match mine or commonly-reported
experience (translating my build-time programs from C to Perl made
them shorter, easier to read and fix, and no slower to run, although I
wasn't doing more than grepping 15k lines of C and writing some of it
back out again).

-- 
http://rrt.sc3d.org




Re: Building prog first

2010-03-22 Thread Alfred M. Szmidt
If searching is the problem, how do the indices not fix it?
What about using an info browser to search through the manual?




Re: Building prog first

2010-03-22 Thread John Calcote

On 3/22/2010 4:34 PM, Reuben Thomas wrote:

 What about using an info browser to search through the manual?

 I often do that. The trouble is that often what I want to know has to
 be deduced from the manual, which is natural enough, because the
 manual tends to be structured according to the structure of the
 program it documents, rather than of the problems the user is trying
 to solve. By using web searches I can often find people asking and
 answering precisely the problem I'm trying to solve.


Reuben, you've just hit upon one of the two most significant problems 
with Javadoc and the like (including doxygen, man pages, and info pages):


1. You have to already know the API to know where to look for help on 
the API because the documentation is structured according to the API, 
rather than according to the top 100 use cases.


2. Most people don't add more than method header comments to their 
source code, which means there's often no concept documentation, just 
method documentation, which is useless to people trying to learn the 
API. This isn't always true. Some projects try hard to add concept docs 
too, but just very few by comparison.


Just a comment.

John




Re: Building prog first

2010-03-22 Thread Robert J. Hansen
On 3/22/10 6:50 PM, John Calcote wrote:
 Reuben, you've just hit upon one of the two most significant problems
 with Javadoc and the like (including doxygen, man pages, and info pages):

Agreed -- which is why I think it would be wonderful if there was strong
Autotools support for literate programming.  Unfortunately, my m4-fu is
not strong enough to figure out the appropriate macros to discover
ctangle, cweave, etc.





Re: Building prog first

2010-03-21 Thread Ralf Wildenhues
Hello Russell,

* Russell Shaw wrote on Sun, Mar 21, 2010 at 07:06:00AM CET:
 I want the unimain program built first, then use it to generate
 unidata.tab.c, which is then compiled and linked into librunicode.la
 
   bin_PROGRAMS = unimain
   unimain_SOURCES = unimain.c

   unidata.tab.c: /usr/share/unicode/UnicodeData.txt
   ./unimain $< > $@

Then you need a dependency from unidata.tab.c on unimain:

unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
./unimain$(EXEEXT) $< > $@

Furthermore, please don't hard-code absolute paths like
  /usr/share/unicode/UnicodeData.txt

in your makefiles.  Make them configurable by configure.  Maybe your
users don't have root rights on their system but have the file installed
below their home somewhere?

Cheers,
Ralf




Re: Building prog first

2010-03-21 Thread Russell Shaw

Ralf Wildenhues wrote:

Hello Russell,

* Russell Shaw wrote on Sun, Mar 21, 2010 at 07:06:00AM CET:

I want the unimain program built first, then use it to generate
unidata.tab.c, which is then compiled and linked into librunicode.la

  bin_PROGRAMS = unimain
  unimain_SOURCES = unimain.c



  unidata.tab.c: /usr/share/unicode/UnicodeData.txt
  ./unimain $< > $@


Then you need a dependency from unidata.tab.c on unimain:

unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
./unimain$(EXEEXT) $< > $@

Furthermore, please don't hard-code absolute paths like
  /usr/share/unicode/UnicodeData.txt

in your makefiles.  Make them configurable by configure.  Maybe your
users don't have root rights on their system but have the file installed
below their home somewhere?


Ok.
I did: AC_CHECK_FILE([/usr/share/unicode/UnicodeData.txt], [], [])

In configure, i get:

  if test x$ac_cv_file__usr_share_unicode_UnicodeData_txt = xyes; then :
  fi

Shouldn't this be:

  if test "x$ac_cv_file__usr_share_unicode_UnicodeData_txt" = xyes; then :
  fi




Re: Building prog first

2010-03-21 Thread Alfred M. Szmidt
   However, make install installs unimain into /usr/local/bin

Please refer to the manual, it documents how to do that, and more.
You can try the chapter `(automake) Fine-grained Distribution
Control'.





Re: Building prog first

2010-03-21 Thread Ralf Wildenhues
* Russell Shaw wrote on Sun, Mar 21, 2010 at 09:26:44AM CET:
 Ralf Wildenhues wrote:
 * Russell Shaw wrote on Sun, Mar 21, 2010 at 07:06:00AM CET:
   bin_PROGRAMS = unimain
   unimain_SOURCES = unimain.c

 unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
 ./unimain$(EXEEXT) $< > $@
 
 Ok, that works thanks:)
 
 However, make install installs unimain into /usr/local/bin
 
 How do i stop this program from being installed?

Use noinst_PROGRAMS instead of bin_PROGRAMS.  Be encouraged to read the
fine manual.

BTW, execution of built programs like this makes your package unsuitable
for cross-compilation.  Just so you're aware of that.

Cheers,
Ralf




Re: Building prog first

2010-03-21 Thread Ralf Wildenhues
* Russell Shaw wrote on Sun, Mar 21, 2010 at 08:16:03AM CET:
 Ralf Wildenhues wrote:
 Furthermore, please don't hard-code absolute paths like
   /usr/share/unicode/UnicodeData.txt
 
 in your makefiles.  Make them configurable by configure.  Maybe your
 users don't have root rights on their system but have the file installed
 below their home somewhere?
 
 Ok.
 I did: AC_CHECK_FILE([/usr/share/unicode/UnicodeData.txt], [], [])
 
 In configure, i get:
 
   if test x$ac_cv_file__usr_share_unicode_UnicodeData_txt = xyes; then :
   fi
 
 Shouldn't this be:
 
   if test x$ac_cv_file__usr_share_unicode_UnicodeData_txt = xyes; then :
   fi

First off, no, the two are completely equivalent.  This might not be
clear from a tutorial about shell quoting, but hey, shell quoting isn't
easy.
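
A quick shell sketch (illustrative, not from the thread) of why the quoting
makes no difference for this particular comparison; the `x` prefix already
keeps `test` well-formed:

```shell
#!/bin/sh
# Hypothetical variable standing in for the configure cache variable.
ac_cv_file=yes

# Quoted and unquoted expansions compare the same token here.
if test "x$ac_cv_file" = xyes; then echo quoted-match; fi
if test x$ac_cv_file = xyes; then echo unquoted-match; fi

# The x prefix keeps the test valid even when the value is empty.
ac_cv_file=
test x$ac_cv_file = x && echo empty-ok
```

Note the unquoted form would still break if the value contained whitespace
or glob characters, which is why quoting is the safer habit in general; for
a cache variable known to be `yes` or empty, the two forms behave the same.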

Then, AC_CHECK_FILE doesn't really help your user (and it kills cross
compilation, too).  If you really want to make this configurable, then
provide a switch or command line variables like
  --enable-unicode-file=location

and if that is not given, search for a few known places where this can
be.  For example, on this system, there exists a file with this name in
  /usr/share/perl/5.8.8/unicore

but I cannot tell you if it has the contents you might expect.
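
A configure.ac sketch of that suggestion (the option name --with-unicode-file
and the variable UNICODE_DATA are assumptions, not from the thread):

```m4
dnl Let the user point at UnicodeData.txt, else probe known locations.
AC_ARG_WITH([unicode-file],
  [AS_HELP_STRING([--with-unicode-file=FILE],
    [location of UnicodeData.txt])],
  [UNICODE_DATA=$withval],
  [UNICODE_DATA=
   for f in /usr/share/unicode/UnicodeData.txt \
            /usr/share/perl/5.8.8/unicore/UnicodeData.txt; do
     test -f "$f" && { UNICODE_DATA=$f; break; }
   done
   test -n "$UNICODE_DATA" || \
     AC_MSG_ERROR([UnicodeData.txt not found; use --with-unicode-file])])
AC_SUBST([UNICODE_DATA])
```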

Cheers,
Ralf




Re: Building prog first

2010-03-21 Thread Russell Shaw

Ralf Wildenhues wrote:

* Russell Shaw wrote on Sun, Mar 21, 2010 at 09:26:44AM CET:

Ralf Wildenhues wrote:

* Russell Shaw wrote on Sun, Mar 21, 2010 at 07:06:00AM CET:

 bin_PROGRAMS = unimain
 unimain_SOURCES = unimain.c



unidata.tab.c: unimain$(EXEEXT) /usr/share/unicode/UnicodeData.txt
   ./unimain$(EXEEXT) $< > $@

Ok, that works thanks:)

However, make install installs unimain into /usr/local/bin

How do i stop this program from being installed?


Use noinst_PROGRAMS instead of bin_PROGRAMS.  Be encouraged to read the
fine manual.


But it is somewhat big, and i had already searched through the online
one a lot first. It is no wonder it takes noobs so long to get productive.


BTW, execution of built programs like this makes your package unsuitable
for cross-compilation.  Just so you're aware of that.


Ok. I assume then that you can't build the tool for the host system while
the generated files are compiled for the target system?




Re: Building prog first

2010-03-21 Thread Ralf Wildenhues
* Russell Shaw wrote on Sun, Mar 21, 2010 at 11:18:03AM CET:
 Ralf Wildenhues wrote:
 Use noinst_PROGRAMS instead of bin_PROGRAMS.  Be encouraged to read the
 fine manual.
 
 But it is somewhat big, and i had already searched through the online
 one a lot first. It is no wonder it takes noobs so long to get productive.

I'm not sure how to help that.  If more documentation makes people less
likely to look at it, then what would you suggest in order to improve
upon the situation?  Is the documentation not structured well enough?
Does the Autotools Introduction chapter in the Automake manual not
help get a basic grasp?

I agree that the initial learning steps may not be easy for Automake,
but I don't see how to make Automake a lot easier without also ripping
out much of the functionality.  So turning that knob is fairly unlikely.

 BTW, execution of built programs like this makes your package unsuitable
 for cross-compilation.  Just so you're aware of that.
 
 Ok. I assume then that you can't build the tool for the host system while
 the generated files are compiled for the target system?

First off, allow me to clarify the nomenclature as it is used in GNU
software:
- the build system is the one you run configure and 'make all' on,
- the host system is the one on which the programs that 'make all' normally
  generates and installs will run later,
- the target system does not exist.  Never.  Unless your package happens
  to be a compiler or linker (or similar).  Then, the target system is
  the one for which your installed compiler/linker will generate code
  for.

With that, your sentence above should have been
  Ok. I assume then that you can't build the tool for the build system while
  the generated files are compiled for the host system?

Not straight-forwardly, no.  Once you've got your basic package working
and cross compilation support is the only thing missing, please come
back and we'll explain.
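
For reference, one common pattern is to compile such a generator with the
build-system compiler. This sketch relies on AX_PROG_CC_FOR_BUILD from the
Autoconf Archive, whose conventions the variable names below follow; it is
one possible approach, not the only one:

```m4
dnl configure.ac fragment
AX_PROG_CC_FOR_BUILD
```

```makefile
# Makefile.am fragment: a hand-written rule so the helper is compiled
# with the build compiler rather than the host compiler.
unimain$(BUILD_EXEEXT): $(srcdir)/unimain.c
	$(CC_FOR_BUILD) $(CPPFLAGS_FOR_BUILD) $(CFLAGS_FOR_BUILD) \
	  -o $@ $(srcdir)/unimain.c
```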

Cheers,
Ralf




Re: Building prog first

2010-03-21 Thread Russell Shaw

Ralf Wildenhues wrote:

* Russell Shaw wrote on Sun, Mar 21, 2010 at 11:18:03AM CET:

Ralf Wildenhues wrote:

Use noinst_PROGRAMS instead of bin_PROGRAMS.  Be encouraged to read the
fine manual.

But it is somewhat big, and i had already searched through the online
one a lot first. It is no wonder it takes noobs so long to get productive.


I'm not sure how to help that.  If more documentation makes people less
likely to look at it, then what would you suggest in order to improve
upon the situation?  Is the documentation not structured well enough?
Does the Autotools Introduction chapter in the Automake manual not
help get a basic grasp?

I agree that the initial learning steps may not be easy for Automake,
but I don't see how to make Automake a lot easier without also ripping
out much of the functionality.  So turning that knob is fairly unlikely.


Hi,
I was limping along for years learning autoconf/make in bits until this
tutorial came out

  Autotools: a practitioner's guide to Autoconf, Automake and Libtool

http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool

I realized a lot of useful things after that. The main thing that makes
it easy is that a real project is stepped through with lots of side discussions,
and high-level overviews put things into perspective. I'd really like to have
a hard-copy book of that tutorial.

After that, i could understand the autoconf manual. I was on dos/windows
up to nearly the year 2000 or so, so i had to learn unix programming, shell
programming, makefile programming, m4, how unix processes work etc.,
to be able to look in the generated Makefiles and configure script and see from
that what errors i was making in configure.ac and Makefile.am.
Learning too many things simultaneously, but i know now.


BTW, execution of built programs like this makes your package unsuitable
for cross-compilation.  Just so you're aware of that.

Ok. I assume then that you can't build the tool for the host system while
the generated files are compiled for the target system?


First off, allow me to clarify the nomenclature as it is used in GNU
software:
- the build system is the one you run configure and 'make all' on,
- the host system is the one on which the programs that 'make all' normally
  generates and installs will run later,
- the target system does not exist.  Never.  Unless your package happens
  to be a compiler or linker (or similar).  Then, the target system is
  the one for which your installed compiler/linker will generate code
  for.

With that, your sentence above should have been
  Ok. I assume then that you can't build the tool for the build system while
  the generated files are compiled for the host system?

Not straight-forwardly, no.  Once you've got your basic package working
and cross compilation support is the only thing missing, please come
back and we'll explain.


Ok. Thanks for the help.

--
regards, Russell




Re: Building prog first

2010-03-21 Thread Alfred M. Szmidt
Have you tried reading `(automake) Autotools Introduction'? It is part
of the automake manual.




Re: Building prog first

2010-03-21 Thread Russell Shaw

Alfred M. Szmidt wrote:

Have you tried reading `(automake) Autotools Introduction'? It is part
of the automake manual.


Hi,
I printed out all the autotools manuals and have read every page of
them more than once. It was a while ago, so it's easy to forget things.
Searching the online manual isn't always successful, but i've
figured out a fair bit of it now.




Re: Building prog first

2010-03-21 Thread John Calcote

Hi Russell,

On 3/21/2010 6:14 AM, Russell Shaw wrote:

I was limping along for years learning autoconf/make in bits until this
tutorial came out

  Autotools: a practitioner's guide to Autoconf, Automake and Libtool

http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool 



I realized a lot of useful things after that. The main thing that makes
it easy is that a real project is stepped through with lots of side
discussions, and high-level overviews put things into perspective. I'd
really like to have a hard-copy book of that tutorial.


Thanks very much for the positive feedback. A much enhanced (and 
somewhat corrected) version of the book is scheduled to be published in 
May 2010 by No Starch Press:


   
http://www.amazon.com/Autotools-Practioners-Autoconf-Automake-Libtool/dp/1593272065


Best regards,
John


