Marcel Moolenaar wrote:
> > Here is what I think:
> >
> > Your proximal problem is that your libraries are badly organized, and
> > therefore certain object files in them are not being pulled into the
> > linking process, because your order of operation on the objects is not
> > in dependency order, because of the improper organization.
> A challenge:
> Linkers normally pull in everything they can from archive libraries

Actually, they don't.  They only pull in objects that define symbols
that are undefined at the time the library is encountered, in order,
on the linker line.  Anyone who doesn't believe this needs to write
an X11 application that uses Xt, Xext, and some widget toolkit, and
then play with library order other than "-lX11 -lXt -lXext".

> and do not require that object files in archive libraries be ordered
> in dependency order, nor do they require archive libraries to contain
> object files multiple times to break circular dependencies. They do
> this by iterating over the archive library until no new binding is
> possible (whether it's iterating over the index or over the whole
> archive).

Actually, they do this by looking at the library symbol index, which
is either created automatically by ar, or, on BSD-based systems,
added by the program "ranlib".

> If you think that providing bits on the link line in dependency order
> is a natural way of linking and the "proper" way of doing it, how do
> you explain our improper use of putting object files in lexical order
> in libraries and how do you resolve the contradiction that from a build
> point of view the lexical order is the proper way of building and we
> only get away with that because the linker doesn't require object
> files in archive libraries to be in dependency order (or we manually
> correct the situation by duplication)?

I explain the lexical ordering by way of the following command when
editing the Makefile in "vi" in command mode:

!!ls *.c


> Also, to me it looks like a gross inconsistency that can be easily
> solved by having the linker remember symbols it has seen (and where)
> even though they are not unresolved at the time the symbols are seen.

"info ld"

This is not historical UNIX "ld" behaviour, and it is not default GNU
"ld" behaviour.

> How is reordering or restructuring source code solely to make the
> linker happy in any way better than simply making the linker less
> dumb (even if optionally)?

When you are using a library as a library, rather than as a silly
workaround to command line length limitations, you only want to
pull in the object files from the archive which are actually used.

By ordering the source code properly, less code gets pulled in when
you are not actually using every function within a library.

Linking fewer object files into an executable makes the executable
smaller.

Smaller executables are better than larger executables from a
putatively "smarter" linker (personally, I measure linker intelligence
as inversely proportional to the resulting executable size, relative
to the idealized executable size).

Also, putting related code adjacently results in the code fitting
within the peephole.  This permits the compiler's optimizer to do
things that it would otherwise be unable to do.

Optimized executables are better than those that aren't.

So organizing functions into the correct object modules, and the object
modules into the correct libraries, rather than choosing the organization
at random, is important if you care about code size and/or optimizer
effectiveness.

> I don't intend to start a discussion, just contemplation when I say:
> The reason linkers behave the way they do does not necessarily have
> to be a good one according to current standards. I've often wondered
> about what makes the current behaviour good and have never found a
> reason better than "it's easier for the linker". This however can
> easily be rejected as unimportant, because tools are supposed to make
> it easier for the user. To me the behaviour of linkers is therefore
> mostly hysterical and I personally would not use it as an argument
> to distinguish good source organisation from bad...

The most common excuse I'm aware of is "To get faster compile times
when benchmarked against other compilers".

On slow enough, or emulated, hardware, though, it's a legitimate
complaint that linking speed becomes a development bottleneck.

> > Most linkers don't do what you want, which is make up for programmer
> > incompetence by doing an automatic topological sort on all symbol
> Which programmers do you mean: the programmers writing linkers or...?
> :-)

I had a big gripe, complete with examples involving famous names,
ready to go.  But I will replace it with a much smaller response:

"A craftsman must know his tools".

-- Terry

To Unsubscribe: send mail to [EMAIL PROTECTED]
with "unsubscribe freebsd-current" in the body of the message
