The advice I've read in several posts on the subject ranges from setting one, to setting both, to ignoring both, sometimes with the ?= notation and sometimes without. And then I've read comments suggesting that
when compiling the kernel, for example, both are ignored, and default
values (tucked away somewhere) are always applied. IIRC, the handbook
recommends at least setting CPUTYPE.
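For concreteness, here is a minimal make.conf sketch of the two assignment styles mentioned above (the CPUTYPE value is only a made-up example, not a recommendation). With ?=, the value is used only if the variable isn't already defined, e.g. on the make command line; with +=, the value is appended to whatever is already there:

```make
# /etc/make.conf -- illustrative sketch only
CPUTYPE?=core2   # ?= assigns only if CPUTYPE isn't already set
CFLAGS+=-pipe    # += appends to the existing default CFLAGS
```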
My question isn't a holy grail type of quest for maximised performance,
but concerns the meaning of those settings with respect to
building world, building kernel and anything in ports. Put another way,
I'm not a computer science major, but I do have different systems that I
compile for, and I'd like to have a better understanding of WTF I'm really doing.
For example, what is the difference, if any, between a binary compiled
with custom CPUTYPE/CFLAGS settings and one compiled using what I think
are the universal defaults,
CFLAGS=-O2 -fno-strict-aliasing -pipe
that get applied if make.conf is blank? Can the resulting binaries be
run on each other's systems? Or is each simply optimised to run on one,
versus another? Or are those settings relevant to the compilation process
only? Or to both the compilation process and the actual performance of
the binary? Or should I be taking the dog for a nice long walk instead of
watching scrolling compiler output? ;-)
If someone could take a moment to explain in moderately technical terms
what all the above means, or suggest a source for further reading, I'd be
grateful.