On Thu, Aug 07, 2025 at 04:54:09PM +0200, Branko Čibej wrote:
> Why? Why are we telling our library consumers about debugging, optimisation
> and warning flags? This makes no sense to me and is actively detrimental. In
> Serf, for example, we go through extra hoops to remove this crud from the
> APR flags; and I believe Subversion does something similar. APR should not
> dictate how its consumers are compiled. The --cppflags, sure, those are
> necessary on some platforms in order to make the resulting linked code work;
> but --cflags?

Not sure I can answer all of this and don't want to defend the current 
design particularly, but:

In general, how do you distinguish "debugging, optimisation and warning" 
flags from compiler flags which might impact ABI choices at cc or libc 
level?  e.g. -DNDEBUG is a "debugging flag", but -D_FILE_OFFSET_BITS=64 
is a libc ABI choice. -O2 is an optimisation flag, but -m486, set for 
*bsdi* in the hints, is... possibly an ABI choice?
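To make the ambiguity concrete: a consumer trying to strip "crud" from 
the reported flags ends up pattern-matching on flag shapes. Here is a 
rough sketch of such a filter (the flag list is a made-up example, not 
real apr-config output, and this is not what Serf actually does):

```shell
# Hypothetical sketch: split a flag string and drop anything that looks
# like a debug/optimisation/warning option, keeping the rest.
# The input string is an invented example, not real apr-1-config output.
cflags="-g -O2 -Wall -DNDEBUG -D_FILE_OFFSET_BITS=64 -m486"
kept=""
for f in $cflags; do
    case $f in
        -g|-O*|-W*) ;;                     # looks like debug/opt/warning: drop
        *) kept="${kept:+$kept }$f" ;;     # keep everything else
    esac
done
echo "$kept"
```

Note that this naive filter keeps -DNDEBUG (it looks like any other -D 
define), which is exactly the blurred line described above: you cannot 
tell an ABI-relevant define from a debugging define by its shape alone.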

Historically part of the problem APR solved was "pick a set of compiler 
flags for weird Unix platforms which give me a compiler which DTRT, and 
apply them everywhere". So, CFLAGS was part of that. There was/is a 
blurred line between "picking good defaults" and allowing users to 
pick/override those, and this stuff was not really designed/documented 
in sufficient detail.

(Building on 32-bit Linux with -D_FILE_OFFSET_BITS=64 and on 64-bit 
platforms with gcc -m32 were always two good tests of ABI consistency 
via inherited $CC or $CFLAGS/$CPPFLAGS)

Regards, Joe
