https://gcc.gnu.org/bugzilla/show_bug.cgi?id=65294
Bug ID: 65294
Summary: No easy way of setting default Mac OS X target - darwin_minversion not enough
Product: gcc
Version: 4.6.4
Status: UNCONFIRMED
Severity: normal
Priority: P3
Component: driver
Assignee: unassigned at gcc dot gnu.org
Reporter: astrand at cendio dot se
Host: i686-pc-linux-gnu
Build: i686-pc-linux-gnu
Target: x86_64-apple-darwin10
Mac OS X SDK: 10.6

We want to configure GCC so that OS X 10.6 is the default target. For this, we changed DARWIN_MINVERSION_SPEC to "10.6". This seemed to work: __ENVIRONMENT_MAC_OS_X_VERSION_MIN_REQUIRED__ is 1060. However, we later realized that calling gcc with an explicit -mmacosx-version-min=10.6 gives different behaviour. For example, in the latter case, ld is called with -no_compact_unwind.

In other words, changing DARWIN_MINVERSION_SPEC / %(darwin_minversion) to say 10.6 causes GCC to run with a split personality, where some parts think the target is 10.4 and others think it is 10.6.

To solve this, either a lot of spec logic needs to be changed, or the -mmacosx-version-min argument needs to be "faked". The latter is actually done when the environment variable MACOSX_DEPLOYMENT_TARGET is found - but not when cross compiling. Is there a good reason for this?

Also, it seems to me that DARWIN_MINVERSION_SPEC / %(darwin_minversion) is not that useful, since most of the specs check the -mmacosx-version-min argument anyway. The patch from gkeating (https://gcc.gnu.org/ml/gcc-patches/2007-02/msg01484.html) had the explicit goal of "fix or avoid a bunch of bugs where one part or another of the toolchain differed from the compiler in the minimum system version to be targetted.", but this is exactly the problem we are seeing here, so I believe some additional work is required.
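To make the two code paths concrete, here is an illustrative sketch of the kind of spec definitions involved, loosely modeled on the conventions in gcc/config/darwin.h. It is not the exact upstream text; EXAMPLE_LINK_SPEC is a hypothetical name used purely to show the pattern:

```c
/* Path 1: the fallback value, consulted only where a spec expands
   %(darwin_minversion), i.e. when -mmacosx-version-min is absent.  */
#undef  DARWIN_MINVERSION_SPEC
#define DARWIN_MINVERSION_SPEC "10.6"

/* Path 2: many specs test the command-line flag directly with a
   %{mmacosx-version-min=...} match, so they never see the value above.
   A hypothetical example of such a conditional spec:  */
#define EXAMPLE_LINK_SPEC \
  "%{mmacosx-version-min=10.6*: -no_compact_unwind}"
```

Because path 2 keys on the literal command-line argument rather than on %(darwin_minversion), changing only DARWIN_MINVERSION_SPEC leaves every such spec on its old behaviour, which is the inconsistency described above.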