On 07 Mar 2014, at 01:38, Ned Deily <n...@acm.org> wrote:

> In article 
> <calgmxejblpxdfan3xwrqqdtdamgy+djzojfbyapmtu1euan...@mail.gmail.com>,
> Chris Barker <chris.bar...@noaa.gov> wrote:
> 
>> The numpy folks are trying hard to get binary wheel support on PyPI -- yeah!
>> 
>> But this brought up an issue -- PyPI policy is that binary wheels should
>> be built for the python.org binaries -- which is great.
> [...]
>> So is it officially supported to link a *.so built against an older SDK to an
>> application built with a newer SDK? If so, then this seems a
>> nice convenience for the Mac crowd.
> 
> I've just commented on this in more detail on the pip issue tracker:
> 
> https://github.com/pypa/pip/pull/1465
> 
> The short answer is that there are at least two issues here: older SDK 
> vs newer SDK and python.org Python vs Apple-supplied system Python.  In 
> general, we don't officially test these combinations nor claim to 
> support them but, for many cases, they do seem to work OK, modulo some 
> known potential gotchas (like mixing C++ runtimes on 10.9).  We do 
> specifically support that current python.org Pythons built on 10.6 work 
> on 10.6 through 10.9 and that it is possible to build C extensions on 
> any of them that will work on at least that installed OS X version.  In 
> other words, if you want to build a C extension module that works with 
> 10.6 through 10.9, the safest approach is to use the 10.6 SDK.  If you 
> only need it to work on your current system, the installed system 
> headers should be OK, which Distutils falls back to if the SDK is not 
> available in its traditional location.

But note that it is not necessary to use the 10.6 SDK; it is just slightly 
safer, in particular when using a build system that does autoconfiguration 
based on compile/link checks: such checks could detect newly introduced symbols 
that are present in the latest SDK but not on the older system.
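One way to keep extension builds consistent with the interpreter is to pin the same deployment target the interpreter was built with. A minimal sketch (the helper names are my own; it assumes a python.org Python that records MACOSX_DEPLOYMENT_TARGET in its build configuration):

```python
# Sketch: make a C extension build target the same minimum OS X version
# as the interpreter it will be loaded into. Helper names are hypothetical.
import os
import sysconfig

def deployment_target():
    """Return the deployment target in effect, e.g. '10.6'."""
    # An explicit environment override takes precedence, mirroring
    # distutils' own behaviour.
    return (os.environ.get("MACOSX_DEPLOYMENT_TARGET")
            or sysconfig.get_config_var("MACOSX_DEPLOYMENT_TARGET")
            or "")

def export_target():
    """Pin MACOSX_DEPLOYMENT_TARGET so compile/link steps inherit it."""
    target = deployment_target()
    if target:
        os.environ["MACOSX_DEPLOYMENT_TARGET"] = target
    return target
```

With the target pinned, the compiler's availability macros (and well-behaved autoconf checks) should resolve against the older minimum version rather than whatever the newest SDK happens to expose.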

Another potential problem is linking to system shared libraries that aren’t 
Apple proprietary, such as OpenSSL. When using a newer SDK you’ll generally 
link against the newest version of those shared libraries included in the 
newer system, and that version might not be available on older systems. The 
workaround is to not link to those libraries at all. This generally isn’t a 
problem for Apple libraries (such as the frameworks or libSystem) because there 
the name of the shared library doesn’t change between OSX releases.
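You can spot this kind of problem by inspecting what an extension actually links against (`otool -L module.so` on OS X). A rough sketch that flags the risky cases, fed with a hypothetical sample of otool output rather than a live call (the heuristic and names are my own):

```python
# Sketch: flag shared-library dependencies that may not exist on older
# systems. Two rough heuristics: install names outside the stable Apple
# locations, and version-pinned dylib names (e.g. libssl.0.9.8.dylib),
# whose exact versioned name can change between OS X releases.
import re

SYSTEM_PREFIXES = ("/usr/lib/", "/System/Library/")

def risky_dependencies(otool_output):
    """Return install names from `otool -L` output that look fragile."""
    risky = []
    for line in otool_output.splitlines()[1:]:  # first line is the file name
        line = line.strip()
        if not line:
            continue
        install_name = line.split(" (compatibility")[0].strip()
        version_pinned = re.search(r"\.\d[\d.]*\.dylib$", install_name)
        if not install_name.startswith(SYSTEM_PREFIXES) or version_pinned:
            risky.append(install_name)
    return risky

# Hypothetical otool -L output for illustration:
sample = """module.so:
\t/usr/lib/libssl.0.9.8.dylib (compatibility version 0.9.8, current version 0.9.8)
\t/usr/local/lib/libfoo.1.dylib (compatibility version 1.0.0, current version 1.0.0)
\t/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1.0.0)
"""
```

On the sample above it would flag the version-pinned libssl and the /usr/local library, while letting libSystem through, which matches the distinction drawn in the paragraph above.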

In general I’d be fine with:

* PIP accepting a wheel with deployment target 10.6 on a Python with deployment 
target 10.9
* PIP accepting a wheel with more architectures than supported by the Python 
build (e.g. a wheel with “intel” on an x86_64-only build)

That should be generally safe, with the exceptions listed above. 
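The two safe rules above can be sketched as a compatibility check on macosx platform tags. This is only an illustration of the rule, not pip's actual implementation; the helper names and the architecture table are my own:

```python
# Sketch of the "safe" acceptance rules: a wheel is acceptable when it
# targets an equal-or-older OS X release AND its architecture set covers
# the architectures of the Python build. A macOS wheel platform tag looks
# like "macosx_10_6_intel". Helper names are hypothetical.

ARCH_SETS = {
    "intel": {"i386", "x86_64"},
    "fat": {"i386", "ppc"},
    "universal": {"i386", "ppc", "x86_64", "ppc64"},
    "x86_64": {"x86_64"},
    "i386": {"i386"},
    "ppc": {"ppc"},
}

def parse_tag(tag):
    """Split 'macosx_10_6_intel' into ((10, 6), {'i386', 'x86_64'})."""
    _, major, minor, arch = tag.split("_", 3)
    return (int(major), int(minor)), ARCH_SETS[arch]

def safe_to_install(wheel_tag, python_tag):
    """Older-or-equal deployment target, superset of architectures."""
    w_target, w_archs = parse_tag(wheel_tag)
    p_target, p_archs = parse_tag(python_tag)
    return w_target <= p_target and w_archs >= p_archs
```

For example, a 10.6 “intel” wheel passes on a 10.9 x86_64-only Python, while the reverse combination is rejected.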

It could be nice to have options for alternatives as well, but those should be 
optional and not enabled by default because they break some use cases (such as 
building an application on a new machine and deploying it on an older machine):

* PIP accepting a wheel with a deployment target <= the currently running OSX 
version (e.g. install a 10.8 wheel on a machine running OSX 10.9, irrespective 
of the deployment target of the Python interpreter)
* PIP accepting any wheel whose architecture set includes the currently 
running CPU architecture (e.g. installing an x86_64-only wheel on an “intel” 
Python installation when running on a newish machine)
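The first of these opt-in rules compares the wheel against the running OS rather than the interpreter's deployment target. A minimal sketch using `platform.mac_ver()` (helper names are my own; the version parameter exists only so the logic can be exercised off-macOS):

```python
# Sketch of the looser, opt-in rule: accept a wheel whose deployment
# target is <= the OS X version the machine is actually running. Note
# this is exactly what breaks the build-on-new, deploy-on-old workflow.
import platform

def running_os_version(release=None):
    """Return the running OS X release as a tuple, e.g. (10, 9).

    `release` overrides platform.mac_ver() for testing; on a non-Mac
    system mac_ver() returns an empty string and we return None.
    """
    if release is None:
        release = platform.mac_ver()[0]
    parts = release.split(".")
    if len(parts) < 2:
        return None
    return (int(parts[0]), int(parts[1]))

def loosely_acceptable(wheel_target, release=None):
    """True if wheel_target, e.g. (10, 8), is <= the running version."""
    running = running_os_version(release)
    return running is not None and wheel_target <= running
```

Under this rule a 10.8 wheel is accepted on a machine running 10.9.2 even if the interpreter itself targets 10.6, which is why it should stay off by default.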

Ronald
> 
> -- 
> Ned Deily,
> n...@acm.org
> 
> _______________________________________________
> Pythonmac-SIG maillist  -  Pythonmac-SIG@python.org
> https://mail.python.org/mailman/listinfo/pythonmac-sig
> unsubscribe: https://mail.python.org/mailman/options/Pythonmac-SIG
