I just poked a bit into the Anaconda Python distribution. Their
packages are simple tarballs, but I think they have a dependency
management system of some sort.

They deliver the dependencies as separate packages (tarballs), one for
each lib. It looks like it all gets unpacked into a single dir (in
/opt), and an example python extension is built like this:

$ otool -L netCDF4.so
netCDF4.so:
        @loader_path/../../libnetcdf.7.dylib (compatibility version 10.0.0, current version 10.0.0)
        @loader_path/../../libhdf5_hl.7.dylib (compatibility version 8.0.0, current version 8.3.0)
        @loader_path/../../libhdf5.7.dylib (compatibility version 8.0.0, current version 8.3.0)
        @loader_path/../../libz.1.dylib (compatibility version 1.0.0, current version 1.2.7)
        /usr/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 111.0.0)


I don't know how to get that @loader_path thing in there, but this
seems like a reasonable way to do it (though I guess it wouldn't
support virtualenv...)
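(For what it's worth, I believe those @loader_path entries get put there
with Apple's install_name_tool -- either at link time with -install_name,
or after the fact with -change old new file, run on the built extension.
A rough Python sketch of generating those commands -- the library path
here is made up, in practice you'd read them out of `otool -L`:)

```python
def relink_commands(extension, dylibs, prefix="@loader_path/../.."):
    # Build the install_name_tool invocations that would rewrite each
    # absolute dylib reference in `extension` to a @loader_path-relative
    # one.  `dylibs` would come from parsing `otool -L extension`.
    cmds = []
    for old in dylibs:
        new = "%s/%s" % (prefix, old.rsplit("/", 1)[-1])
        cmds.append(["install_name_tool", "-change", old, new, extension])
    return cmds

for cmd in relink_commands("netCDF4.so",
                           ["/opt/anaconda/lib/libnetcdf.7.dylib"]):
    print(" ".join(cmd))
    # on OS X you'd actually run each one, e.g. subprocess.check_call(cmd)
```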

-Chris



On Wed, May 22, 2013 at 3:46 PM, Chris Barker - NOAA Federal
<chris.bar...@noaa.gov> wrote:
> Thanks Ronald,
>
> On Wed, May 22, 2013 at 2:53 PM, Ronald Oussoren <ronaldousso...@mac.com> 
> wrote:
>
>> To move back onto topic, not relying on unix-level libraries in OSX is a 
>> good thing as it makes it easier to support multiple OSX versions with a 
>> single set of binaries.
>
> hmm -- I figured if it was a system lib, it should work on whatever
> system it's running on. For example, I'm working right now on the
> netcdf4 lib -- it requires hdf5, which requires zlib. I'm using the
> system zlib -- is that a bad idea? Should I build it too, to make sure
> it matches the rest of it?
>
> (I do want the binaries to run anywhere the binary Python I'm using runs)
>
>
>> Except for a number of more complicated libraries (such as PIL/Pillow) when 
>> using universal binaries (when using 'pip install', homebrew/macports/... 
>> have their own mechanisms for building).
>
> right -- Universal libs are not well supported by those systems -- but
> that's the power users' problem!
>
>>> 2) folks that want to use a Mac like a Mac, and people that develop
>>> for those folks --  these people need binary installers, and may want
>>> to be able to use and deploy either packages or applications (Py2app)
>>> that will run on systems older than the one developed on, or want
>>> universal builds, or ???
>>>  - These are the folks I'd like to support, but I'm still unsure as
>>> to how best to do that.
>>
>> It would be nice to have a set of binary "packages", based on a reproducible 
>> build system.
>
> Exactly what I'd like to build!
>
>>> Way back when Bob Ippolito maintained a repository of binary packages
>>> for the mac -- it was a great resource, but he's long since moved on
>>> to other things.
>>
>> The binary packages that Bob maintained had IMHO two major problems:
>>
>> 1) The largest problem is that the packages were AFAIK created ad-hoc (Bob 
>> or some other contributor did the magic incantations to build library 
>> dependencies)
>
> Yeah, and he never gave anyone else permission to push to it...
>
>> 2) The packages were Installer.app packages. The current state of the art 
>> for development/project environments is to use virtualenv or buildout to 
>> create separated python installations and install all project dependencies 
>> there instead of the global site-packages directory. That's not something 
>> that's easily supported with Installer.app packages.
>
> It was the way to go at the time, but I agree a binary format that
> supports virtualenv would be great.
>
>>> do I really want linked into my single instance of Python at run time?
>>
>> As long as the libpng state isn't shared, static linking isn't really a
>> problem.
>
> good to know, but somehow it still offends my sensibilities
>
>> Dynamic linking has at least two disadvantages:
>>
>> 1) Relocation of the installation prefix is harder due to the way the 
>> dynamic linker on OSX looks for libraries:
>
> yeah -- that is a pain.
>
>> The header is easily updated using macholib, but that makes installation
>> harder and isn't supported by the standard packaging tools (easy_install
>> and pip)
>
> But if we put the shared libs in a more central location, then all your
> virtualenvs could use the same ones, yes?
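On the macholib front -- if I understand it right, you hand it a function
that maps old install names to new ones. A sketch of what such a mapping
could look like (the macholib calls in the comment are from memory and
untested; the paths are made up):

```python
def to_loader_path(name, libdir="@loader_path/../.."):
    # Map an absolute install name to a @loader_path-relative one,
    # leaving the system libraries alone.
    if name.startswith("/usr/lib/") or name.startswith("/System/"):
        return None  # None meaning "leave this entry unchanged"
    return "%s/%s" % (libdir, name.rsplit("/", 1)[-1])

# With macholib itself, roughly (from memory, untested):
#   from macholib.MachO import MachO
#   m = MachO("netCDF4.so")
#   m.rewrite_load_commands(to_loader_path)
#   with open("netCDF4.so", "rb+") as f:
#       m.write(f)
print(to_loader_path("/opt/anaconda/lib/libhdf5.7.dylib"))
```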
>
>> 2) The primary use case for dynamic linking is to share dylibs between 
>> extensions, and when those extensions are in different PyPI packages the 
>> packaging story gets more complicated. The easiest workaround is to ignore 
>> sharing dylibs and still bundle multiple copies of libpng if two different 
>> PyPI packages both link with libpng.
>
> when you say bundle, do you mean static link? Or just package up the
> dylib with the bundle, which is what i was thinking -- each package
> installs the libs it needs, which may or may not already have been
> installed by another package -- but so what?
>
> And I expect the number of folks building packages will be fairly
> small, so one builder would only have to build one set of dylibs.
>
>>> But if dynamic, where do you put them? We'll still want to ship them
>> A new framework isn't necessary. There are three locations that could easily 
>> be used:
>>
>> 1) A directory in Python.framework, for example 
>> /Library/Frameworks/Python.framework/Frameworks
>
> That makes sense to me.
>
>> 2) A directory in /Library/Python, for example /Library/Python/Externals
>
> that feels a bit like Apple's turf, but what do I know?
>
>> 3) As 2), but in the users home directory (~/Library/Python/Externals)
>> The latter is the only one where you can install without admin privileges.
>
> But we put the Python binaries in /Library/Frameworks -- it seems we
> should do the same with libs...
>
>
>> The folks over on distutils-sig are working towards support for wheels (PEP 
>> 427, <http://www.python.org/dev/peps/pep-0427/>) at least in pip and 
>> distribute/setuptools and possibly in the stdlib as well (for 3.4). It would 
>> be nice if the OSX package collection would be in wheel format, that would 
>> make it relatively easy to install the packages using the de facto standard 
>> tools.
>
> Any idea what the time scale is on this?
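For reference, PEP 427 puts all the compatibility info in the wheel's
filename -- {dist}-{version}(-{build})?-{python tag}-{abi tag}-{platform
tag}.whl -- so a Mac package collection could really just be a directory
of files like this (the example name is made up):

```python
def parse_wheel_name(filename):
    # Split a PEP 427 wheel filename into its components:
    # {dist}-{version}(-{build})?-{python tag}-{abi tag}-{platform tag}.whl
    stem = filename[:-len(".whl")]
    parts = stem.split("-")
    dist, version = parts[0], parts[1]
    py_tag, abi_tag, plat_tag = parts[-3:]
    return dist, version, py_tag, abi_tag, plat_tag

print(parse_wheel_name("netCDF4-1.0.4-cp27-none-macosx_10_6_intel.whl"))
```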
>
>> What I haven't looked into yet is how easy it would be to configure pip to 
>> look for packages on PyPI and then look for binaries (wheels) in some other 
>> location.
>
> Have the pip folks made any commitment at all to supporting binary
> installs? That's a big missing feature.
>
>>> Note that I've used the term "we" here ;-)  I'm hoping that others
>>> will join me in following a convention and getting stuff out there,
>>> but even if not, I'd love feedback on how best to do it.
>>
>> Good luck :-).  This is fairly boring low-level packaging work, and that 
>> tends to put off people. At least it is a lot easier than trying to fix or 
>> replace distutils, that has burned out at least 3 generations of developers 
>> ;-/
>
> Well, I'm an optimist -- and recently at least you and Ned and Russell
> Owen have been known to contribute.
>
>>> By the way, the other goal is to build scripts that do the builds the
>>> way we need for various libs, packages, etc, so that it's easy to do
>>> it all when new builds are required...
>>> (maybe use gattai? -- http://sourceforge.net/projects/gattai/)
>>
>> I haven't used gattai before, but at least it is Python code. Building tools 
>> for this from scratch would also not be very hard; I have already done so a 
>> number of times (such as the build-installer.py script in the CPython 
>> repository).
>
> yup -- me too, though I find myself wanting to add various make-like
> features, and it dawns on me that I am re-inventing the wheel! So I
> want to give Gattai a shot. Also Kevin is likely to be helpful.
>
>> The hard part should be setting up the build infrastructure and build 
>> scripts; once a couple of relatively hard packages like PIL or wx have been 
>> processed, adding more packages and new versions of packages should be easy.
>
> Exactly. But I don't know about wx -- that is a bear, and Robin's been
> doing a fine job!
>
> Thanks for your ideas,
>
>   -Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
_______________________________________________
Pythonmac-SIG maillist  -  Pythonmac-SIG@python.org
http://mail.python.org/mailman/listinfo/pythonmac-sig
unsubscribe: http://mail.python.org/mailman/options/Pythonmac-SIG
