On 22.05.13 18:30, Chris Barker - NOAA Federal wrote:
> Users also fall into two categories:
>
> 1) Folks that do Python development on OS-X much like Linux, etc --
> these folks are likely to use macports or homebrew, or are used to
> the ./configure, make, make install dance. We don't need to do
> anything to support these folks -- "pip install" generally works
> for them.
>
> 2) Folks that want to use a Mac like a Mac, and people that develop
> for those folks -- these people need binary installers, and may want
> to be able to use and deploy either packages or applications (Py2app)
> that will run on systems older than the one developed on, or want
> universal builds, or ???
>  - These are the folks I'd like to support, but I'm still unsure as
> to how best to do that.
I agree, it would be a nice thing to have such a binary repository again. Thanks for trying to tackle this!

From a user's point of view, I find that Windows installers as generated by bdist_wininst still provide the nicest user experience, with OSX packages being a close second. From your description of the user categories, it also sounds like this is the kind of user experience that would suit type 2 users. You even mention pip as a solution for type 1 users, and again I agree with this. That's why I find it a bit surprising that in the remainder of this thread a lot of the discussion is about pip and virtualenv (as far as I can tell, all the solutions that were mentioned were command-line solutions), even though you actually didn't want to target that category of users.

> How should the dependencies be distributed?
>
> 1) They should be built to match the Python binary being targeted
> (honestly, I think that's now only the Intel 32-64 bit ones -- PPC
> machines, and pre 10.6, are getting really rare...)

Sounds all right to me (it wasn't actually too long ago that I was still on 10.4, and I can confirm it's no fun anymore :) ).

> 2) Static or dynamic?
>
> IIUC, most successful binary packages for the Mac have relied on
> statically linking the dependencies -- this works, and is pretty
> robust. However, it can be kind of a pain to do (though I've finally
> figured out how to do it more reliably!). Also, it seems like a waste
> to me for packages that use common dependencies -- how many copies of
> libpng do I really want linked into my single instance of Python at
> run time?

Personally, that doesn't really bother me too much, at least not in the context of Python development. It's not as if all the packages I'm using are linked against libpng and I need all of them in the same program.

> But if dynamic, where do you put them? We'll still want to ship them
> with the binary, so people have a one-click install.

If static linking is not an option and you need to ship a dynamic library, I would favor a self-contained solution where the dynamic library is simply stored in the same directory as the Python extensions that actually use it. With the @loader_path mechanism you mentioned in another email, this shouldn't be a problem to implement; a rough sketch follows below.
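Just to illustrate what I have in mind -- a minimal sketch only, run after building the extension. The package, module and library names (mypkg, _mymodule.so, libpng16.dylib) are made up for illustration, and the path to rewrite has to match whatever "otool -L" actually reports for the extension:

    # Bundle a dylib next to the extension that uses it and make the
    # extension look it up via @loader_path instead of an absolute path.
    import shutil
    import subprocess

    ext = "mypkg/_mymodule.so"             # the compiled extension
    src = "/usr/local/lib/libpng16.dylib"  # install name recorded in ext
    dst = "mypkg/libpng16.dylib"           # bundled copy, next to ext

    shutil.copy(src, dst)

    # Give the bundled copy a relocatable install name ...
    subprocess.check_call(
        ["install_name_tool", "-id", "@loader_path/libpng16.dylib", dst])

    # ... and rewrite the extension so it searches relative to itself.
    subprocess.check_call(
        ["install_name_tool", "-change", src,
         "@loader_path/libpng16.dylib", ext])

Afterwards, "otool -L mypkg/_mymodule.so" should show the @loader_path entry, and the whole package directory can be moved around without breaking the link.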
I wouldn't bother trying to share libraries across packages (or Python installations), for the following reasons:

- By moving the dynamic library outside the "domain" of the package, it becomes an external dependency, and the package provider can no longer guarantee that the package will keep working on the user's machine throughout its lifetime. Essentially, the responsibility for ensuring compatibility has been moved from the package provider to the user.

- A build of a library may differ from another build of the same library, even when both are built from the same version of the source files. This is because it's not uncommon for libraries to be configurable at compile time (e.g. wide character support, single-threaded vs. multi-threaded, float vs. double as the fundamental number type, optional support for SSE variants, AVX, OpenGL, OpenCL, networking, internationalization, etc.). So it could happen that I install package A and everything is fine until at some point I install package B, which overwrites a shared dynamic library, and suddenly package A doesn't work anymore.

- Uninstalling is more straightforward, as I can just remove the main directory the package is in and don't have to worry about orphaned libs in other directories whose purpose I don't even know.

- If the entire package is self-contained, moving the package or changing the install location at install time is not an issue anymore.

- It makes life easier for the package provider as well, as you don't have to worry about how to manage shared libraries.

Cheers,
- Matthias