On Thu, Jan 14, 2016 at 10:58 AM, Matthew Brett <matthew.br...@gmail.com> wrote:
>> but neither the culture nor the tooling support that
>> approach now, so I'm not very confident you could gather adoption.
>
> I don't think there's a very large amount of cultural work -- but some
> to be sure.
>
> We already have the following on OSX:
>
>   pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py
>
> where all the wheels come from pypi. So, I don't think this is really
> outside our range, even if the problem is a little more difficult for
> Linux.

I'm actually less concerned about the Linux issue; I think that can be
solved reasonably with "manylinux" -- which would put us in a very similar
position to OS-X, and pretty similar to Windows: i.e. a common platform
with the basic libs (libc, etc.), but not a whole lot else.

What I'm concerned about are all the other libs that various packages
depend on. It's not too bad if you want the core scipy stack -- a decent
BLAS being the real challenge there, but there is enough coordination
between numpy and scipy that at least efforts to solve that will be
shared. We're still stuck, though, with delivering the libs each package
depends on along with that package -- usually statically linked.

> I thought that Anaconda.org allows pypi channels as well?

I think you can host pip-compatible wheels, etc. on anaconda.org -- though
that may be deprecated. But anyway, I thought the goal here was a simple
"pip install", which will point only to PyPI -- I don't think there is a
way, a la conda, to add "channels" that will then get automatically
searched by pip. But I may be wrong there.

So here is the problem I want to solve:

> pip install numpy scipy matplotlib scikit-learn scikit-image pandas h5py

Last I checked, each of those is self-contained, except for Python-level
dependencies, most notably on numpy. So it doesn't help me solve my
problem.
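To make the shared-libs point concrete: on a system (or in a conda environment) that provides one copy of a library, everything resolves against that single copy at load time -- which is exactly what a statically-linked-per-wheel world gives up. A minimal sketch, assuming a system zlib is available (near-universal, but not guaranteed):

```python
import ctypes
import ctypes.util

# Locate whatever shared zlib the dynamic loader would find on this
# system -- the same resolution step a dynamically-linked extension
# module relies on at import time.
path = ctypes.util.find_library("z")

if path is None:
    print("no shared zlib found; a wheel would have to bundle its own copy")
else:
    libz = ctypes.CDLL(path)
    libz.zlibVersion.restype = ctypes.c_char_p
    # One copy of the library, shared by everything that links to it.
    print("shared zlib, version", libz.zlibVersion().decode())
```

When each wheel instead bundles or statically links its own zlib/hdf5/libpng, that single resolution point disappears, and you ship (and load) N copies of the same code.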
For instance, I have my own C/C++ code that I'm wrapping that requires
netcdf (https://github.com/NOAA-ORR-ERD/PyGnome), and another that
requires image libs like libpng, libjpeg, etc.
(https://github.com/NOAA-ORR-ERD/py_gd). netcdf is not too ugly itself,
but it depends on hdf5, libcurl, zlib (others?). So I have all these libs
I need. As it happens, matplotlib probably has the image libs I need, and
h5py has hdf5 (and libcurl? and zlib?). But even then, as far as I can
tell, I need to build and provide these libs myself for my code. Which is
a pain in the @%$, and then I'm shipping (and running) multiple copies of
the same libs all over the place -- will there be compatibility issues?
Apparently not, but it still wastes the advantage of shared libs, and
makes things a pain for all of us.

With conda, on the other hand, I get netcdf libs, hdf5 libs, libpng,
libjpeg, libtiff, and I can build my stuff against those and depend on
them -- saves me a lot of pain, and my users get a better system. Oh, and
add on the GIS stuff: GDAL, etc. (seriously a pain to build), and I get a
lot of value. And many of these libs (GDAL, netcdf) come with nifty
command-line utilities -- I get those too.

So pip+wheel _may_ be able to support all that, but AFAICT, no one is
doing it. And it's really not going to support shipping cmake, and perl,
and who knows what else I might need in my toolchain that's not Python or
"just" a library.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion