[Distutils] setuptools and scripts on Windows

2012-03-05 Thread Chris Barker
Hi folks,

IIUC, there has been a recent change in the setuptools executable that
is distributed with the Windows installer (and source). This
executable is used as a stub to launch scripts that have been
installed using the distutils scripts system (i.e., put into
C:\Python27\Scripts, or similar).

The problem is that the new one launches the scripts in a separate
terminal window (DOS box), which then quits when the script is
finished, whether it exits with an error or normally. The result is that you
can't see the output of the scripts, which makes any command-line
script pretty much useless. This means that things like nosetests, etc.
are broken.

Anyway, after much gnashing of teeth, I noticed that if I install the
latest setuptools, I get an easy_install.exe that is about 65kb in
size, while an older version had a 7kb exe. If I drop the older one
into the Scripts dir, it works fine (tested on 32-bit Windows XP and
64-bit Windows 7 with 32-bit Python). In fact, future easy-installed
(or other setuptools-installed) scripts then work correctly as well
-- it appears that setuptools copies the easy_install.exe (and renames
it) to launch the other scripts.
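
For context, here is a minimal sketch of the kind of setup.py that
triggers the stub-exe generation (package and function names are
hypothetical):

    from setuptools import setup

    setup(
        name='mytool',          # hypothetical package
        version='1.0',
        py_modules=['mytool'],
        entry_points={
            # setuptools writes Scripts\mytool.exe (a copy of its stub
            # launcher) plus a mytool-script.py that the stub execs:
            'console_scripts': ['mytool = mytool:main'],
        },
    )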

Anyway, after poking around a bit, it's not clear to me who is
maintaining setuptools, whether this change was intentional, or the
result of other needed changes, etc. I can't find a version controlled
source where I can look for changes.

But I'd really like to see it put back the way it was!

This list appeared to be the appropriate place to discuss setuptools,
but where do I go from here? -- i.e. where to file a bug report, etc.

Thanks,

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] RFC: Binary Distribution Format for distutils2/packaging

2012-03-14 Thread Chris Barker
On Wed, Mar 14, 2012 at 9:17 AM, Paul Moore p.f.mo...@gmail.com wrote:
 It's reasonable to argue that this is only a windows problem.

no -- it's a Mac OS-X problem, too. Indeed, a harder one, due to:

A) The Mac platform now has four(!) architectures: PPC, PPC64, x86, and
Intel 64. Granted, PPC is almost dead, PPC64 never saw much use, and
even 32-bit x86 is on the way out. Nevertheless, at least 32- and
64-bit Intel are going to be around for a while.

B) OS-X supports fat binaries, and the python.org binaries have been
built that way for ages.

C) OS-X moves forward, fast -- it gets pretty tricky to build
binaries that run on older versions than the one you are building on
(it can be done, to a point, but it's hard to get right).

 - Consumers of applications should get application installers,
  possibly with embedded copies of Python.

sure -- but developers need to build those...

 From my experience, that happens more often on Windows than elsewhere
 (py2exe/cx_Freeze). I didn't think Unix people did that.

not as much, but Mac developers do.

 - Consumers of libraries are developers who should be able
  to install development tools.

In theory, yes, but:

1) there are folks that want to do a little Python who don't have any
experience or interest in the whole C building thing -- and to get the
compiler on the Mac, you need to register with the Mac Developer
Connection, then download a MASSIVE binary -- it's not a trivial lift.

2) as I stated above, building binaries of packages that you can
re-distribute to other systems (py2app) is tricky -- even more so when
you need compiled dependencies (libjpeg, libfreetype, that sort of
thing).

So the short version is -- binary packages are important on the Mac.

Which brings up another point:

Universal binaries (i.e. more than one architecture in one binary)
have never been properly supported by binary eggs/setuptools. I think
it may be as simple as the naming convention -- the binary would be
named according to the machine it was built on (e.g. PPC), but when you
tried to install it on another machine, setuptools would look for one
named for, e.g., x86, and not find it. There may be some other issues,
too, but in any case, we need to make sure the naming convention
handles the multi-architecture case as well.
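
A minimal sketch of where I believe the lookup goes wrong (the exact
egg name is illustrative):

    from distutils.util import get_platform

    # the egg is named for the platform string of the *build* machine,
    # e.g. get_platform() -> 'macosx-10.3-ppc', giving
    #     Foo-1.0-py2.5-macosx-10.3-ppc.egg
    # an Intel machine then searches for '...-macosx-10.3-i386.egg'
    # and never matches the fat (ppc+i386) egg.
    print(get_platform())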

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] RFC: Binary Distribution Format for distutils2/packaging

2012-03-14 Thread Chris Barker
On Wed, Mar 14, 2012 at 3:05 PM, Zvezdan Petkovic zvez...@computer.org wrote:
 1) there are folks that want to do a little python that don't have any
 experience or interest in the whole C building thing -- and to get the
 compiler on the Mac, you need to register with Mac Developer
 connection, then download a MASSIVE binary -- it's not a trivial lift.

 Not any more.
 You can install XCode from Mac App store as an application now instead of an 
 installation bundle/image.

nicer -- though I imagine it's still a huge download.

 Also, membership in the Mac Developer Connection is not required.
 You need the membership only if you are going to sign apps for Mac or iOS.
 For a casual user who needs a C compiler and finds XCode in the Mac App
 Store, it's not required.

nice to know.

However, the fact remains that there are folks who just want to write
some Python code. Also, aside from the compiler, it can still be a
pain to build stuff that requires dependencies (yes, I know about
homebrew, macports, fink -- the point still stands).

I think it's important to support that part of the community.

NOTE: on Windows, you can simply install the right version of Visual
Studio Express, and get building capability there, too -- but there is
still a real desire for binaries.

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] RFC: Binary Distribution Format for distutils2/packaging

2012-03-15 Thread Chris Barker
 On 14 March 2012 19:04, Tarek Ziadé ta...@ziade.org wrote:

 Why would someone create a binary release when
 it's pure Python ?

There are a lot of users (Windows and Mac, anyway) who like a nice
point-and-click installer, and don't know (and shouldn't have to) whether
there is any compiled code in there. It's nice to support that.

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] Plans for binary wheels, and PyPi and OS-X

2013-10-18 Thread Chris Barker
Someone on another list indicated that pip installing binary wheels
from PyPI will ONLY work for Windows.

Is that the case? I think it's desperately needed for OS-X as well.

Linux is so diverse that I can't imagine it being useful there, but OS-X has
only so many versions, and the python.org OS-X binaries are very clear
in their requirements -- it would be very useful if folks could easily
get binary wheels for OS-X.

-Chris




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-10-21 Thread Chris Barker
On Fri, Oct 18, 2013 at 6:22 PM, Nick Coghlan ncogh...@gmail.com wrote:
  -- it would be very useful if folks could easily
 get binary wheels for OS-X

 We do plan to support it, but the pip devs uncovered a hole in the current
 wheel spec that means it generates the same filename on *nix systems for
 wheels that need to have different names for the download side of things to
 work properly

Thanks -- but really? don't OS-X wheels get:

macosx_10_6_intel

or some such tacked on? Where does that go wrong?
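
For reference, my understanding of how that tag is derived (per PEP
425, '-' and '.' in the platform string become '_'):

    from distutils.util import get_platform

    # on a python.org fat build, get_platform() returns something like
    # 'macosx-10.6-intel'; the wheel filename then carries:
    print(get_platform().replace('-', '_').replace('.', '_'))
    # -> 'macosx_10_6_intel'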

 Once ensurepip has landed in Python 3.4 and pip 1.5 is released, we should
 be able to get back to updating the various metadata specs, with the aim of
 getting cross-platform wheel support in pip 1.6 :)

Getting there... thanks,

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-10-22 Thread Chris Barker
On Mon, Oct 21, 2013 at 11:52 AM, Donald Stufft don...@stufft.io wrote:

 Thanks -- but really? don't OS-X wheels get:

 macosx_10_6_intel

 or some such tacked on? Where does that go wrong?

 Homebrew, Mac Ports, Fink. That would work OK if nobody ever installed things
 that the system didn't provide.

OK -- yes, that will NEVER work. It's worse than the Linux situation.

But then, it's the same everywhere -- if someone builds a binary wheel
for Windows that depends on some non-standard dll, or is built against
a weirdly custom-built Python, it won't work either.

It's been more or less a consensus in the python-mac community that we
seek to provide binaries for the Python.org pythons, and that they
shouldn't depend on non-standard external libs -- just like on
Windows. Major hard-to-build packages have been doing this for years:

wxPython
numpy, scipy, matplotlib

But these installers often don't work with virtualenv, and can't be
discovered or installed with pip or easy_install.

So I think it would be VERY useful if we established this standard for
PyPI and binary wheels.

macports, fink, and homebrew have been doing their own thing for ages,
and can continue to do so -- they HAVE package management built in,
just like the Linux distros. If they want to do wheels, they will need
to make sure that the necessary info is in the platform tag. On my
python.org build:

'macosx-10.6-i386'

so they should patch their Python to return something like:

'macosx-macports-10.6-i386'

or just:

'macports-10.6-i386'

and probably a macports version, rather than 10.6.

However, the _point_ of macports, homebrew, etc, is that your stuff is
all custom compiled for your system the way you have configured it --
so binary wheels really don't make sense.

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-10-22 Thread Chris Barker
On Tue, Oct 22, 2013 at 1:19 PM, Nick Coghlan ncogh...@gmail.com wrote:
 PEP 453 has had most of my attention lately, but my tentative thought has
 been to introduce a relatively freeform variant field to the wheel spec.

 Windows and Mac OS X would then default to an empty variant, while other
 *nix systems would require a nominated variant

 That would then cover things like SSE builds on Windows, alternative sources
 on Mac OS X, etc.

sounds good.


 However, worrying about that is a fair way down the todo list until
 ensurepip and the CPython doc updates for PEP 453 are done, along with
 everything else that has a 3.4b1 deadline :)

Fair enough, but enabling binary wheels for OS-X (the python.org build,
NOT macports and friends) would be great, and shouldn't take much
work, yes?

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Egg name computation

2013-10-31 Thread Chris Barker
On Mon, Oct 28, 2013 at 3:50 PM, PJ Eby p...@telecommunity.com wrote:

 You could include a dummy extension that does nothing, I suppose.  Or
 which controls the building of your actual extensions.  Setuptools has
 long supported Pyrex and I think that Cython might also work, i.e.,
 that you could just specify your cython modules as extensions in
 setup.py to start with.


Indeed -- recent versions of setuptools do support Cython. You may also
want to use Cython's cythonize on your Extension objects to make it a bit
smarter.
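
A minimal sketch of what I mean (package and file names hypothetical):

    from setuptools import setup, Extension
    from Cython.Build import cythonize

    setup(
        name='mypkg',
        # cythonize() handles the .pyx -> .c translation and fills in
        # per-module compile details on the Extension objects:
        ext_modules=cythonize([Extension('mypkg.fast', ['mypkg/fast.pyx'])]),
    )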

-Chris




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-10-31 Thread Chris Barker
On Thu, Oct 31, 2013 at 2:34 PM, Nick Coghlan ncogh...@gmail.com wrote:

  For all platforms *except* Windows, wheels are essentially caches --
  there is no real reason to distribute them via PyPI at all, because OS X
  and Linux developers will have tools to build them from sdists.

That's not at all true -- it IS true of homebrew, etc. users, but not the
least bit true of the general Mac user:

* Installing Xcode is free, but not default, and less than trivial, and
even less than trivial to get right to build Python extensions.

* Many packages require third-party compiled libs -- even harder to do on
the Mac -- some are a downright pain in the ^%*.

What if an OSX user wants to install numpy/scipy? How easy is it to do this
 from source (I really don't know)?


A serious pain in the %^$ -- numpy is pretty easy, but scipy is a
nightmare, requiring Fortran, etc. The community has addressed this with
scientific Python distributions: Anaconda, Canopy, Python(x,y), etc. But
it sure would be nice to have binary wheels on PyPI.

And the cache thing is really nice, actually.

 Given PEP 453, it's probably worth allowing wheels on Mac OS X in pip 1.5,
 then we can tackle the thornier general *nix problem in the pip 1.6 time
 frame (which should also improve external dependency handling on Windows
 and Mac OS X as well).

Sounds great!



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-10-31 Thread Chris Barker
On Thu, Oct 31, 2013 at 9:49 AM, Daniel Holth dho...@gmail.com wrote:

 I'm sure you could build your own broken Windows Python, but who
  bothers?


As long as we are clear that we are talking about a social difference here,
not a technical one...

IMO it pretty much boils down to the fact that on Windows you
 are probably using the python.org version of Python and not linking
 with random shared libraries from C:\Program Files\, but on Linux you
 are probably using one of a few dozen distro x distro-release Pythons
 AND your extension probably dynamically links to some other things
 your specific distro provides AND maybe you are depending on some
 versioned symbols in glibc oh the horror.

 On OS X I would hope you are uploading only wheels built with
 python.org-Python but maybe you aren't, everyone has their
 preferences.


yes, they do -- but what is the target audience here? Yes, a lot of folks
use macports, homebrew, etc., and fewer (but I'm sure some) build their own
Python from scratch -- but these are NOT the people that we want binary
wheels for -- they don't want them anyway.

The folks that we want to provide binary wheels for are NOT going to be
building their own esoteric Python, really, they are not.

The MacPython community has a long-standing tradition of building binaries
(if at all) that are compatible with the python.org builds (and secondarily
with the Apple-supplied Python) -- that is what I'd like to see supported
by PyPI -- just like Windows.

Sure, someone could upload some oddly-built binary wheel to PyPI -- then it
would not work for most users, and they would get complaints and hopefully
fix it -- just like uploading a package with any other kind of bug in it.

It is kind of a pain to build a truly portable binary package (when it
depends on third-party compiled libs), but there is a small
but committed group of folks doing that already -- let's make it easier to
get stuff out there.

 Will a C extension built with Homebrew Python work with the Python Mac
  OS X installer from python.org? Probably, but given how painful ABI
  mismatches with shared libraries can be to debug, probably doesn't
  cut it until someone has the chance to thoroughly review the potential
  for problems.


I disagree:

1) I don't care if homebrew-built extensions work with other Pythons -- if
you want to build with homebrew, create a homebrew recipe. There should be a
policy about how binary packages posted on PyPI should be built.

2) We're never going to find out what the problems are until we give it a
try.

Fundamentally, I disagree with the premise here: "if we
can't guarantee that anything anyone uploads will work for everyone, we
shouldn't allow it" -- that's an unattainable goal.

If we do want a more fool-proof approach, then the name auto-generated by
wheel should include something that means "python.org build" only if built
with the python.org build.

And I suppose we could try to put check code in there to make sure that
extensions aren't linked to outside libs. Actually, that would be a handy
utility to have, even if it didn't enforce anything. (And by the way, it's
really easy to build a binary for Windows that's linked to an external dll
as well -- we expect package builders to be careful with that...)
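
A rough sketch of what such a checker might look like on OS-X (it
assumes the otool developer tool is on the PATH, and which prefixes
count as "system" is my guess):

    import subprocess, sys

    SYSTEM_PREFIXES = ('/usr/lib/', '/System/Library/')

    def external_links(ext_path):
        # otool -L lists the shared libraries a Mach-O file links to;
        # the first output line is just the file name itself.
        out = subprocess.check_output(['otool', '-L', ext_path]).decode()
        libs = [line.split()[0] for line in out.splitlines()[1:]
                if line.strip()]
        return [lib for lib in libs if not lib.startswith(SYSTEM_PREFIXES)]

    if __name__ == '__main__':
        for path in sys.argv[1:]:
            bad = external_links(path)
            if bad:
                print('%s links to non-system libs: %s' % (path, bad))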


I was told a wheel built on Ubuntu probably won’t work on Linux, so I shut
 off Linux Wheels, at the same time I asked about Windows and OSX wheels,
 the answers I got from people were they would generally just work on
 Windows, and nobody gave me a straight answer on OSX.


Sorry we weren't out there answering!

Linux is a different story -- not only are there a lot of variations out
there, but there also is no obvious standard one could point to that we'd
expect folks to build binary wheels for.

OS-X has (too much) variety, though it is less than Linux, and more to the
point, there is a standard Python out there -- the python.org builds. And
there is a tradition of building binaries for that standard. AFAIK, it is
pretty much the ONLY build of Python that package maintainers support with
binaries (if they support anything).

 If you build a Wheel with Homebrew Python will it work on the official
 OSX installers? What if I have a library installed from Homebrew?


probably not, but again, I don't care -- that's not what binary wheels on
PyPI would be for. And more to the point -- this is a policy question --
don't upload a binary wheel to PyPI that depends on homebrew (or anything
else that Apple didn't provide).

Essentially trying to figure out how likely it is that with the existing
 tags a wheel is going to work if we select based on them.


One thing I'm not clear on -- if you do:

pip install something

will pip preferentially select a binary wheel (if enabled on PyPI)? That
may be an issue, as folks will surely try to pip install stuff with
homebrew, macports, etc. Pythons (though the wheels are more likely to work
in that direction).

-Chris



-- 

Christopher Barker, Ph.D.

Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-11-01 Thread Chris Barker
On Fri, Nov 1, 2013 at 6:59 AM, Paul Moore p.f.mo...@gmail.com wrote:

 The key point here is the granularity of the PEP 425 tags used by wheel.

 The risk is that a wheel created on another system might declare (via
 its filename) that it is compatible with your system, and then not be,
 causing segfaults or similar when used.


indeed, which is why it _might_ be a good idea to include an extra Python
build flag or something: python.org, homebrew, macports. However,
it's probably the case that those aren't really the issues that are going
to cause problems -- at least not ones that aren't already handled by the OS-X
version flags -- i.e. if a package is built for 10.6+, then it should have
the same system libs as a Python built for 10.6+.

Practically speaking, the issues I've run into are:

* Packages built for a newer OS-X won't work on an older one -- but that
should be handled by the OS-X version handling that exists.

* universal binaries -- packages built for 32 bit aren't going to work
with a 64-bit Python, and a universal Python can be both 32 and 64 bit (and
PPC, but I think those days are gone...) -- but this _should_ be handled by
the platform flag: IIRC, "intel" means 32+64-bit Intel. Though I'm not sure
what homebrew or macports Pythons report. But distutils generally does the
right thing with self-contained C code.

* External dependencies: This is the BIG ONE: it's the hardest to get
right, and the hardest to check for. Third-party libs must:
  - Be built to match the Python, including SDK and architecture (including
universal)
  - Be included somehow -- ideally statically linked, but I'm thinking that
they could be included as part of another dependent package (I think that's
how Anaconda does it). The trick with dynamic linking on OS-X is that the
standard way to install and link a lib has the path to the lib hard-coded
in -- so you can't move it without re-writing the headers. This can be done
on install, but I don't think we want pip to have to deal with that! You
_can_ install and link libs with relative paths, which I think is what
Anaconda is doing, but I haven't figured out how yet, and it's certainly
not a no-brainer (rough sketch of the relative-path idea below).
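
For the record, here is my rough understanding of the relative-path
trick -- a guess at the general technique, not anyone's actual recipe
(libfoo.dylib and _myext.so are hypothetical):

    install_name_tool -id @loader_path/libfoo.dylib libfoo.dylib
    install_name_tool -change /usr/local/lib/libfoo.dylib \
        @loader_path/libfoo.dylib _myext.so

then ship libfoo.dylib next to _myext.so inside the package, and the
dynamic linker resolves it wherever the package lands.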

So I don't think there is any way to get around the fact that you need to
be careful to build a binary wheel that will work on the systems you
are targeting -- but this is no different than the situation we've had for
years with building binary installers for the Mac. But those don't work with
pip, or virtualenv, or...


 On Windows, thanks to the
 history of bdist_wininst, and the small number of people who build
 their own *anything* on Windows, there is really only one ABI/C
 Library/whatever to worry about and that is the python.org one.
 (Actually, there are two - 32 and 64 bit).


Well, technically, the situation is very similar -- it's hard to build a
Windows binary (at least if it has external dependencies) so that it will
just work.

Socially, the situation is different -- there are a (relatively) small
number of people building their own stuff. On the Mac, however, you have
homebrew and macports, and ??, so lots of people building their own stuff.
But those aren't the people we need to support with binaries!

Is anyone expecting a binary built for Windows to work with a cygwin python?
Is anyone expecting that they can build a binary on Windows with cygwin and
give it out? That's what we're talking about here with the Mac.

"thanks to the history of bdist_wininst"...

The Mac has a history of bdist_mpkg as well -- not as widely used, and a bit
neglected lately, but it's there. And there is a history of folks providing
binary installers for the python.org Mac builds. But it would be really nice
if we could go to wheels, and use PyPI to distribute them.

It really is the same as Windows -- anyone putting a binary on PyPI has an
obligation to build it so it will work with the python.org Python -- and
it's not inherently any harder to do that than on Windows -- the only
difference is that it may be easier to do it badly -- by simply running
bdist_wheel without thinking about it (i.e. with homebrew, or macports, and
whatever shared libs happen to be linked to).

But again, that's a social problem -- we need to have an understanding
about what is required to put a binary wheel up on PyPI.

Also, while we _could_ have a way to identify python.org vs. homebrew vs.
macports as different platforms, that's not going to help the hard
problem, which is making sure third-party libs are built and included
properly.


 If all builds on OSX are compatible at the ABI/CLib/architecture level
 then there should be no problem. Equally, if incompatible builds
 result in wheels with different compatibility tags, also no problem.
 It's only if 2 incompatible environments use the same tags that there
 could be an issue.


yeah -- but the third party libs are the bigger issue anyway...


 I don't believe that linking with external libraries should be a
 problem here - if wheel X depends on library Y, 

Re: [Distutils] Plans for binary wheels, and PyPi and OS-X

2013-11-01 Thread Chris Barker
On Fri, Nov 1, 2013 at 5:45 PM, Nick Coghlan ncogh...@gmail.com wrote:

 * the key relevant points about users on Windows and Mac OS X are that
 most (perhaps only many on Mac OS X) tutorials and introductory courses
 will direct them to the binary installers on python.org, and such users
 are highly unlikely to have a C compiler installed, so their current out
 of the box user experience with pip is that it doesn't work for even the
 simplest of C extensions.


Thank you for being so articulate about that -- I've been unsuccessfully
trying to say this for this whole thread.

Note also that it's not just what tutorials say, it's what they _should_
say. We really wouldn't want to say to new users:

"Want to learn to program in Python? First, install a compiler, which, by
the way, is a multi-GB download from Apple that you have to register as a
developer to get."

Though I'll also add that binaries for the python.org builds also support
users that may have a compiler, but not the expertise to build third-party
libs, or to build re-distributable binaries for older OS versions, etc.


 * by contrast, in other *nix environments (including cygwin on Windows and
 homebrew etc on Mac OS X), using the system/environment Python is far more
 common, and a C compiler is far more likely to be available

indeed, it's required for homebrew and macports (and cygwin?)


 * accordingly, the following defaults make sense for pip 1.5:
 - allow wheel files from the index server for Windows and Mac OS X
 - allow local wheel files everywhere

sounds good -- and have a stated policy (or at least a recommendation) that
binary wheels for OS-X be built for the python.org Pythons.

* the following options should also be available:
 - force use of wheel files from the index server (useful for private index
 servers)
 - prevent use of wheel files from the index server (useful to force local
 builds on Windows and Mac OS X)
 - prevent use of wheel files (useful to force a local rebuild, overwriting
 the wheel cache)

sounds good.

One question: should pip be able to install an incompatible binary wheel
directly without even a warning? It does now, but I don't think it should.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] pip feedback to user...

2013-11-18 Thread Chris Barker
Is this the right place to discuss UX issues for pip? If not, point me to
the right place; if so, read on:

I think pip's usability could be much improved with a little tweaking to
the messages it prints to the console when it does its thing. For instance,
when I do a:

pip install some_package

I get the message:

Downloading/unpacking some_package

When in fact, pip is not (yet) doing that -- what it is doing is looking
for the package on PyPI (or ???). Which is fine if the package is found, but
if it's not, you then get some confusing messages, like:

Could not find any downloads that satisfy the requirement some-package
Cleaning up...

not too bad -- it does say it wasn't found, but it could seem a bit
odd to a new user -- "but I thought it was being downloaded?"

I was just trying to upgrade a package that I was told could now be pip
installed:

$ pip install --upgrade tracpy
Could not find any downloads that satisfy the requirement tracpy in
/Users/chris.barker/PythonStuff/TracPy/tracpy
Downloading/unpacking tracpy
Cleaning up...
No distributions at all found for tracpy in
/Users/chris.barker/PythonStuff/TracPy/tracpy
Storing complete log in /Users/chris.barker/.pip/pip.log

It turns out that they have not (yet?) registered it with PyPI (and I have
an older copy in /Users/chris.barker/PythonStuff/TracPy/tracpy). But it
does seem weird and confusing that it first says it can't find any
downloads, then tells me it's downloading, then cleaning up, then says it
couldn't find anything...


Anyway, small stuff, but these little things make a difference.

-Chris





-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-02 Thread Chris Barker
On Mon, Dec 2, 2013 at 5:22 AM, Nick Coghlan ncogh...@gmail.com wrote:

 And the conda folks are working on playing nice with virtualenv - I don't
 think we'll see a similar offer from Microsoft for MSI any time soon :)

nice to know...

   a single organisation. Pip (when used normally) communicates with PyPI
   and no single organisation controls the content of PyPI.

can't you point pip to a "wheelhouse"? How is that different?
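
For anyone following along, I mean something like this (URL hypothetical):

    pip install --no-index --find-links=https://example.com/wheelhouse/ some_package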

For built distributions they could do
   the same - except that pip/PyPI don't provide a mechanism for them to
   do so.

I'm still confused as to what conda provides here -- as near as I can tell,
conda has a nice hash-based way to ensure binary compatibility -- which is
a good thing. But the curated set of packages is an independent issue.
What's stopping anyone from creating a nice curated set of packages with
binary wheels (like the Gohlke repo)?

And wouldn't it be better to make wheel a bit more robust in this regard
than to add yet another recommended tool to the mix?

 Exactly, this is the difference between pip and conda - conda is a
 solution for installing from curated *collections* of packages. It's
 somewhat related to the tagging system people are speculating about for
 PyPI, but instead of being purely hypothetical, it already exists.

Does it? I only know of one repository of conda packages -- and it provides
poor support for some things (like wxPython -- does it support any desktop
GUI on OS-X?)

So why do we think that conda is a better option for these unknown curated
repos?

Also, I'm not sure I WANT any more curated repos -- I'd rather have a
standard, set by python.org, that individual package maintainers can choose
to support.

PyPI wheels would then be about publishing default versions of
 components, with the broadest compatibility, while conda would be a
 solution for getting access to alternate builds that may be faster, but
 require external shared dependencies.

I'm still confused as to why packages need to share external dependencies
(though I can see why it's nice...).

But what's the new policy here? Anaconda and Canopy exist already -- do we
need to endorse them? Why? If you want "PyPI wheels would then be about
publishing default versions of components, with the broadest
compatibility" -- then we still need to improve things a bit, but we can't
say we're done.

What Christoph is doing is producing a cross-platform curated binary
 software stack, including external dependencies. That's precisely the
 problem I'm suggesting we *not* try to solve in the core tools any time
 soon, but instead support bootstrapping conda to solve the problem at a
 different layer.

So we are advocating that others, like Christoph, create curated stacks with
conda? Aside from whether conda really provides much more than wheel to
support doing this, I think it's a BAD idea to encourage it: I'd much
rather encourage package maintainers to build standard packages, so we
can get some extra interoperability.

Example: you can't use wxPython with Anaconda (on the Mac, anyway). At least
not without figuring out how to build it yourself, and I'm not sure it will
even work then. (And it is a fricking nightmare to build.) But it's getting
harder to find standard packages for the Mac for the SciPy stack, so
people are really stuck.

So the pip compatible builds for those tools would likely miss out on some
 of the external acceleration features,

that's fine -- but we still need those pip-compatible builds.

And the nice thing about pip-compatible builds (really
python.org-compatible builds...) is that they play well with the other
binary installers --

 By ceding the distribution of cross-platform curated software stacks with
 external binary dependencies problem to conda, users would get a solution
 to that problem that they can use *now*,

Well, to be fair, I've been starting a project to provide binaries for
various packages for OS-X and did intend to give conda a good look-see, but
I really had hoped that wheels were the way now... oh well.
-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-03 Thread Chris Barker
Side note about naming:

I'm no expert, but I'm pretty sure Anaconda is a Python distribution --
Python itself and a set of pre-built packages. conda is the package manager
that is used by Anaconda -- kind of like rpm is used by RedHat. conda is
an open-source project, and thus could be used by any of us completely
apart from the Anaconda distribution.


On Sun, Dec 1, 2013 at 3:38 PM, Paul Moore p.f.mo...@gmail.com wrote:

  had to resort to Google to try to figure out what dev libraries I needed.

 But that's a *build* issue, surely? How does that relate to installing
 Nikola from a set of binary wheels?

Exactly -- I've mostly dealt with this for OS-X -- there are a cadre of
users that want binaries, and want them to just work -- we've had mpkg
packages for a good while, analogous to Windows installers. Binary eggs
never worked quite right, 'cause setuptools didn't understand universal
binaries -- but it wasn't that far from working. Not really tested much
yet, but it looks like binary wheels should be just fine. The concern
there is that someone will be running, say, a homebrew-built Python, and
accidentally install a binary wheel built for the python.org Python -- we
should address that with better platform tags (and making sure pip at least
gives a warning if you try to install an incompatible wheel...).

So what problem are we trying to solve here?

1) It's still a pain to actually build the packages -- similarly to
Windows, you really need to build the dependent libraries statically and
link them in -- and you need to make sure that you build them with the right
SDK, and universally -- this is hard to do right.
  - does conda help you do any of that???

2) non-Python binary dependencies: As it turns out, a number of Python
packages depend on the same third-party non-Python dependencies: I
have quite a few that use libpng, libfreetype, libhdf, ??? Currently, if you
want to distribute binary Python packages, you need to statically link or
supply the dlls, so we end up with multiple copies of the same lib -- is
this a problem? Maybe not -- memory is pretty cheap these days, and maybe
different packages actually rely on different versions of the dependencies
-- this way, at least the package builder controls that.

Anaconda (the distribution) seems to address this by having conda packages
that are essentially containers for the shared libs, and other packages
that need those libs depend on them. I like this method, but it seems to me
to be more a feature of the Anaconda distribution than the conda package
manager -- in fact, I've been thinking of doing this exact same thing with
binary wheels -- I haven't tried it yet, but don't see why it wouldn't work
(sketch below).
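
Roughly what I have in mind -- a lib-only package that dependent wheels
declare as a requirement (all names hypothetical):

    # libpng_bundle/__init__.py -- ships lib/libpng.dylib in the package
    import os

    _HERE = os.path.dirname(os.path.abspath(__file__))

    def library_path():
        # dependent packages load (or are linked relative to) this dylib
        return os.path.join(_HERE, 'lib', 'libpng.dylib')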

I understand you are thinking about non-Python libraries, but all I
 can say is that this has *never* been an issue to my knowledge in the
 Windows world.


yes, it's a HUGE issue in the Windows world -- in fact, such a huge issue
that almost no one ever tries to build things themselves, or build a
different Python distro -- so, in fact, when someone does make a binary,
it's pretty likely to work. But those binaries are a major pain to build!

(By the way, over on python-dev there has been a recent discussion about
stackless building a new Python 2.7 Windows binary with a newer MS compiler
-- which will then create exactly these issues...)

 Outside the scientific space, crypto libraries are also notoriously hard
 to
  build, as are game engines and GUI toolkits. (I guess database bindings
  could also be a problem in some cases)

 Build issues again...


Yes, major ones.

(Another side note: you can't get wxPython for OS-X to work with Anaconda
-- there is no conda binary package, and Python itself is not built in a
way that it can access the window manager... so no, this stuff is NOT
suddenly easier with conda.)

Again, can we please be clear here? On Windows, there is no issue that
 I am aware of. Wheels solve the binary distribution issue fine in that
 environment


They will if/when we make sure that the wheel contains metadata about what
compiler (really, run-time version) was used for the Python build and wheel
build -- but we should, indeed, do that.

 This is why I suspect there will be a better near term effort/reward
  trade-off in helping the conda folks improve the usability of their
 platform
  than there is in trying to expand the wheel format to cover arbitrary
 binary
  dependencies.


and have yet another way to do it? AARRG! I'm also absolutely unclear on
what conda offers that isn't quite easy to address with binary wheels. And
it seems to need help too, so it will play better with virtualenv.

If conda really is a better solution, then I suppose we could
go deprecate wheel before it gets too much traction... ;-) But let's
please not add another one to the mix to confuse people.

Excuse me if I'm feeling a bit negative towards this announcement.
 I've spent many months working on, and promoting, the wheel + pip
 solution, to the 

Re: [Distutils] Handling the binary dependency management problem

2013-12-03 Thread Chris Barker
On Tue, Dec 3, 2013 at 12:48 AM, Nick Coghlan ncogh...@gmail.com wrote:

 Because it already works for the scientific stack, and if we don't provide
 any explicit messaging around where conda fits into the distribution
 picture, users are going to remain confused about it for a long time.

Do we have to have explicit messaging for every useful third-party package
out there?

 I'm still confused as to why packages need to share external dependencies
 (though I can see why it's nice...) .

 Because they reference shared external data, communicate through shared
 memory, or otherwise need compatible memory layouts. It's exactly the same
 reason all C extensions need to be using the same C runtime as CPython on
 Windows: because things like file descriptors break if they don't.


OK -- maybe we need a better term than "shared external dependencies" --
that makes me think "shared library". Also, even the scipy stack is not as
dependent on build environment as we seem to think it is -- I don't think
there is any reason you can't use the standard MPL with Gohlke's MKL-built
numpy, for instance. And I'm pretty sure that even scipy and numpy don't
need to share their build environment more than any other extension (i.e.
they could use different BLAS implementations, etc.). The numpy version
matters, but that's handled by the usual dependency handling.

The reason Gohlke's repo, and Anaconda and Canopy, all exist is because it's
a pain to build some of this stuff, period, not complex compatibility issues
-- and the real pain goes beyond the standard scipy stack (VTK is a killer!).

 Conda solves a specific problem for the scientific community,

well, we are getting Anaconda, the distribution, and conda, the package
manager, conflated here:

Having a nice full distribution of all the packages you are likely to need
is great, but you could do that with wheels, and Gohlke is already doing it
with MSIs (which don't handle dependencies at all -- which is a problem).


 but in their enthusiasm, the developers are pitching it as a general
 purpose packaging solution. It isn't,


It's not? Aside from momentum, and all that, could it not be a replacement
for pip and wheel?


 Wheels *are* the way if one or both of the following conditions hold:

 - you don't need to deal with build variants
 - you're building for a specific target environment

 That covers an awful lot of ground, but there's one thing it definitely
 doesn't cover: distributing multiple versions of NumPy built with different
 options and cohesive ecosystems on top of that.


hmm -- I'm not sure; you could have an Anaconda-like repo built with
wheels, could you not? Granted, it would be easier to make a mistake and
pull wheels from two different wheelhouses that are incompatible, so there
is a real advantage to conda there.

 By contrast, conda already exists, and already works, as it was designed
 *specifically* to handle the scientific Python stack.

I'm not sure how well it works -- it works for Anaconda, and good point
about the scientific stack -- but does it work equally well for other
stacks? Or for mixing and matching?

  This means that one key reason I want to recommend it for the cases
 where it is a good fit (i.e. the scientific Python stack) is so we can
 explicitly advise *against* using it in other cases where it will just add
 complexity without adding value.

I'm actually pretty concerned about this: lately the scipy community has
defined a core scipy stack:

http://www.scipy.org/stackspec.html

Along with this is a push to encourage users to just go with a scipy
distribution to get that stack:

http://www.scipy.org/install.html

and

http://ipython.org/install.html

I think this is in response to years of pain of each package trying to
build binaries for various platforms, and keeping it all in sync, etc. I
feel their pain, and "just go with a scipy distribution" is good advice for
folks who want to get the stack up and running as easily as possible.

But it does not serve everyone else well -- web developers that need MPL
for some plotting, scientific users that need a desktop GUI toolkit,
Python newbies that want IPython but none of that other stuff...

What would serve all those folks well is a standard build of packages --
i.e. built to go with the python.org builds, that can be downloaded with:

pip install the_package.

And I think, with binary wheels, we have the tools to do that.

 Saying nothing is not an option, since people are already confused. Saying
 to never use it isn't an option either, since bootstrapping conda first
 *is* a substantially simpler cross-platform way to get up to date
 scientific Python software on to your system.

again, it is Anaconda that helps here, not conda itself.

  Or
 how about a scientist that wants wxPython (to use Chris' example)?
 Apparently the conda repo doesn't include wxPython, so do they need to
 learn how to install pip into a conda environment? (Note that there's
 no wxPython wheel, so this isn't a good example yet, 

Re: [Distutils] Handling the binary dependency management problem

2013-12-04 Thread Chris Barker
On Wed, Dec 4, 2013 at 12:56 PM, Ralf Gommers ralf.gomm...@gmail.comwrote:

 The problem is explaining to people what they want - no one reads docs
 before grabbing a binary.


right -- so we want a default pip install that will work for most
people. And I think "works for most people" is far more important than
"optimized for your system".

 How many non-sse machines are there still out there? How many non-sse2?


 Hard to tell. Probably 2%, but that's still too much.


I have no idea how to tell, but I agree 2% is too much; however, 0.2% would
not be too much (IMHO) -- anyway, I'm just wondering how much we are making
this hard for very little return.

Anyway, best would be a select-at-runtime option -- I think that's what MKL
does. If someone can figure that out, great, but I still think a numpy
wheel that works for most would still be worth doing, and we can do it now.
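
Something along these lines is what I imagine (module names
hypothetical; a real has_sse2() would query CPUID, e.g. via a tiny C
helper -- the shortcut below only works because SSE2 is part of the
x86-64 baseline):

    import platform

    def has_sse2():
        # crude placeholder: SSE2 is guaranteed on x86-64 / AMD64
        return platform.machine() in ('x86_64', 'AMD64')

    # two builds of the same extension, shipped side by side:
    if has_sse2():
        from mypkg import _core_sse2 as _core
    else:
        from mypkg import _core_nosse as _core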


Some older Athlon XPs don't have it for example. And what if someone
 submits performance optimizations (there has been a focus on those
 recently) to numpy that use SSE4 or AVX for example? You don't want to
 reject those based on the limitations of your distribution process.


No, but we also don't want to distribute nothing because we can't
distribute the best thing.

 And how big is the performance boost anyway?


 Large. For a long time we've put a non-SSE installer for numpy on pypi so
 that people would stop complaining that ``easy_install numpy`` didn't work.
 Then there were regular complaints about dot products being an order of
 magnitude slower than Matlab or R.


Does SSE buy you that? Or do you need a good BLAS? But same point, anyway.
Though I think we lose more users by people not getting an install at all
than we lose by people installing and then finding out they need to
install an optimized version to get a good dot product.



 Yes, 64-bit MinGW + gfortran doesn't yet work (no place to install dlls
 from the binary, long story). A few people including David C are working on
 this issue right now. Visual Studio + Intel Fortran would work, but going
 with only an expensive toolset like that is kind of a no-go -


too bad there is no MS-fortran-express...

On the other hand, saying no one can have a 64-bit scipy because people
that want to build Fortran extensions that are compatible with it are out
of luck is less than ideal. Right now, we are giving the majority of
potential scipy users nothing for Win64.

You know what they say: "done is better than perfect".

[Side note: scipy really shouldn't be a monolithic package with everything
and the kitchen sink in it -- this would all be a lot easier if it was a
namespace package and people could get the non-Fortran stuff by
itself...but I digress.]

 Note on OS-X :  how long has it been since Apple shipped a 32 bit machine?
 Can we dump default 32 bit support? I'm pretty sure we don't need to do PPC
 anymore...


 I'd like to, but we decided to ship the exact same set of binaries as
 python.org - which means compiling on OS X 10.5/10.6 and including PPC +
 32-bit Intel.


no, it doesn't -- if we decide not to ship the 10.3.9 (PPC + 32-bit Intel)
binary, why should that mean that we can't ship the Intel 32+64-bit one?

And as for that -- if someone gets a binary with only 64-bit in it, it will
run fine with the 32+64-bit build, as long as it's run on a 64-bit machine.
So if, in fact, no one has a 32-bit Mac anymore (I'm not saying that's the
case), we don't need to build for it.

And maybe the next python.org builds could be 64-bit Intel only. Probably
not yet, but we shouldn't be locked in forever.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-05 Thread Chris Barker
On Thu, Dec 5, 2013 at 5:52 PM, Donald Stufft don...@stufft.io wrote:


 On Dec 5, 2013, at 8:48 PM, Chris Barker - NOAA Federal 
 chris.bar...@noaa.gov wrote:

  What would really be best is run-time selection of the appropriate lib
  -- it would solve this problem, and allow users to re-distribute
  working binaries via py2exe, etc. And not require opening a security
  hole in wheels...
 
  Not sure how hard that would be to do, though.

 Install time selectors probably isn’t a huge deal as long as there’s a way
 to force a particular variant to install and to disable the executing code.


I was proposing run-time -- so the same package would work right when
moved to another machine via py2exe, etc. I imagine that's harder,
particularly with permissions issues...

-Chris







 -
 Donald Stufft
 PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372
 DCFA




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-06 Thread Chris Barker
On Thu, Dec 5, 2013 at 11:21 PM, Ralf Gommers ralf.gomm...@gmail.comwrote:

 Hmm, taking a compile flag and encoding it in the package layout seems
 like a fundamentally wrong approach.


well, it's a pretty ugly hack, but sometimes an ugly hack that does the job
is better than nothing.

IIUC, the Intel MKL libs do some sort of dynamic switching at run time too
-- and that is a great feature.



 And in order to not litter the source tree and all installs with lots of
 empty dirs,


where "lots" is what, 3? Is that so bad in a project the size of numpy?

 the changes to __init__.py will have to be made at build time based on
 whether you're building Windows binaries or something else.


That might in fact be nicer than the litter, but it also may be a less
robust and more annoying way to do it.



 Path manipulation is usually fragile as well.


My first instinct was that you'd re-name directories on the
fly, which might be more robust, but wouldn't work in any kind of
secure environment -- so a no-go.

But could you elaborate on the fragile nature of sys.path manipulation?
What might go wrong there?

Also, it's not out of the question that, once such a system was in place,
it could be used on systems other than Windows.
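
To make sure we're talking about the same thing, here is the sort of
__init__.py variant selection I picture (a sketch only -- the variant
dirs are hypothetical and the CPU check is stubbed out):

    # package __init__.py, with variant subdirs sse2/ and nosse/
    import os

    _variant = 'sse2'   # a real build/install would pick this via a CPU check
    __path__.insert(0, os.path.join(os.path.dirname(__file__), _variant))
    # submodules are now found in the chosen variant dir first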

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Binary dependency management, round 2 :)

2013-12-06 Thread Chris Barker
On Fri, Dec 6, 2013 at 6:22 AM, Nick Coghlan ncogh...@gmail.com wrote:

  I created a draft of this new section at
  
 https://bitbucket.org/pypa/python-packaging-user-guide/pull-request/12/recommendations-for-numpy-et-al/diff


looks good, thanks!

One note:


In particular, bootstrapping conda via ``pip install conda`` and then
running the ``conda init`` command provides access to all of the pre-built
binaries that Continuum Analytics have created for the free version of
the Anaconda distribution.
 

I've been chatting off-list with Travis, and while it does appear that
Anaconda is more compatible with the python.org installers than I had
thought, there are still some pretty significant rough edges,
particularly with the OS-X build. But it does look pretty promising.

Travis pointed out that I had been pretty critical of endorsing conda in
this thread. He is right. That came from two things:

1) years of frustration with the mess of python packaging, that I thought
we were finally resolving with binary wheels.

2) recent bad experience with Anaconda and teaching python to newbies.

I'm pretty sure we all want "one way to do it" -- i.e. we could just tell
people to install a python.org build, then use pip install to
get everything else -- but that may simply not be practical, and apparently
we're not as close as I thought.

But as I think back over the last few years, I realize that I've been
recommending the python.org binaries across the board, because it is the
basis of a number of different package approaches. So I can tell my
students:

 - try pip install

 - if that doesn't work, look in the Gohlke repo

 - if that doesn't work, look for a binary on the package web page.

 - if that doesn't work, follow the build instructions on the package web
page

And while it would be great if pip install always worked, this really isn't
so bad, and it's not so bad because, in fact, most of the maintainers of
complex packages have been targeting python.org binaries for years.

So if we can't have wheels for everything, we can do pretty well if the
other options (e.g. conda) are python.org-compatible. That's not quite the
case for Anaconda now, but it's not that far off, and Travis is
interested in making it better.

I'm not going to have time to poke at this myself for at least a few weeks,
but at some point, maybe we can, for instance, try to convert the wxPython
binaries to a conda package, or just figure out a post-install hack that
will let it work with Anaconda (the current wxPython binaries for OS-X
already have *.pth file magic to let both the python.org and Apple binaries
use the same package).

So nice work all -- it seems this is not getting as ugly as I feared.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-06 Thread Chris Barker
On Fri, Dec 6, 2013 at 4:33 AM, Nick Coghlan ncogh...@gmail.com wrote:

 In the absence of the perfect solution (i.e. picking the right variant
 out of no SSE, SSE2, SSE3 automatically), would it be a reasonable
 compromise to standardise on SSE2 as lowest acceptable common
 denominator?


+1


 Users with no sse capability at all or that wanted to take advantage
 of the SSE3 optimisations, would need to grab one of the Windows
 installers or something from conda, but for a lot of users, a pip
 install numpy that dropped the SSE2 version onto their system would
 be just fine, and a much lower barrier to entry than well, first
 install this other packaging system that doesn't interoperate with
 your OS package manager at all


exactly -- for example, I work with a web dev that could really use
Matplotlib for a little task -- if I could tell him to pip install
matplotlib, he'd do it, but he just sees it as too much hassle at this
point...



 Are we letting perfect be the enemy of better, here?


I think so, yes.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-06 Thread Chris Barker
On Fri, Dec 6, 2013 at 5:06 AM, David Cournapeau courn...@gmail.com wrote:

 As Ralf, I think it is overkill. The problem of SSE vs non SSE is because
 of one library, ATLAS, which has IMO the design flaw of being arch specific.


yup -- really designed for the end user to build it themselves


 MKL does not have this issue, and now that openblas (under a BSD license)
 can be used as well, we can alleviate this for deployment. Building a
 deployment story for this is not justified.


So OpenBLAS has run-time selection of the right binary? Very cool! So are
we done here?

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Handling the binary dependency management problem

2013-12-06 Thread Chris Barker
On Fri, Dec 6, 2013 at 5:16 AM, Thomas Heller thel...@ctypes.org wrote:

 Am 06.12.2013 13:22, schrieb Nick Coghlan:



 Manipulation of __path__ at runtime usually makes it harder for

 modulefinder to find all the required modules.


 Not usually, always. That's why
 http://docs.python.org/2/library/modulefinder#modulefinder.AddPackagePath
 exists :)


 Well, as the py2exe author and the (inactive, I admit) modulefinder
 maintainer I already know this.


modulefinder fails often enough that I've never been able to package a
non-trivial app without a bit of "force-include all of this package" (and
"don't-include this other thing!"). So while it's too bad, this should not
be considered a deal breaker.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Install a script to prefix/sbin instead of prefix/bin

2013-12-07 Thread Chris Barker
Just a note here:

the wxWidgets (and thus wxPython, natch) project has a wxStandardPaths
object:

http://docs.wxwidgets.org/trunk/classwx_standard_paths.html

It provides a cross platform way to get, well, the standard paths an
application might need:

GetAppDocumentsDir ()
GetConfigDir ()
GetDataDir ()
GetDocumentsDir ()
GetExecutablePath ()
GetInstallPrefix ()
GetLocalDataDir ()
GetLocalizedResourcesDir ()
GetPluginsDir ()
GetResourcesDir ()
GetTempDir ()
GetUserConfigDir ()
GetUserDataDir ()
GetUserLocalDataDir ()
...

These all do the right thing on the supported platforms -- something may
be in the application install dir on Windows, in the app bundle on the Mac,
and in /etc on Linux, for instance.
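
A minimal sketch of what this looks like from Python, via wxPython -- the
only assumption is that wx is importable; the printed paths differ per
platform:

    import wx

    app = wx.App(False)  # StandardPaths needs a wx.App instance
    sp = wx.StandardPaths.Get()
    print(sp.GetUserConfigDir())  # e.g. ~/Library/Preferences on OS-X
    print(sp.GetUserDataDir())    # per-user application data
    print(sp.GetTempDir())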

granted, wx is an application framework, so it needs this stuff, but while
pip is about installing python packages, there really is no clear line
between a package with a script or two and an application, and even simple
scripts may need a few of these things. I don't see why we couldn't include
a set like this, and have platform-specific mappings.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Platform specific destinations in wheel files?

2013-12-23 Thread Chris Barker
On Sat, Dec 21, 2013 at 2:57 AM, Nick Coghlan ncogh...@gmail.com wrote:

 compliant daemon like cobblerd as a wheel file - using Python specific
 formats to define the layout of full applications, not just libraries.



 I'd generally been resisting the idea of supporting this (since I
 favour interoperating with system packaging tools where appropriate
 over attempting to replace them entirely), but in this case I'm
 starting to think it may be necessary to support these layouts in the
 next version of the wheel format in order to *also* support automated
 conversion of upstream projects to policy compliant system packages.


hmm -- I tend to think, like you, that this isn't a problem wheel should
solve, but I can also see the advantages... for the moment, though, talking
about how it would solve it may help clarify whether it should.

 ... adding a new name.app subdirectory in parallel.

 A wheel that had content in app would be inherently platform
 specific - you wouldn't be permitted to use the new directory in a
 cross-platform wheel file. The defined subdirectories of app would
 also be platform specific.


Is this necessary -- couldn't there be a way to provide the info in a
cross-platform way, and have it mapped to the platform-specific locations
at install-time?



 All POSIX systems would at least support the fhs subdirectory. For a
 system install, this would map to /, for a virtualenv install it
 would map to the root of the virtualenv and for a user install it
 would map to ~/.local.


then you explicitly put in bin, sbin, share, whatever?

This seems really clunky to me; it also forces platform dependence, and is
fundamentally tied to how POSIX does things.

Maybe it's not possible, but I suggest that we could pre-define the
locations that might be needed:

executables  (bin)
system_executables  (sbin)
user_executables  (./bin)
documentation (doc)
system_data_files (share ?)
user_data_files (./share )
app_configuration (/etc/appname)
user_app_configuration (./.app_name : ./Library )



This could end up being a pretty big list, but I think it could be finite.

Then at install-time, the installer maps these to the appropriate place on
the system.
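
To make that concrete, here is a minimal sketch of the installer-side
mapping -- the logical names and path templates are all hypothetical, not
any real pip/wheel API:

    import sys

    # per-platform templates for the hypothetical logical locations
    LOCATION_MAPS = {
        "win32": {
            "executables": r"{prefix}\Scripts",
            "app_configuration": r"{appdata}\{appname}",
        },
        "darwin": {
            "executables": "{prefix}/bin",
            "app_configuration": "~/Library/Preferences/{appname}",
        },
        "linux": {
            "executables": "{prefix}/bin",
            "app_configuration": "/etc/{appname}",
        },
    }

    def resolve(location, **context):
        # normalize 'linux2'/'linux3' etc. to plain 'linux'
        plat = "linux" if sys.platform.startswith("linux") else sys.platform
        return LOCATION_MAPS[plat][location].format(**context)

A wheel would then only ever name the logical location, and resolve() picks
the actual path at install time.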

It's a little different application, but wxWidgets does this pretty
successfully with wxStandardPaths.

-Chris

 I'm not sure what other subdirectories would be appropriate for
 Windows or Mac OS X, although I expect being able to install into
 Program Files and Application Data would be interesting for Windows
 apps, and into an application folder for Mac OS X.

 It's really the potential for FHS support that drives my interest in
 the idea, but if we're going to do something like this at all, it
 shouldn't be POSIX specific.

 Cheers,
 Nick.

 --
 Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Platform specific destinations in wheel files?

2013-12-24 Thread Chris Barker
On Mon, Dec 23, 2013 at 1:43 PM, Daniel Holth wrote:

 Agreed. My biggest concern with this whole idea is that developers
 (typically POSIX developers, but it applies equally to all) will
 *think* they need something like sbin because they are used to the
 concept from their environment, and so write their wheel to use it and
 hence be platform specific.


exactly.

 However, with a little thought (possibly
 hardly any thought in the case of sbin :-)) they could have chosen a
 more generic approach which makes their project available to users of
 other platforms.


right, but setting the system up to allow a prefix and then hard-specifying
paths under that makes it impossible to do it cross-platform!

 Portable by default should be the principle.


+1 ... and at least possible (without writing multiple platform-specific
versions...)!

Another thought is:

who should control where things are put?

a) the package developer?
b) the python system maintainer?

I think clearly the answer is (b) -- i.e. we'll have reasonable defaults in
the python.org builds, but Anaconda or Canopy, or a BSD or a Linux distro,
or Raspberry Pi, or Micro... could all define their own paths for the
standard locations. I think this is much better than the package
maintainers fixing that.

The GNU autoconf paths are the obvious choice.


Certainly a good place to start.

One issue -- that (and the FHS) is built around the idea of a prefix,
where you can shift the whole pile to /usr or /usr/local, or /opt, or???

But that concept doesn't work on all platforms, so we should be careful
about isolating it.

-Chris



 It would be really easy
 to try adding these to the dict that wheel installers use to relocate
 stuff.
 http://www.gnu.org/prep/standards/html_node/Directory-Variables.html




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Platform specific destinations in wheel files?

2013-12-27 Thread Chris Barker
On Tue, Dec 24, 2013 at 2:28 PM, Nick Coghlan ncogh...@gmail.com wrote:

  But that concept doesn't work on all platforms, so we should be careful
 about isolating it.

 Encapsulating that assumption is why I think the gnu nesting is
 justified. There are layout expectations inherent in the autoconf directory
 design that just don't make sense on Windows, so any package expecting them
 is going to be a bit quirky if installed on Windows.

I'm confused now as to what has been proposed or is being discussed, or...

I _thought_ that this thread started with a proposal that package authors
would do something like specifying a file hierarchy for the stuff they are
delivering:

/bin/some_scripts
/share/some_data
/man/some_docs


then at installation time, the python distro would decide where to copy all
that.

But this would be worthless on non-GNU systems, and would require:
 1) the package writer to write three or four versions
 2) wheels to be platform-dependent unnecessarily.

So my suggestion was to define the various locations where stuff may need
to be installed at a higher level:

place_to_put_top_level_scripts
place_to_put_documentation
place_to_app_data
place_to_put_user_configuration_files
place_to_put_user_system_configuration_files
...

Then the python distro would map these to actual paths at install time: GNU
systems would map the GNU locations, Windows to
Windows-appropriate locations, OS-X to OS-X locations, etc. This would
also allow python distros like Anaconda or MacPorts python, or ??? to do
their own thing, which may not be putting everything in /usr/local, or...

That may be what you had in mind, but I got confused.

-Chris

 Cheers,
 Nick.

 
  -Chris
 
 
 
  It would be really easy
  to try adding these to the dict that wheel installers use to relocate
  stuff.
 http://www.gnu.org/prep/standards/html_node/Directory-Variables.html
 
 
 
 
  --
 
  Christopher Barker, Ph.D.
  Oceanographer
 
  Emergency Response Division
  NOAA/NOS/ORR(206) 526-6959   voice
  7600 Sand Point Way NE   (206) 526-6329   fax
  Seattle, WA  98115   (206) 526-6317   main reception
 
  chris.bar...@noaa.gov
 
  ___
  Distutils-SIG maillist  -  Distutils-SIG@python.org
  https://mail.python.org/mailman/listinfo/distutils-sig
 




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Packaging today

2014-01-06 Thread Chris Barker
On Mon, Jan 6, 2014 at 12:26 PM, Steve Spicklemire st...@spvi.com wrote:


 avoid troubling anyone with pesky questions. In that respect I've
 apparently failed, because here comes the question!


I think this is a use case that is quite useful for us all to chew on a
bit...

1st -- yes Anaconda refers to the distribution from Continuum. Note that
conda is the packaging system that Anaconda uses, but it can also be used
independently of the distribution.


 I'm helping out with a python package: vpython http://vpython.org


[side note: fairly recently a port of VPython to wxPython was done -- is
that what you are using? Notable because I think wxPython is still not
available for Anaconda...]


 and I'm also teaching an intro scientific computing class this spring. I'm
 mostly a Mac/Linux user, but my students are often windows users. I would
 love to permit my students to use enthought/canopy and/or continuum
 analytics (C.A.) along with vpython.


Either/or? As an instructor, I'd recommend you pick one and go with it --
if you need wx, that means Canopy for now. Alternatively, you could suggest
the python.org builds, and point your users to binaries they can get
elsewhere (Christoph Gohlke's site for Windows...)

At the moment we're creating binary releases of vpython for windows and mac
 and posting them on sourceforge 
 https://sourceforge.net/projects/vpythonwx/.


Are these for the python.org builds? good for you!

Bruce has been building the windows binary using VC (no setup.py) in a way
 that's compatible with python.org python for windows. I've been building
 the mac version using a setup.py script I cobbled together that works on
 MacOSX and Linux.


Why not distutils for building Windows? I find it really helpful.


 I've noticed that the anaconda system that C.A. installs uses MinGW on
 windows to build extensions.


I think Canopy does that too -- at least it did a few years ago. But I
_think_ you can build extensions with either MinGW or MSVC for the same
binary python -- if it's set up right ;-)

I'd love to figure out how to build vpython under this system so that my
 windows users could use them together transparently.


You want to take a look at conda:

https://github.com/pydata/conda

If you can build a conda recipe then you are set to go...
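
(If you haven't built a recipe before: it's mostly a meta.yaml plus a
build script, and -- assuming the package is on PyPI -- conda can generate
a starting point for you, something like:

    conda skeleton pypi vpython
    conda build vpython

though I'd expect the generated recipe to need hand-editing for anything
with C dependencies.)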

That being said, it is supposed to be a goal for Anaconda to be binary
compatible with the python.org binaries -- so you may well be able to build
the way you are, and give the users a way to install it into Anaconda. In
theory, binary wheels are the way to do this.

I'm pretty sure I could work out how to build vpython with continuum
 analytics on the mac (which means building boost + wxPython using the C.A.
 python).


ahh -- you are using wx -- I'd check with the CA folks and see where they
are at -- they said they were working on a wxPython package, and I'm sure
they'd like help and testing...


 Is there any way, *today*, to incorporate dependencies on external
 libraries (e.g., boost) in setup.py?


no -- except by hand with custom code.


I'm still a little unclear on your goals here. If you want to simply be
able to tell your students to use Anaconda, then look into conda and the CA
help lists -- conda is more or less designed to solve these sorts of
problems. Also, the odds are good that Anaconda already has boost, and if
not, someone has done a conda recipe for it:

https://github.com/faircloth-lab/conda-recipes/tree/master/boost

If you want your users to be able to use any of:

Anaconda
Python.org python
Canopy

Then I'd look into building binary wheels, and see if you can get them to
work with Anaconda and Canopy.

Note: distutils does not address the third-party non-python dependency
problem -- conda does try to address it (though it's still not clear to me
whether it helps you build them...). You may also want to look at Gattai,
which aims to help you build them:

http://sourceforge.net/projects/gattai/

HTH,
  -Chris




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Packaging today

2014-01-07 Thread Chris Barker
On Mon, Jan 6, 2014 at 3:20 PM, Steve Spicklemire st...@spvi.com wrote:

 Thanks Chris for the detailed reply.


Well, I'm trying to sort out similar issues myself

Right. My impression is/was that python.org/CA/Canopy were all different
 builds of python that were *not* interoperable.


well, in the case of Anaconda, Travis told me the intent was that it would
be -- the reality, I'm not sure about.



 So that a binary built with one could not generally be expected to work
 with another. If that's not true, then maybe this is a non-problem. I guess
 I should just try it and see what happens.


it's still tricky to get things to install correctly -- the Windows (and
Mac) installers expect python to be in a particular location -- Anaconda is
not there.

 Why not distutils for building Windows? I find it really helpful.

 I don't even have a 'real' windows system (only VirtualBox) and I don't
 have VC Studio,


you should be able to do it with the free Visual Studio Express 2008. A bit
hard to find an installer these days, but I think it's still there. I've
had much better luck with that than MinGW.

 Do you think the build-wxpython.py script would work under windows with
 MinGW? I guess that's probably kind of a naive hope. ;-)


I doubt it -- but again, VS2008 Express might build it OK -- but then Robin
provides installers for wx anyway.


 Well I guess I am too. I was impressed with CAs ability to use 'pip' on
 windows to install plotly right away. It's almost like working in unix. I
 liked that!



is plotly pure python? In that case, it's pretty easy, really.

/anaconda/bin/pip install vpython


pip install with compiled binaries is a different beast -- I _think_ pypi
is now set up to find binary wheels that match the python.org python. I
have no idea if those will install under Anaconda. But you probably want
conda install vpython if you want Anaconda anyway.



 and it would just work.

 I understand that's impossible at the moment. But if I could create
 instructions and/or build a set of binary files a student could easily
 install that would give them:

 1) vpython
 2) matplotlib
 3) ipython
 4) scipy


we're pretty close to having all these as binary wheels now. There isn't
much stopping it. vpython is up to you. But wx is not there -- though if you
can get it to build on Windows, making a wheel of it should be easy. I
imagine Robin would be happy to put them up on PyPI.

I'd be ecstatic. I'll also check Chris Gohlke's site. Maybe I don't need
 all the bells and whistles of Canopy/CA etc.


That is a GREAT resource!

Ultimately I'd like to help Bruce package vpython in such a way that folks
 can use 'pip' to include wx and vpython in whichever python distribution
 they happen to choose without a lot of fuss.


It's not really a full-on goal for Anaconda or Canopy to be fully pip
compatible -- so that may be a bit of a fantasy...

Also: you can point pip at a custom wheelhouse -- i.e. a collection of
wheels that you put together.
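
(That's just pip's --find-links option -- a sketch with a made-up URL:

    pip install --no-index --find-links=https://example.com/wheelhouse vpython

where --no-index keeps pip from falling back to PyPI, if that's what you
want.)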

In your position, I'd be tempted to provide a full set of wheels for the
python.org build for everything that you need that isn't already
pip-installable. Then point your students to that.

If you're lucky, those same wheels may work with Anaconda, or even Canopy.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Packaging today

2014-01-08 Thread Chris Barker
On Wed, Jan 8, 2014 at 1:48 AM, David Cournapeau courn...@gmail.com wrote:

 We don't use mingw to build packages distributed within canopy (at least
 not anymore). We build everything with MSVC 2008, as mixing mingw/MSVC
 often causes trouble.


so is Canopy binary-compatible with the python.org builds?

i.e would the same binary wheel work for both?

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Packaging today

2014-01-10 Thread Chris Barker
What David said, plus:

On Thu, Jan 9, 2014 at 10:43 PM, Steve Spicklemire st...@spvi.com wrote:

 So, related question: Should the Mac binaries also work with the
 python.org mac build?


Not sure what "also" is with respect to, but the python.org builds are a
good common denominator:

The Apple builds have their issues:
  - Apple never upgrades them
  - You can't re-distribute them (build stuff with Py2app)... at least
not without violating copyright.

While it's appealing for people to not have to install anything, if they
are installing 3rd party packages they are installing stuff, so one extra
install at the start is not a big deal.

So: if you are going to support one binary -- it should be the python.org one.

It is a bit of a pain to build binaries for the Python.org builds, as they
are universal and support older OS versions.

Personally, I think we should address this by:

1) having a centralized project for building various binary dependencies
that are compatible with the python.org builds -- why should multiple
package distributors all have to figure out how to build, e.g., freetype
correctly?

I've created a repo for this, but haven't gotten far:

https://github.com/MacPython/osxinst

let me know if you are interested in contributing.

2) Maybe it's time to put out an official python.org build that's
simpler: perhaps 10.7+ 64bit Intel only. But I'm not sure how many folks
still need 32 bit.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] pip on windows experience

2014-01-23 Thread Chris Barker
On Thu, Jan 23, 2014 at 12:25 PM, Thomas Heller thel...@ctypes.org wrote:

 Did I say this before?  I would suggest that numpy develops a way
 where all the SSE binary variations would be installed, and the
 appropriate ones be loaded at runtime, depending on the user's CPU
 capabilities.  This would also allow py2exe'd distributions to include
 them all.


That was discussed on the numpy list, and would be really nice, but it may
also be really difficult. OS-X has built-in support for
multi-architecture binaries, but Windows does not, and while selecting a
particular .dll (or .pyd) to load at run-time would be
fairly straightforward, numpy has more than one, and then there is the
whole scipy stack, and all the third-party stuff compiled against it.

I suspect this would have to be built into the python importing and
distutils build system to be workable. But maybe someone smarter than me
will figure it out.
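
(For a single module, a sketch of how the run-time selection could look --
the module names are hypothetical, but IsProcessorFeaturePresent is a real
kernel32 call, and flag 10 is the documented SSE2 check:

    import ctypes
    import importlib

    PF_XMMI64_INSTRUCTIONS_AVAILABLE = 10  # SSE2

    def _has_sse2():
        # Windows-only: ask the OS whether SSE2 instructions are available
        return bool(ctypes.windll.kernel32.IsProcessorFeaturePresent(
            PF_XMMI64_INSTRUCTIONS_AVAILABLE))

    # pick the right compiled variant at import time
    _impl = importlib.import_module(
        "_mylib_sse2" if _has_sse2() else "_mylib_nosse")

it's the combinatorial explosion across numpy's several modules and
everything compiled against them that makes it hard, not the check itself.)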


 Some feedback from the people who did try those wheels would help. I
 asked for that on the numpy list after creating them, but didn't get
 much. So I haven't been in a hurry to move them over to PyPi.


Serious chicken-egg problem there


  I would have tried wheels for windows, python 3.3 or 3.4, but there
 aren't any.


Yeah, we need to get those up -- SSE2-only ones would work for MOST people.

-CHB


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] pip on windows experience

2014-01-24 Thread Chris Barker
On Fri, Jan 24, 2014 at 2:18 AM, Nick Coghlan ncogh...@gmail.com wrote:

 In return, as Paul points out, it becomes substantially easier for people
 that *aren't* wholly invested in the scientific Python stack to try it out
 with their regular tools, rather than having to completely change how they
 work with Python.

This is a really important constituency, actually. And one that has been
neglected for a while.

 Also consider that, given the status quo, any users that might see that
 new error instead get even *more* incomprehensible errors as pip attempts
 to build NumPy from source and fails at doing so.

well, numpy _should_ build out of the box with nothing special if you are
set up to build regular extensions. I understand that a lot of Windows users
are not set up to build extensions at all, but they are presumably used to
getting "compiler not found" errors (or whatever the message is). But you
won't get an optimized numpy, and much of the rest of the stack is harder
to build: scipy, matplotlib...

So a set of working binary wheels would be great. And while we in the
numpy community don't really want a lot of "numpy is slower than MATLAB"
FUD out there, I still think it's better to get a sub-optimal, but working,
build out there. The "should I use python instead of MATLAB?" crowd would
be better served by one of the other options anyway...

So how rare are non-SSE2 systems? Any way to find out? I'm guessing rare
enough that we can a) not worry about it, and b) those users will know they
have an old system and may expect issues, particularly with something
billed as being for high-performance computation.

So I say SSE2 -- but if we do think there are a lot of non-SSE2 users out
there, then do SSE1-only -- it would still work just fine for casual use.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] pip on windows experience

2014-01-24 Thread Chris Barker
On Fri, Jan 24, 2014 at 2:40 PM, Paul Moore p.f.mo...@gmail.com wrote:

 So no, numpy does not build out of the box. Ah well.


Darn -- it used to, and it should. It has shipped for years with a LAPACK
lite, and shouldn't need any Fortran. It used to not even look for LAPACK
with a default configuration.

But I haven't done it for years, so who knows when this might have been
broken?

-CHB






-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] pip on windows experience

2014-01-29 Thread Chris Barker
On Sat, Jan 25, 2014 at 4:29 PM, Nick Coghlan ncogh...@gmail.com wrote:

 To put the but what if the user doesn't have SSE2 support? concern in
 context, it should only affect Intel users with CPUs older than a Pentium 4
 (released 2001), and AMD users with a CPU older than an Opteron or Athlon
 64 (both released 2003). All x86/x86_64 CPUs released in the past decade
 should be able to handle SSE2 binaries, so our caveat can be if your
 computer is more than a decade old, 'pip install numpy' may not work for
 you, but it should do the right thing on newer systems.

Exactly

  However, from my perspective, having NumPy readily available to users
 using the python.org Windows installers for Python 3.4 would
 *significantly* lower the barrier to entry to the Scientific Python stack
 for new users on relatively modern systems when compared to the 4 current
 options

+1

with a note: This isn't just for users of the SciPy Stack -- there are a
LOT of use-cases for just numpy by itself. Not that I don't want folks to
have easy access to the rest of the stack as well -- just sayin'

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] pip on windows experience

2014-01-29 Thread Chris Barker
On Wed, Jan 29, 2014 at 2:04 PM, David Cournapeau courn...@gmail.comwrote:

 I think the SSE issue is a bit of a side discussion: most people who care
 about performance already know how to install numpy. What we care about
 here are people who don't care so much about fast eigenvalue decomposition,
 but want to use e.g. pandas. Building numpy in a way that supports every
 architecture is both doable and acceptable IMO.


Exactly -- I'm pretty sure SSE2 is being suggested because that's the
lowest common denominator that we expect to see a lot of -- if there really
are a lot of non-SSE2 machines out there we could leave that off, too.

 Building numpy wheels is not hard, we can do that fairly easily (I have
 already done so several times, the hard parts have nothing to do with wheel
 or even python, and are related to mingw issues on win 64 bits).


David,

Where is numpy at with building out of the box with the python.org binary
for Windows, and the standard MS compilers that are used with those
builds? That used to be an easy "python setup.py install" away -- has that
changed? If so, is this a known bug, or a known we-aren't-supporting-that?

i.e. it would be nice if anyone setup to build C extensions could just
build numpy.

-Chris

Just to clarify: you actually can install numpy on windows with
 python.org installers fairly easily by using easy_install already (we
 upload a bdist_wininst compatible binary which should not use any
 CPU-specific instructions). It looks like those are missing for 1.8.0,
 but we can fix this fairly easily.


presumably just as easy to do a binary wheel then -- I vote for that.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] OS X and PEP 425 / wheels

2014-03-06 Thread Chris Barker
On Thu, Mar 6, 2014 at 4:27 PM, MinRK benjami...@gmail.com wrote:

 I proposed a patch https://github.com/pypa/pip/pull/1465 to pip, with
 respect to treatment of the platform tag on OS X, and Chris Barker proposed
 that I bring the discussion here.


Note -- there is some more discussion on that patch...

  The situation:

 PEP 425 describes the platform tag as:

 The platform tag is simply distutils.util.get_platform() with all hyphens
 (-) and periods (.) replaced with underscore (_).

 but the PEP makes no mention of what should be done on OS X. On OS X,
 get_platform() has the form:

 macosx_10_6_intel


 ...

 1. ...
 2. support multi-arch names (intel, universal) on their respective
    components:
    - intel is valid on {x86_64, i386}
    - universal is valid on {intel, x86_64, i386, ppc64, ppc}

easy_install, like pip, also does strict comparison here, so this
would be new behavior.

yup -- and easy_install was actually quite broken for binary eggs for a
universal build -- so this would be great.

though a note: "universal", as used by the python.org builds, means
only i386+ppc -- in theory one could do a quad build, but no one ever did.
Though ppc is pretty old now -- I don't think we need to worry about that
for anything future-looking anyway.

I have a wheel (pyzmq), which works on any intel-based Python targeting OS
 X = 10.6. To express this with pip-1.5, the filename has to be:


 pyzmq-14.1.0-cp27-none-macosx_10_6_intel.macosx_10_6_x86_64.macosx_10_6_i386.macosx_10_7_intel.macosx_10_7_x86_64.macosx_10_7_i386.macosx_10_8_intel.macosx_10_8_x86_64.macosx_10_8_i386.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_9_i386.whl

 and it has to grow every time there is a new OS release.

clearly not ideal!

Any feedback from experts, especially if my understanding of deployment
 targets or fat binaries is incorrect, would be much appreciated.

I'm no expert, but this looks good to me.

As pointed out in the comments on the patch, there may be some issues with
C++ extensions and building on 10.9 -- we'll have to see how that turns
out...

-Chris





-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] OS X and PEP 425 / wheels

2014-03-07 Thread Chris Barker
On Fri, Mar 7, 2014 at 9:50 AM, Brian Wickman wick...@gmail.com wrote:

 I've also run into similar issues.  What I do with PEX is fudge PEP425
 tags for OS X in order to be more correct:

 https://github.com/wickman/commons/blob/wickman/pep425/src/python/twitter/common/python/pep425.py



 I'd love if some variation of this code could be added to setuptools or
 whatever.


+1 This looks good to me.

In theory, this logic should really go with python itself, not a
third-party lib. i.e., once built, a Python implementation will match
particular tags -- having a third party lib keep track of that seems to be
putting it in the wrong place.

In practice, there are only so many versions out there, and we're only
trying to support some of them, so it's probably fine for it to be in pip
(and it can be upgraded so much more easily there)
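
(For reference, the tag derivation itself is trivial -- this is just the
PEP 425 text turned into code:

    import distutils.util

    # e.g. 'macosx-10.6-intel' -> 'macosx_10_6_intel'
    tag = distutils.util.get_platform().replace("-", "_").replace(".", "_")

the hard part is deciding which *other* tags a given interpreter should
also accept.)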

-CHB


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Is build an inherently arbitrary-code process?

2014-03-28 Thread Chris Barker
On Thu, Mar 27, 2014 at 2:23 PM, Nick Coghlan ncogh...@gmail.com wrote:

 On 28 Mar 2014 05:42, Daniel Holth dho...@gmail.com wrote:
  I became convinced that build was an inherently arbitrary-code
  process, and not something to be universally handled by a declarative
  system,

 It wasn't an accident that last years PyCon panel was subtitled setup.py
 *install* must die :)

 As others have suggested, declarative build will never be more than an 80%
 solution, and then beyond that, it should be a question of plumbing to
 invoke the project's own build system (and ideally that plumbing will be
 per build system, not per project).

Agreed -- I have been poking at this a bit, and trying to make gattai work
for me:

http://sourceforge.net/projects/gattai/

(It's a by-and-for-python build system that essentially invokes the native
build systems: make, setup.py, nmake, etc... but that's not really the
point.) The point is that I hit a wall pretty quickly with
its declarative approach (JSON files). I find I can't do what I need to do
with straight declaration (and I'm hacking gattai to support that).

However, for the most part, all I need to be able to do is run arbitrary
code to set up the declarations. The stuff that's been talked about:
finding the right libraries to link to, that sort of thing.

I actually think that aspect of the setup.py approach works pretty well,
even though it does sometimes get kind of messy.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] ctypes and shared libs, and wheels, oh my!

2014-06-11 Thread Chris Barker
Folks,

I'm trying to help figure out how to do binary wheels for a package that
relies on ctypes and a bundled shared lib (dll, .so, etc.)

The trick here is that the python code is quite version and platform
independent: py2 and py3, version 2.7 and 3.3+ (I think)

(it's py_enchant, if anyone is interested:
http://pythonhosted.org/pyenchant/)

So the trick is that the binary wheel will be platform dependent, but not
the code itself, so ideally we'd have one wheel that, for instance (the
case at hand), should work on any OS-X box version 10.6 and above, with
any of python 2.7, 3.3, 3.4 (and up?)

Usually, a binary wheel involves compiled extensions, and thus is tied to
a particular python version -- so this is an odd case.

We tried:

pyenchant-1.6.6-py2.py3-none-macosx_10_6_intel.whl

which seems to be saying: any version of python2 or python 3, but only on
macosx 10.6

but trying to install that on my machine (py2.7, os-x 10.6) gives:

pyenchant-1.6.6-py2.py3-none-macosx_10_6_intel.whl is not a supported wheel
on this platform.

(side note: it would be really great if that could be a more useful message
-- what part of the file name didn't match? I know that's tricky, as there
is a whole pile of heuristics to go through, but maybe a way to dump that
process would be helpful...)

Now, this may, in fact, be tied to CPython (I have no idea if ctypes
is available on pypy or jython or IronPython...). So I tried:

pyenchant-1.6.6-cp27-none-macosx_10_6_intel.whl

that does, indeed, install on my system.

Also:

pyenchant-1.6.6-cp27.cp33-none-macosx_10_6_intel.whl

works.

As 2.7 is really the only py2 that matters much, no biggie, but is there a
way to get 3.3 and 3.4 and ??? all at once (I don't have py3 on that
machine, so didn't test that...)

So: how should this be done? Is the above the best option there is?

Thanks,
  -Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] ctypes and shared libs, and wheels, oh my!

2014-06-11 Thread Chris Barker
On Wed, Jun 11, 2014 at 9:09 AM, Daniel Holth dho...@gmail.com wrote:

 This is in the bug tracker already. We need to add the py2-none-arch tags
 etc. to Pip's list.

Great, thanks. Is the idea that:

pyenchant-1.6.6-py2.py3-none-macosx_10_6_intel.whl

should have worked? And will in some future version?

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] ctypes and shared libs, and wheels, oh my!

2014-06-11 Thread Chris Barker
On Wed, Jun 11, 2014 at 9:50 AM, Daniel Holth dho...@gmail.com wrote:

 If you insert those tags into this list, in the
 pip.pep425tags.get_supported() function:
 https://github.com/pypa/pip/blob/develop/pip/pep425tags.py#L38 then
 your wheel will become installable.

 I'd probably put it right after these:
 https://github.com/pypa/pip/blob/develop/pip/pep425tags.py#L78


OK, thanks -- that's going to take some poking around.

Maybe once we have the build actually working, I'll take a look at that.
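
(For anyone else poking at this: you can see what your pip will currently
accept with something like the following -- note pep425tags is a
pip-internal module, so it may move around between versions:

    from pip import pep425tags

    for tag in pep425tags.get_supported():
        print(tag)

the wheel filename tags get matched against that list.)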

-Chris





  On Wed, Jun 11, 2014 at 12:32 PM, Chris Barker chris.bar...@noaa.gov
 wrote:
  On Wed, Jun 11, 2014 at 9:09 AM, Daniel Holth dho...@gmail.com wrote:
 
  This is in the bug tracker already. We need to add the py2-none-arch
 tags
  etc. to Pip's list.
 
  Great, thanks. Is the idea that:
 
  pyenchant-1.6.6-py2.py3-none-macosx_10_6_intel.whl
 
  should have worked? And will in some future version?
 
  -Chris
 
 
  --
 
  Christopher Barker, Ph.D.
  Oceanographer
 
  Emergency Response Division
  NOAA/NOS/ORR(206) 526-6959   voice
  7600 Sand Point Way NE   (206) 526-6329   fax
  Seattle, WA  98115   (206) 526-6317   main reception
 
  chris.bar...@noaa.gov




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] build a wheel with waf instead of setuptools

2014-07-30 Thread Chris Barker
On Fri, Jul 25, 2014 at 7:21 AM, Daniel Holth dho...@gmail.com wrote:

  This kind of thing will require us to implement a flag that tells pip
  setup.py cannot install; go through wheel which is somewhere in the
  plans..


couldn't you write a file called setup.py, with the core API (i.e.
"setup.py build | install"), but that called waf instead of distutils to do
the actual work?

or does pip do more than simply call the setup.py script?
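
To be concrete, here's a minimal sketch of the shim I mean -- the waf
invocation is hypothetical, and a real one would need to pass through
install options:

    #!/usr/bin/env python
    # setup.py shim: expose the distutils command-line API, delegate to waf
    import subprocess
    import sys

    if __name__ == "__main__":
        cmd = sys.argv[1] if len(sys.argv) > 1 else "build"
        if cmd in ("build", "install"):
            subprocess.check_call(["./waf", "configure", cmd])
        else:
            sys.exit("setup.py shim: unsupported command %r" % cmd)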

-Chris





   I don’t think there is any plans to tell pip *not* to use a setup.py
 and to
  use a Wheel instead. Rather I think the plans are to enable pluggable
  builders so that a sdist 2.0 package doesn’t rely on setup.py and could
 use
  a waf builder (for instance) plugin.

 Just a flag that tells pip it can't use the install command and has
 to do package - install package on an sdist.
 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Building Python extensions on 64-bit Windows using the SDK compilers

2014-09-24 Thread Chris Barker
On Wed, Sep 24, 2014 at 6:55 AM, Paul Moore p.f.mo...@gmail.com wrote:

 Thanks for the pointer. (Also thanks to Allen Riddell). I'll take a
 look. Ideally, what I'd like to do is write something up to help
 non-Windows experts get things up and running, so this will be very
 useful.


Thanks -- that would be great. But really, why is this so hard? Win64 is
essentially One platform, and the freely available SDK is ONE compiler
environment.

surely it's possible to write a batch script of some sort that you could
put somewhere (or even deliver with python!) so this would be:

1) download and install THIS (the sdk from MS)

2) run:
set_up_win_compiler.py

3) build the package:
python setup.py build

without needing to do multiple steps, without needing to be in the special
set-up command window, etc.

In fact, even better would be for distutils to run the mythical
set_up_win_compiler.py script for you.

distutils does work out of the box with the VS2008 Express for 32 bit --
I'm still confused why this is so much harder for 64 bit.

*sigh*

-Chris






-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Building Python extensions on 64-bit Windows using the SDK compilers

2014-09-24 Thread Chris Barker
On Wed, Sep 24, 2014 at 11:49 AM, Paul Moore p.f.mo...@gmail.com wrote:

  essentially One platform, and the freely available SDK is ONE compiler
  environment.

 If only that were true :-)

 What I've found is:

 1. Different SDKs are needed for Python 2.7 and 3.3+ (the VS2008/VS2010
 split)


well, yeah, but that's not the problem at hand -- that one is ugly and
painful and always has been :-(


 2. The v7.0 SDK (Python 2.7) is a bit of a beast to install correctly
 - I managed to trash a VM by installing the x86 one when I should have
 installed the x64 one.


Ah, what fun -- though if you DO install the right one, hopefully it will
work, at least if it's installed with defaults, which most folks can do.


 3. There are bugs in the SDK - the setenv script for v7.0 needs fixes
 or it fails.


OK -- that sucks and is simply going to make this painful -- darn it!

Agreed, it should be easy. And indeed, it is if you have the full
 Visual Studio. But when Python 2.7 came out, the freely available MS
 tools were distinctly less convenient to use, and that shows.


and they still are, too.

It's getting a lot better, and once we start using MSVC 2012 or later
 (i.e., Python 3.5+), the express editions include 64-bit support out
 of the box, which makes most of the problems go away.


Sure, but is there something we can do with the old stuff -- some of us
will be running 2.7 for a good while yet!

Steve wrote:

 As I mentioned at the start of this thread - hold your frustration and
 wait for a little while :)

It wasn't clear -- will things get better for 2.7? Or just the new stuff?

i.e. frustration aside, should I not bother to wrangle this now for my
projects if I can hold off a bit?

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Building Python extensions on 64-bit Windows using the SDK compilers

2014-09-26 Thread Chris Barker
On Thu, Sep 25, 2014 at 8:15 AM, David Cournapeau courn...@gmail.com
wrote:

 The SDK scripts are indeed a bit broken, but it is possible to detect them
 automatically in a way that is similar to what was done for MSVC 2008.

 I know that for a fact because I ported the python distutils MSVC
 detection to scons, and added support for the SDK there:
 https://bitbucket.org/scons/scons/annotate/b43c04896075c3392818e07ce472e73cd6a9aca5/src/engine/SCons/Tool/MSCommon/sdk.py?at=default
 (the code has changed since then).

 Is that the kind of thing that falls onto long term support for 2.7 ? If
 so, I would be willing to work it out to put in distutils.


Yes please! That would be great.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Wheels and dependent third party dlls on windows

2014-10-01 Thread Chris Barker
On Wed, Oct 1, 2014 at 9:44 AM, David Genest david.gen...@ubisoft.com
wrote:

 - We are merely writing extension modules with third party dependent code
 packaged in a dll. In my mind, this use case is not the exception, and
 would not necessarily warrant the use of a full blown solution like conda.


agreed -- it is not rare, so yes, it would be nice if the core python
(pypa) systems addressed it. But like David said, Windows makes this really
hard...

 - If you run python setup.py bdist_wheel, the dlls specified in the
 scripts parameter end up in the wheel archive and does what is needed for
 our setup. (the dlls are copied to the scripts directory which is on PATH
 for the activated environment).


If this is the PATH only for that environment, then this is probably fine.
But one of the biggest sources of dll hell is that the same PATH is used
for executables and dlls, and that dlls placed next to executables will be
found. This means that any old app could find any old dll on the PATH, and
that there are a lot of dlls on the PATH.

So putting dlls into the python scripts or bin dir is a bad idea in
general -- who knows what apps may find them?

Couple this with the (absolutely incomprehensible to me) habit of folks to
use short (still 8.3) names for dlls, without much version info, and you
really have a mess.

So if you do put your dlls into the Scripts dir -- do please give them nice
long descriptive names!

But isn't there a library or something directory where other python dlls
are that could be used instead? Then you could get clashes between python
extensions, but it wouldn't clash with anything else on the system.


 In an ideal world, the scripts directory would be called bin, like the
 unix counter-part,


why does the name matter at all?


 and any dependency, being startup scripts or dlls could be installed in
 the bin/ environment global space. This path would be added to the python
 startup sequence (in order to not rely on the env's activate).


ouch -- no, dlls and top level scripts don't belong in the same place,
period.

Another option is to make a python package that has little other than that
dll in it; then your package lists it as a dependency, and I _THINK_ there
is some relative path magic that you can do so that your other extensions
can find it.

Anyone know what Anaconda does on Windows?


 1) add the dependent dlls to every package that needs it (Steve's answer
 https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html
  concurs that the dependent dll would be loaded only once)


If Steve is correct, which he probably is -- this is a great way to go.
Alternatively, alter my suggestion above a bit -- have your dll
package include a tiny extension that does nothing but link the dll in. Then
everything that depends on that dll will have an "import
the_funny_dll_package" line at the top -- and this ends up looking just
like a regular old python dependency.
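
A minimal sketch of the pure-ctypes variant -- the package and dll names
are made up:

    # the_funny_dll_package/__init__.py
    import ctypes
    import os

    _here = os.path.dirname(os.path.abspath(__file__))

    # load by absolute path, so we never depend on PATH at all; once
    # loaded, extensions that link against the same dll should find it
    _lib = ctypes.CDLL(os.path.join(_here, "my_descriptively_named_lib.dll"))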

Again, make sure to use a descriptive enough name for the dll so that it
doesn't clash with other packages (Not yours) that may use the same
(similar) dll.

Does the community still think this is a "I would not design my solution
 like yours" use-case? The extension modules are a really good way to
 accelerate python, so they are bound to be constructed with other dependent
 libraries. It is not only a sdist world :-), particularly on Windows.


this is a common problem we'd all love to be able to solve! (and conda does
help)

and sdist doesn't help anyway -- folks need to build and install it somehow
anyway.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Wheels and dependent third party dlls on windows

2014-10-02 Thread Chris Barker
On Wed, Oct 1, 2014 at 5:44 PM, David Genest david.gen...@ubisoft.com
wrote:

 We control our environment and package only what is needed in it. This
 makes a micro system in which everything is controlled and isolated, even
 the global dlls (to the virtual env) I wanted to install.


If that is your use case, you may want to take a good look at conda -- that
is exactly what it is for -- why re-invent the wheel? (sorry).
Note that while conda is the package manager for Anaconda, it can also be
used to build your own distribution; you wouldn't need to adopt Anaconda as
a platform.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] a package with is a module

2014-10-28 Thread Chris Barker
On Mon, Oct 27, 2014 at 8:30 AM, Marius Gedminas mar...@pov.lt wrote:

 Many setup.py files fail if you run them when your working directory is
 not the one where the setup.py resides itself.  I think you can safely
 rely on that implementation detail.


agreed, but if you really wanted to be sure, you could use __file__ to get
the path of the setup.py and go from there.
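
A minimal sketch of that, assuming a README.txt sitting next to setup.py:

    import os
    from distutils.core import setup

    # the directory containing this setup.py, regardless of os.getcwd()
    HERE = os.path.dirname(os.path.abspath(__file__))

    setup(
        name="example",
        version="0.1",
        long_description=open(os.path.join(HERE, "README.txt")).read(),
    )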

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Pypi to enforce wheel, python3 - is it possible?

2014-11-05 Thread Chris Barker
no, wheels should not be required -- encouraged, absolutely, but required,
no.

 My experience so far tells me otherwise. Out of 7 or so libraries that I
 tried to convert to wheel files that salt stack depends on, only 2 were
 using setuptools; others were using distutils and had sometimes quite big
 setup.py files and were not compiling out of the box, and frankly I have no
 idea how I can create a wheel from a setup.py file that is taking 10
 screens to scroll and not using the packager that is supporting wheel and
 has dependencies on some C stuff.

Exactly -- some stuff is hard to build -- period, end of story. Ideally,
everyone would make binary wheels for all platforms for these, but even that
is next to impossible universally (see the conda package manager -- it's
been developed for a reason).

So should we say: if you haven't made a wheel out of it, it can't be on
PyPi? That seems like a bad idea overall.

Also -- if wheels were required, what versions/platforms etc. would be
required? That's a rabbit hole we should avoid!

 It is usually pretty easy to build from sdist. Wheels are convenient but I
 don't think they should be required.

 Actually, if it's easy to build from sdist, then it's easy to make wheels
;-)

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] Role of setuptools and eggs in modern distributing...

2014-12-23 Thread Chris Barker
Hi folks,

I'm trying to package up a complex system and would like to do it the
correct, modern way.

In particular, this involves a bunch of compiled extensions, as well as
dependencies on both the scientific stack and common Web app packages.

(can you tell I'm building a web service front-end for computational code?)

This is actually a bit of a trick, because the web development community is
generally doing a good job of supporting PyPI and pip, whereas the
complications of the scientific software tools have folks relying more on
tools like Enthought and Continuum.

So far, we've been doing mostly pip, and struggling with building our own for
the ugly scientific stuff (whoo hoo, fun with HDF and netcdf, and GDAL,
and...). But at the end of all this we'd like to be able to distribute and
make it easy on end users to use our tools.

I figure we'll do one (or both) of:
- provide a custom wheelhouse with our packages and the dependencies
that are hard to come by
- provide a binstar channel with conda packages of all the same stuff, but a
totally different set of other packages.

At the moment, I'm working on making conda packages, which brings me to my
questions.

I'm a bit confused about the role of setuptools with pip. On the one hand,
pip depends on setuptools. On the other hand, pip supposedly doesn't do
eggs, and prefers wheels. But I find that I keep getting eggs whether I
want them or not. In fact, as far as I can tell, the way to get pip to
install something from a git repo is:

git+https://url_to_the_repo.git#egg=name_of_package

why isn't that wheel=name_of_package

and will it work if setuptools was not used in the package's setup.py???

Frankly, I've generally found setuptools and eggs to be overly heavyweight
and annoying -- the only reason I use setuptools at all is that I REALLY
like develop mode -- which I can now get with pip --editable... or does
that give me setuptools develop mode anyway, i.e. do I need to have used
setuptools.setup for it to work?

So question one: do I need to use setuptools.setup rather than plain old
distutils.setup?
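
For concreteness, the kind of minimal setup.py I'm talking about -- package
and dependency names hypothetical:

from setuptools import setup   # question one: or "from distutils.core import setup"?

setup(
    name="my_package",
    version="1.2.3",
    packages=["my_package"],
    install_requires=["numpy"],   # question 2 below is about this keyword
)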

Question 2:

What about setuptools' install_requires?

I generally like the pip requirements.txt approach. It's up to the
installation tool, not the packaging tool, to manage requirements. But then
again, it does make sense to declare the requirements in setup.py. But the
issue at hand is that install_requires is doing some odd things with conda:

conda, of course, is designed to manage dependencies itself, and those are
declared in the conda build meta.yaml file. Note that conda dependencies
can have nothing to do with python -- that's the whole point of conda -- a
conda package can depend on any other conda package, including C libs, etc.

But conda build doesn't re-invent setup.py -- rather, you generally simply
call setup.py install from your conda build script. Hence the issue at hand:

Using setuptools.setup, and specifying install_requires, then kicks in
setuptools trying to resolve dependencies I don't want it to deal with.

I read Donald's "setup.py vs requirements.txt", and I guess I get it, but
it still seems quite redundant -- I see the point of separating out
“abstract dependencies” and “concrete dependencies”. However, the nifty
solution of only putting:

--index-url https://pypi.python.org/simple/

in the requirements.txt doesn't work for complex cases. In practice, we
find we need to specify version numbers of some dependencies, and have some
go straight to a git repo, etc. So we need a messy requirements.txt file.

And, in fact, I think that's where it belongs -- the goal of the
requirements.txt file is so pip can do the right thing and go grab
everything you need. But, in fact, it is also quite human-readable, and
serves quite well as the documentation for the abstract dependencies as
well.

So I think the way to go is to keep requirements in requirements.txt, and
leave them out of the setup.py.
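
i.e. something like this (a sketch -- package names, pins, and URL all
hypothetical):

numpy==1.9.1        # has to be this version
pyramid>=1.5
git+https://github.com/example/some_lib.git#egg=some_lib   # straight from a repo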

Can I dump setuptools.setup as well?

Sorry, this is a bit rambling and short on clear questions, but I'm trying
to get a handle on what best practices really are these days...

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Role of setuptools and eggs in modern distributing...

2014-12-24 Thread Chris Barker
On Tue, Dec 23, 2014 at 10:37 AM, Donald Stufft don...@stufft.io wrote:

 I’m going to attempt to read between the lines here a little bit.


Thank you -- you did an excellent job of capturing the gestalt of my
confusion!

The “egg” name is heavily overloaded in setuptools. It is used all over the
 place for varying sets of related but distinct concepts. The #egg= thing is
 one of those setuptools concepts where it used that name for something
 distinct but similar. Ideally it shouldn’t be #egg= or #wheel= but should
 be #dist= or something similar since it’s neither an egg nor a wheel and
 there is an open ticket in pip’s issue tracker to do that.


OK, that clears it up.

Though I still get egg-info files all over the place -- not sure why that
annoys me ;-)


 To make a clarification though, pip itself doesn’t depend on setuptools,
 it can install from Wheels without setuptools being installed on the system
 at all. It does however rely on setuptools to be installed if it is
 installing from a sdist. The reason for this is that pip uses setuptools as
 a build tool, so when it invokes a setup.py it’s “building” that
 distribution (even if it’s just pure python it needs “built”). However pip
 does some tricks so that it will always use setuptools to build the
 project, regardless of if the project imports setuptools or distutils in
 their setup.py.


Ah -- so pip needs to use setuptools to build, but a package doesn't have
to explicitly use it in its setup.py.
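
As I understand it, the trick is morally equivalent to something like this
(a sketch, not pip's actual code):

import setuptools  # imported first, so setuptools can patch distutils
code = open("setup.py").read()
exec(compile(code, "setup.py", "exec"))

i.e. setuptools gets imported before the project's setup.py runs, so even a
plain distutils.core.setup() call ends up going through setuptools' machinery.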

To that aim, install_requires specifies a package's dependencies as well as
 other metadata for that package, and requirements.txt is just a list of
 packages to install. The difference is subtle but a requirements.txt isn’t
 attached to a particular project and the rest of the metadata like name,
 version, etc.


hmm... I agree, but it's often shipped alongside setup.py -- kind of like
the fact that the name setup.py is a convention rather than a spec, but
expected all over the place.


On Tue, Dec 23, 2014 at 11:17 AM, Marcus Smith qwc...@gmail.com wrote:

which I can now get with pip --editable ... or does that give me setuptools
 develop mode anyway


-e uses setuptools develop mode.

OK -- though it sounds like pip would do that whether or not I used
setuptools in the setup.py.

the main reason for setuptools is for install_requires, which is
 fundamental to pip dependency resolution.
 but in general, it offers more features and it's more maintained than pure
 distutils.
 The standard advice is to use setuptools over distutils.


OK -- still not clear how install_requires plays with conda -- but it's
common enough that I think conda simply ignores it (though not silently).

The Packaging User Guide has a breakdown of install_requires vs
 requirements files.

 https://packaging.python.org/en/latest/technical.html#install-requires-vs-requirements-files




 In brief, requirements files are usually for a whole environment,
 whereas install_requires is for a project.


A note about terminology here (both in this email and The Packaging User
Guide) -- it seems to me that install_requires is about requirements for a
package, not a project, and that, in fact, requirements.txt is best used
for projects.

I guess the distinction may be that a package has a setup.py, whereas a
project is something you are building that requires perhaps a stack of
unrelated packages. So you can say: if you want to run my application,
run:

pip install -r requirements.txt

first.

install_requires is critical when publishing projects to PyPI.


Good to know -- I may need to go there at some point.

So I'll go with setuptools and install_requires, and see how it all goes.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Role of setuptools and eggs in modern distributing...

2014-12-30 Thread Chris Barker
On Tue, Dec 30, 2014 at 2:21 PM, Reinout van Rees rein...@vanrees.org
wrote:

 Well, we're in a bit of the same boat here. We make django websites, which
 means pretty much well-behaved setup.py-using pure python stuff.

 The websites are heavy users of numpy/scipy/pandas/matplotlib and of the
 geo packages like gdal/mapnik/pyproj. Ouch.


yup -- this is pretty much our stack (though pyramid in our case...). Funny,
though, as coming from my background, I see it as a scipy stack app with a
little web stuff, rather than a web app that needs some scipy stuff ;-)

But the core problem here is that the scipy folks have been going to conda
and Enthought to solve their packaging problems, and the web folks have
been doing pip, and maybe buildout -- so you get a bit of a mess when you
mix them.


 The combination we use now is to use buildout (instead of pip) in
 combination with the syseggrecipe (https://pypi.python.org/pypi/
 syseggrecipe) buildout add-on. Syseggrecipe allows us to depend
 explicitly on **system** packages for gdal/matplotlib/mapnik/scipy/numpy.
 Which takes care of most of the compilation headaches.


well, it sounds like you are simply punting -- passing off all the complex
stuff to the system, which may work well if the system is an up-to-date
linux with the packages you need available, but pretty worthless on a Mac or
Windows box.

The scipy folks have been doing a pretty good job lately keeping up with
wheels, but there's still a big hole there for the geo stuff (GDAL,
Shapely, Fiona).

So I've been looking at going the Anaconda route -- it provides the hard
stuff, though it turns out it's a bit ugly when using it as a development
environment for extensions linked against libs that are provided both by
the system and by Anaconda.

Antoine Pitrou wrote:

 Note you can use pip inside conda environments, which works quite well
 at least for pure Python packages (start with conda install pip).


True -- though it gets a bit ugly, as then conda doesn't know about the
packages, so switching environments is a mess, and conda can't manage the
dependencies. So not ideal. I've actually spent the last two days writing a
script that auto-grabs packages from PyPI, builds conda packages out of
them, and then uploads them to binstar -- so we can have all our
dependencies supported by conda.

I'd love it if Continuum would build a "pip bridge" on binstar that would
do all that automagically if you request a pip-installable package from
binstar.


 Just throwing this into the mix as a potential solution. Note that you'll
 get to explain buildout to your users, but... :-)


yup -- not sure I want to go there yet, either...

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Role of setuptools and eggs in modern distributing...

2014-12-31 Thread Chris Barker
On Wed, Dec 31, 2014 at 9:10 AM, Nick Coghlan ncogh...@gmail.com wrote:

 The problem always existed - it's the longstanding conflict between
 platform independent, language specific tooling and platform specific,
 language independent tooling.

 The former is often preferred on the developer side (since the tools are
 oriented towards building individual custom applications rather than
 maintaining a suite of applications written by different groups), while the
 latter is essential on the system integrator side (since you're dealing
 with inevitable technology sprawl in infrastructure that evolves over the
 course of years and decades).

 One of the best things coming out of the whole DevOps notion is the
 recognition that language independence and platform independence are aimed
 at solving *different problems*, and that what we need is suitable tools
 with different roles and responsibilities that interoperate effectively,
 rather than attempting to provide a single universal tool and having
 different groups of users yelling at each other for wanting to solve the
 wrong problem.

 Tools like conda and zc.buildout fit into that landscape by aiming to
 provide a platform & language independent approach to doing *application*
 level integration, which tends to have the effect of making them new
 platforms in their own right.

Indeed -- thanks for providing a clear way to talk/think about these
systems.

I guess the trick here is that we want the different-level tools to work
well with each other.

As conda started with python packages in mind, it does a pretty good job
with them. But I still find some conflicts between setuptools and conda --
in particular, if you specify dependencies in setup.py (install_requires),
it can kind of make a mess of things. conda tries to ignore them, but
somehow I've had issues, even though I had specified it all in conda's
meta.yaml. This is probably a conda bug/issue, but I'm still trying to
figure out how to best set up a python package so that it can be built and
installed with the regular python tools, and also conda...

Use case -- developing in a conda environment -- so I want to install
dependencies with conda, but the package under development with setuptools
develop mode. (conda does not (yet) have a develop mode that works...)

Oh, and there does seem to be an odd (I think non-fatal) issue with
setuptools and conda:

I have package A, with a bunch of stuff listed with install_requires

I have all these dependencies installed with conda.

When I run setup.py develop, setuptools goes and dumps all the dependencies
in easy_install.pth.

I have no idea why that is, and it's probably only annoying, rather than a
problem, but I'm not sure what will happen when I upgrade one of those
dependencies with conda.

 If you compare them to Linux distributions, then zc.buildout is a platform
 that practices the Gentoo style approach of building everything from source
 for each application, while conda uses the more common Debian/Fedora style
 of defining a binary packaging format that allows a redistributor to handle
 the build process on behalf of their users.

indeed -- and conda actually provides (to my disappointment) very little in
the way of build support -- you need to write platform dependent build
scripts to actually build the packages.

But it does provide a nice way for me to provide a full "just install this"
distribution of my complex, ugly, hard-to-build packages.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Role of setuptools and eggs in modern distributing...

2015-01-08 Thread Chris Barker
OK,

I started this thread a while back, as I was getting confused and having
issues with intermixing python, setuptools, pip, and Anaconda / conda.

Now I've figured out where I have my issue:

I'm using an Anaconda distribution at the moment. I want conda to handle
installing my dependencies, etc. for me. OK.

However, I am also developing various python packages -- this means I can't
/ don't want to build and install a conda package for them; rather, I'd like
to use setuptools develop mode.

So here's the rub:

When I call setup.py develop, setuptools apparently looks for the
install_requires packages. If it doesn't find them, it goes out and
apparently decides to install them itself: it gets the source from PyPI,
downloads it, tries to compile it, etc.

Even if it does find them already installed, it does some annoying
adding-to-easy_install.pth magic for them.

This is all why I've been thinking that dependencies do not belong in
setup.py -- but rather outside of setup.py (requirements.txt), and more to
the point, dependency handling is a pip (or conda, or ...) issue -- not one
that should be handled by setuptools at build time.

Note that conda build usually simply calls setup.py install as well, so
this can be a problem even there (though I think it usually satisfies the
requirements first, so it's not so bad).

At the end of the day, however, I think the issue is not so much where you
specify dependencies, but what setuptools develop mode is doing: it should
NOT go and auto-install dependencies, particularly not install dependencies
(maybe build dependencies are required...).

OK -- I just found the --no-deps option. So I can do what I want, but
still, I don't think it belongs there at all, and it certainly would be
better to have the default be no-deps. Let pip (or conda, or ...) handle
that.
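
So the workflow I'm after would be roughly this (a sketch -- conda_reqs.txt
is a hypothetical file listing the deps in conda-friendly form):

conda install --file conda_reqs.txt   # deps handled by conda
python setup.py develop --no-deps     # just link in the package under development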

Anyone running these by hand is by definition doing things by hand -- let
them deal with the deps.

OK, I suppose casual users may run setup.py install, so that mode _might_
want to auto-install dependencies, if something has to. But develop mode is
very much for developers; you really don't want this handled there.

-Chris








Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Thu, Mar 19, 2015 at 9:45 AM, Tim Smith t...@tim-smith.us wrote:

 A way of learning about setup_requires dependencies would be helpful for
 homebrew-pypi-poet [1], which helps generate Homebrew formulae for
 applications implemented in Python.


Indeed -- conda is similar -- it provides a "conda skeleton pypi" command
that grabs a package from PyPI and (tries to) create a conda build setup
for it.
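
i.e. something along these lines (package name hypothetical):

conda skeleton pypi some_package   # generate a recipe from the PyPI metadata
conda build some_package           # then build a conda package from that recipe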

similarly to brew, the intent is to capture and handle the dependencies
with conda's system.

I don't have anything to do with the development, but I _think_ it actually
builds the package in order to then extract the dependency meta-data -- it
would be nice to not do that.

It actually succeeds with a lot of packages without any hand-editing after
the fact, so it's not so bad!

 As Chris Barker notes, --single-version-externally-managed is a good way
 to get setuptools-based setup.py's to just install the package;
 --single-version-externally-managed hands the install process over to
 distutils. Of course, distutils setup.py's do not support the
 --single-version-externally-managed option.


yeah, conda just uses plain setup.py install by default; you have to go in
and add --single-version-externally-managed by hand to the build script.
Maybe it would be better to add that automatically, and let the few
packages that don't use setuptools remove it by hand...

-CHB




 To have a consistent CLI interface, Homebrew borrows a shim from pip's
 source to make sure we always call setuptools.setup() when we run setup.py
 so that we can hand the install back to distutils:
 https://github.com/Homebrew/homebrew/blob/master/Library/Homebrew/language/python.rb#L78-L94
 -- thanks to Donald for pointing to the right place in the pip code.

 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Thu, Mar 19, 2015 at 9:26 AM, Ionel Cristian Mărieș cont...@ionelmc.ro
wrote:

 The --record is for making a list of installed files. You don't need it if
 you don't use record.txt anywhere.


thanks -- I'll take that out... This was a cut-and-paste from the net after
much frustration -- once I got something that worked, I decided I was done
-- I had no energy for figuring out why it worked...


 As for --single-version-externally-managed, that's unrelated to your
 setup_requires pain - you probably already have the eggs around, so they
 aren't redownloaded.


well, what conda does to build a package is create a whole new empty
environment, then install the dependencies (itself, without pip or
easy_install, or ...), then run setup.py install (for python packages
anyway). In this case, that step failed, or got ugly anyway, as setuptools
didn't think the dependent packages were installed, so it tried to install
them itself -- maybe that's because the dependency wasn't installed as an
egg?

I can't recall at the moment whether that failed (I think so, but not sure
why), but I certainly didn't want all those eggs re-installed.


 What --single-version-externally-managed does is force the package to
 install in non-egg form (as distutils would).


hmm -- interesting -- this really was a dependency issue -- so it must
change _something_ about how it looks for dependencies...


 That also means only setup.py that uses setuptools will have the
 --single-version-externally-managed option available.


yup -- so I need to tack that on when needed, and can't just do it for all
python packages...

Thanks -- that does make things a bit more clear!

-CHB










-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Thu, Mar 19, 2015 at 9:56 AM, Chris Barker chris.bar...@noaa.gov wrote:

 On Thu, Mar 19, 2015 at 9:26 AM, Ionel Cristian Mărieș cont...@ionelmc.ro
  wrote:

 The --record is for making a list of installed files. You don't need it
 if you don't use record.txt anywhere.


 thanks -- Ill take that out...


Actually, I took that out, and got:

running install
error: You must specify --record or --root when building system packages

so it's needed I guess.

By the way, the error I get if I do a raw setup.py install is:


RuntimeError: Setuptools downloading is disabled in conda build. Be sure to
add all dependencies in the meta.yaml  url=
https://pypi.python.org/simple/petulant-bear/r
Command failed: /bin/bash -x -e
/Users/chris.barker/PythonStuff/IOOS_packages/conda-recipes/wicken/build.sh


so setuptools is trying to install petulant-bear, but conda has disabled
that. But it is, in fact, installed -- conda having done that to prepare the
environment.

So this is why I just want to tell setuptools to not try to download and
install dependencies...

But we're getting off topic here -- should probably put in a feature
request for --no-deps for install and build commands.

-CHB









-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Thu, Mar 19, 2015 at 9:12 AM, Ionel Cristian Mărieș cont...@ionelmc.ro
wrote:

 ​Worth considering​, if you can afford it, to have a local patch that you
 apply before building. Then you have all the necessary fixes (like remove
 the setup_requires) in that patch file.


yup -- that's an option -- but a really painful one!

I did, in fact, find an incantation that works:

$PYTHON setup.py install --single-version-externally-managed
--record=/tmp/record.txt

but boy, is that ugly, and hard to remember -- why not a --no-deps flag?

(and I have no idea what the --record thing is, or if it's even
necessary...)

-Chris


 This is a popular approach in Debian packages - they can have all kinds of
 fixes for the upstream code.



 Thanks,
 -- Ionel Cristian Mărieș, http://blog.ionelmc.ro




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Thu, Mar 19, 2015 at 6:57 AM, Daniel Holth dho...@gmail.com wrote:

 If that's what you want then we could say the spec was to put the
 requirements in setup_requires.txt, in the requirements.txt format,
 which pip would eventually look for and install before running
 setup.py


yes, that would be great -- and while we are at it, put the run-time
dependencies in requirements.txt too.

I brought this up a while ago, and it seems that requirements.txt is for
applications, and setting install_requires in setup.py is for package
dependencies. But as we've seen, this creates problems -- so why not just
keep all the dependency info in an external file?

Though this would not be backward compatible with all the setup.pys out
there in the wild now...

-Chris




 On Thu, Mar 19, 2015 at 9:32 AM, Leonardo Rochael Almeida
 leoroch...@gmail.com wrote:
 
  On 18 March 2015 at 14:37, Daniel Holth dho...@gmail.com wrote:
 
  [...]
 
  The behavior we're aiming for would be:
 
  installer run setup.py - installs things
  python setup.py - does not install things
 
 
  Besides that, I'd add that we're also looking for: python setup.py (by
  itself) should not raise ImportError, even if setup.py needs extra things
  installed for certain operations (egg_info, build, sdist, develop,
 install).
 
  IMO, the biggest pain point is not people putting crazy stuff in setup.py
  to get version numbers. For me, the biggest pain point is when setup.py
  needs to import other packages in order to even know how to build:
 
  So I'd like to suggest the following series of small improvements to both
  pip and setuptools:
 
   * setuptools: `python setup.py setup_requires` dumps its setup_requires
  keyword in 'requirements.txt' format
 
  It is already in this format, so should be trivial, but allows one to do
  something like:
 
  $ python setup.py setup_requires > setup_requires.txt
  $ pip install -r setup_requires.txt
 
  Or in one bash line:
 
  $ pip install -r <( python setup.py setup_requires )
 
   * setuptools: setup.py gains the ability to accept callables in most
  (all?) of its parameters.
 
  This will allow people to move all top level setup.py imports into
  functions, so that we can turn code like this:
 
  from setuptools import setup, Extension
  import numpy

  setup(ext_modules=[
      Extension("_cos_doubles",
                sources=["cos_doubles.c", "cos_doubles.i"],
                include_dirs=[numpy.get_include()])])
 
  Into this:
 
  from setuptools import setup, Extension

  def ext_modules():
      import numpy
      return [
          Extension("_cos_doubles",
                    sources=["cos_doubles.c", "cos_doubles.i"],
                    include_dirs=[numpy.get_include()])
      ]

  setup(ext_modules=ext_modules,
        setup_requires=['setuptools'])
 
   * pip: When working with an sdist, before running setup.py egg_info in a
  sandbox, pip would run setup.py setup_requires, install those packages in
  the sandbox (not in the main environment), then run egg_info, wheel, etc.
 
  Notice that the changes proposed above are all backward compatible, create
  no additional pain, and allow developers to move all top-level setup.py
  craziness inside functions.
 
  After that, we can consider making setup.py not call the easy_install
  functionality when it finds a setup_requires keyword while running other
  commands, but just report if those packages are not available.
 
 
  PS: Yes, I've already proposed something similar recently:
  https://mail.python.org/pipermail/distutils-sig/2015-January/025682.html
 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Wed, Mar 18, 2015 at 10:37 AM, Daniel Holth dho...@gmail.com wrote:

 The behavior we're aiming for would be:

 installer run setup.py - installs things
 python setup.py - does not install things


yup.

Which, now that I look at it, is not so different than:

python setup.py build     # does not install anything
python setup.py install   # only installs the particular package

pip install setup.py  ( maybe not pass the setup.py directly, but maybe? )
# uses pip to find and install the dependencies.

and could we get there with:

python setup.py build --no-deps
python setup.py install --no-deps

(I'd like the no-deps flag to be the default, but that would probably have
to wait for a deprecation period)

None of this solves the "how to get meta-data without installing the
package" problem -- which I think is what started this thread.

For that, it seems the really hacky way to get there is to establish a
meta-data standard to be put in setup.py -- a bunch of standard names to
be defined in the module namespace:

package_version = "1.2.3"
setup_requires = ["packagea", "packageb>=2.3", ...]
...

(or maybe all in a big dict:

package_meta_data = {"package_version": "1.2.3",
                     "setup_requires": ["packagea", "packageb>=2.3", ...],
                     ...
                    })

(those values would be passed in to setup() as well, of course)

That way, install tools, etc., could import the setup.py, not run setup(),
and have access to the meta-data. Of course, this would only work with
packages that followed the standard, and it would be a long time until it
was common, but we've got to have a direction to head in.
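
On the tool side, reading that meta-data might look something like this (a
sketch, on Python 3, and assuming the setup.py guards its setup() call with
"if __name__ == '__main__':" so that importing it doesn't run the build):

import importlib.util

spec = importlib.util.spec_from_file_location("pkg_setup", "setup.py")
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)    # runs setup.py as an import, not a build
print(mod.package_version)      # -> "1.2.3"
print(mod.setup_requires)       # -> ["packagea", "packageb>=2.3", ...]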

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Thu, Mar 19, 2015 at 7:12 AM, Daniel Holth dho...@gmail.com wrote:

 ... except that there are plenty of reasons we wouldn't want the
 requirements.txt format, mainly because pip shouldn't automatically
 install concrete dependencies that contain git:// urls etc.


is that a format problem, or a pip feature issue?

and this is a one-way street -- setuptools would dump a list of
requirements -- would it ever HAVE a git:// url to dump?

-CHB



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Wed, Mar 18, 2015 at 9:02 AM, Ionel Cristian Mărieș cont...@ionelmc.ro
wrote:


 On Wed, Mar 18, 2015 at 5:33 PM, Chris Barker - NOAA Federal 
 chris.bar...@noaa.gov wrote:

 I don't want it downloading and installing dependencies when I go to
 build. That's an install-time task.


 Sounds to me like you should not use setup_requires then - if you don't
 like what it does.


My use case at the moment is trying to build conda packages from other
people's Python packages -- if they use setup_requires, etc., then I'm stuck
with it.

Also -- for my packages, I want them to be easy to build and deploy by
others who aren't using conda -- so I need a way to do that -- which would
be setuptools' features.

So I'd like the features of the official python packaging tools to be
cleanly separated, and not assume that if you're using setuptools you are
also using pip, etc.


 Also, for the whole distutils-sig, I don't understand all the fuss around
 this much-maligned feature - there are plenty of options to manage
 build-time dependencies and tasks - one certainly doesn't need to shoehorn
 a full-blown build system into setup.py (e.g.,
 https://github.com/ipython/ipython/blob/master/setup.py) - there's make,
 invoke, shell scripts and plenty of other systems that can do that just fine.


None of those are cross platform, though. That still may be the way to go.

I like to keep in mind that with all this pain, in fact, even raw distutils
is freaking awesome at making the easy stuff easy.

( pip and pypi too...)

i.e. I can write a simple C extension (or Cython, even), and a very simple
boilerplate setup.py will let it build and install on all major platforms
out of the box. Then I put it up on PyPI and anyone can do a pip install
my_package and away they go.
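
The boilerplate I have in mind is more or less this (names hypothetical):

from setuptools import setup, Extension

setup(
    name="my_package",
    version="0.1.0",
    packages=["my_package"],
    ext_modules=[Extension("my_package._speedups",
                           sources=["my_package/_speedups.c"])],
)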

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] setup_requires for dev environments

2015-03-19 Thread Chris Barker
On Wed, Mar 18, 2015 at 8:43 AM, Paul Moore p.f.mo...@gmail.com wrote:

  I suppose it's too late now, but the really painful parts of all this
  seem to be due to overly aggressive backward compatibility. We now
  have wheels, but also eggs, we now have pip, but also easy_install,
  etc.

 Agreed. But the problem we have here is that any system that fails to
 work for even a tiny proportion of packages on PyPI is a major issue.
 And we don't have *any* control over those packages - if they do the
 most insane things in their setup.py, and don't release a new version
 using new tools, we have to support those insane things, or deal with
 the bug reports.

 Maybe we should say "sorry, your package needs to change or we won't
 help"


Indeed -- I agree that it's key to support all the old cruft -- but it's
key to support that with the package manager / installation tool, i.e. pip.
We want pip install to just work for most of the packages already on PyPI,
for sure.

But that doesn't mean that the newer-and-better setuptools needs to support
all the same old cruft. If it were called something different: (distribute?
;-) )

then folks couldn't simply replace:

from setuptools import setup

with

from distribute import setup

and be done, but they would only make that change if they wanted to make
that change.

Of course, then we'd be supporting both setuptools and distribute, and
having to make sure that pip (and wheel) worked with both... so maybe just
too much of a maintenance headache. But breaking backward compatibility gets
you a way forward that keeping it does not (py3 anyone?)

I suppose the greater danger is that every feature in setuptools is there
because someone wanted it -- so it would be easy for the new thing to
grow all the same cruft...



 that way (see, for example, the distribute or distutils2 flamewars).


IIRC, distribute was always imported as setuptools -- so born to create
strife and/or accumulate all the same cruft.

I guess I have no idea if there was a big problem with the architecture of
setuptools requiring a big shift -- all I see are problems with the API and
feature set. And by definition you can't change those and be backward
compatible...


 Agreed entirely. It's a long slow process though to migrate away from
 the problems of setuptools without losing the great features at the
 same time...


That slog is MUCH longer and harder if you need to keep backward
compatibility though.

But I suppose the alternative is to build something no one uses!

Is there any move to have a deprecation process for setuptools features?

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] force static linking

2015-03-23 Thread Chris Barker
On Mon, Mar 23, 2015 at 11:45 AM, Dan Stromberg drsali...@gmail.com wrote:

 Is this the general perspective on static linking of python module
 dependencies?  That if your systems are the same, you don't need to?


That's general -- nothing specific to python here.

There _may_ be a difference in that you might be more likely to want to
distribute a binary python module, and not be sure of the level of
compatibility of the host system -- particularly if you use a non-standard
or not-common lib, or one you want built a particular way -- like ATLAS,
BLAS, etc...

I want static linking too, but if it's swimming upstream in a fast
 river, I may reconsider.


well, it's a slow river...

The easiest way is to make sure that you only have the static version of
the libs on the system you build on. You may be able to do that by passing
something like --disable-shared to configure, or you can just kludge it and
delete the shared libs after you build and install.
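
i.e. something like (prefix hypothetical):

./configure --disable-shared --prefix=$HOME/static-libs
make && make install
# or the kludge: build normally, then delete the .so / .dylib files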

-Chris



 Thanks.

 On Mon, Mar 23, 2015 at 11:41 AM, Bill Deegan b...@baddogconsulting.com
 wrote:
  Gordon,
 
  If you are sure that your dev and production environments match, then you
  should have same shared libraries on both, and no need for static
 linkage?
 
  -Bill
 
  On Mon, Mar 23, 2015 at 11:36 AM, Dan Stromberg drsali...@gmail.com
 wrote:
 
  On Thu, Sep 11, 2014 at 5:28 AM, gordon wgordo...@gmail.com wrote:
   Hello,
  
   I am attempting to build statically linked distributions.
  
   I am using docker containers to ensure the deployment environment
   matches
   the build environment so there is no compatibility concern.
  
   Is there any way to force static linking so that wheels can be
 installed
   into a virtual env without requiring specific packages on the host?
 
  Maybe pass -static in $LDFLAGS?  Just a wild guess really.
  ___
  Distutils-SIG maillist  -  Distutils-SIG@python.org
  https://mail.python.org/mailman/listinfo/distutils-sig
 
 
 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-13 Thread Chris Barker
On Mon, Apr 13, 2015 at 1:19 PM, David Cournapeau courn...@gmail.com
wrote:

 This is what we use on top of setuptools egg:

  - ability to add dependencies which are not python packages (I think most
 of it is already handled in metadata 2.0/PEP 426, but I would have to
 re-read the PEP carefully).
  - ability to run post/pre install/remove scripts
  - support for all of the autotools directories, with sensible
 mapping on windows


Are these inside or outside the python installation? I'm more than a bit
wary of a wheel that would install stuff outside of the sandbox of the
python install.


The whole reason I started this discussion is to make sure wheel has a
 standard way to do what is needed for those usecases.

 conda, rpm, deb, or eggs as used in enthought are all essentially the
 same: an archive with a bunch of metadata. The real issue is standardising
 on the exact formats. As you noticed, there is not much missing in the
 wheel *spec* to get most of what's needed.


hmm -- true. I guess where it seems to get more complicated is beyond the
wheel (or conda, or...) package itself, to the dependency management,
installation tools, etc.

But perhaps you are suggesting that we can extend wheel to support a bit
more stuff, and leave the rest of the system as a separate problem? i.e.
Canopy can have its own find / install / manage-dependency tool, but it
can use the wheel format for the packages themselves?

I don't see why not

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Proper handling of PEP420 namespace packages with setuptools and pip

2015-04-23 Thread Chris Barker
On Wed, Apr 22, 2015 at 5:31 PM, Donald Stufft don...@stufft.io wrote:

 This seems SO SIMPLE.

 ...

 What am I missing?

 Prior to PEP 420 you needed the dynamic path stuff because sometimes your
 namespace package is split across multiple locations on sys.path.


OK -- sure, you'd need it then -- still not sure why we need to scatter
namespace packages all over the file system though -- or why we couldn't do
it the quick and dirty and easy way for most cases, anyway...

PEP 420 more or less solves all of the problems with namespace packages,
 other than it’s a Python 3 only feature so most people aren’t going to be
 willing to depend on it.


yeah, there's always that -- maybe I'll revisit namespace packages when I've
made the full transition to py3...

On Thu, Apr 23, 2015 at 4:38 PM, Barry Warsaw ba...@python.org wrote:

 This gets at the heart of the motivation for PEP 420, at least for me with
 my
 distro-developer hat on.  Any single file can only be owned by exactly one
 distro-level package.


I see -- that's a pain. Though it seems to me to be a limitation of the
distro-packaging systems, not a python one -- though maybe we need to work
with it anyway... And distro packages need shared directories already --
seems like a lot of work for an empty __init__.py ;-)

Thanks for the clarification.

-CHB

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Proper handling of PEP420 namespace packages with setuptools and pip

2015-04-22 Thread Chris Barker
A note from the peanut gallery:

I like the idea of namespace packages, but every time I've tried to use
them, I've been stymied -- maybe this PEP will solve that, but...

First -  the issues:

- It somehow seems like a lot of work, details to get right, and
more-than-one-way-to-do-it. But maybe that's all pre-PEP 420.

- Last time I tried, I couldn't get them to work with setup.py develop

But at the core of this -- why does it have to be so hard? It seems very
simple to me -- what am I missing?

What are namespace packages? To me, they are a package that serves no other
purpose than to provide a single namespace in which to put other packages.
This makes a lot of sense if you have a bunch of related packages where
users may only require one, or a couple, but not all. And you want to be
able to maintain them, and version control them independently.

But to get this, it seems all we need is:

1) A directory with the top-level name

2) It has an (empty) __init__.py (so it is a python package)

3) It has other directories in it -- each of these is a regular old python
package -- the ONLY difference is that they are installed under that name
(see the sketch below)
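
Concretely, the installed layout I'm picturing is just this (names
hypothetical):

namespace_pkg/
    __init__.py        # empty -- owned by no sub-package in particular
    sub_package_a/     # installed by one distribution
        __init__.py
        ...
    sub_package_b/     # installed by another, independently
        __init__.py
        ...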

That's it. Done. Now all we need is a way to install these things -- well,
that's easy: each sub-package installs itself just like it would anyway,
maybe overwriting the top-level directory name and the __init__.py if
another sub-package has already installed them. But that's OK, because the
name is by definition the same, and the __init__ is empty.

This seems SO SIMPLE. No declaring things all over the place, no dynamic
path manipulation, nothing unusual at all, except the ability to install a
module into a dir without clobbering what might already be in that dir.

What am I missing?

-Chris





-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-16 Thread Chris Barker
On Sat, May 16, 2015 at 12:04 PM, David Mertz dme...@continuum.io wrote:

 Continuum has a great desire to make 'pip' work with conda packages.
 Obviously, we love for users to choose the Anaconda Python distribution but
 many will not for a variety of reasons (many good reasons).


Hmm -- this strikes me as very, very tricky -- and of course, tied in to
the other thread I've been spending a bunch of time on...

However, we would like for users of other distros still to be able to
 benefit from our creation of binary packages for many platforms in the
 conda format.


Frankly, if you want your efforts at building binaries to get used outside
of Anaconda, then you should be building wheels in the first place. While
conda does more than pip + wheel can do -- I suppose you _could_ use wheels
for the things it can support...

But on to the technical issues:

conda python packages depend on other conda packages, and some of those
packages are not python packages at all. The common use case here is
non-python dynamic libs -- exactly the use case I've been going on about
in the other thread...

And conda installs those dynamic libs in a conda environment -- outside of
the python environment. So you can't really use a conda package without a
conda environment, and an installer that understands that environment (I
think conda install does some lib path re-naming, yes?), i.e. conda itself.
So I think that's kind of a dead end.

So what about the idea of a conda-package-to-wheel converter? conda
packages and wheels have a bit in common -- IIUC, they are both basically a
zip of all the files you need installed. But again the problem is those
dependencies on third party dynamic libs.

For that to work, pip+wheel would have to grow a way to deal with
installing, managing and using dynamic libs. See the other thread for the
nightmare there...

And while I'd love to see this happen, perhaps an easier route would be for
conda build to grow a "static" flag that will statically link stuff and get
to something already supported by pip, wheel, and PyPI.

-Chris



 It is true that right now, a user can in principle type:

   % pip install conda
   % conda install some_conda_package

 But that creates two separate systems for tracking what's installed and
 what dependencies are resolved;


Indeed -- which is why some folks are working on making it easier to use
conda for everything... converting a wheel to a conda package is probably
easier than the other way around...

Funny -- just moments ago I wrote that it didn't seem that anyone other
than me was interested in extending pip+wheel to support this kind of thing
-- I guess I was wrong!

Great to see you and Continuum thinking about this.


-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-16 Thread Chris Barker
On Fri, May 15, 2015 at 11:35 PM, David Cournapeau courn...@gmail.com
wrote:

 On Sat, May 16, 2015 at 4:56 AM, Chris Barker chris.bar...@noaa.gov
 wrote:


 But in short -- I'm pretty sure there is a way, on all systems, to have a
 standard way to build extension modules, combined with a standard way to
 install shared libs, so that a lib can be shared among multiple packages.
 So the question remains:


 There is actually no way to do that on windows without modifying the
 interpreter somehow.


Darn.


 This was somehow discussed a bit at PyCon when talking about windows
 packaging:

  1. the simple way to share DLLs across extensions is to put them in the
 %PATH%, but that's horrible.


yes -- that has to be off the table, period.


 2. there are ways to put DLLs in a shared directory *not* in the %PATH%
 since at least windows XP SP2 and above, through the SetDllDirectory API.

 With 2., you still have the issue of DLL hell,


could you clarify a bit -- I thought that this could, at least, put a dir
on the search path that was specific to that python context. So it would
require cooperation among all the packages being used at once, but not get
tangled up with the rest of the system. But maybe I'm wrong here -- I have
no idea what the heck I'm doing with this!
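
For reference, my understanding is that the call in question looks roughly
like this -- a sketch via ctypes, with a hypothetical directory:

import ctypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
# add one extra directory to the DLL search path -- note that only a
# single directory can be set this way:
kernel32.SetDllDirectoryW(r"C:\MyPython\shared-dlls")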

which may be resolved through naming and activation contexts.


I guess that's what I mean by the above...


 I had a brief chat with Steve where he mentioned that this may be a
 solution, but he was not 100 % sure IIRC. The main drawback of this
 solution is that it won't work when inheriting virtual environments (as you
 can only set a single directory).


no relative paths here? or a path that can be set at run time? or maybe I'm
missing what "inheriting virtual environments" means...


 FWIW, we are about to deploy 2. @ Enthought (where we control the python
 interpreter, so it is much easier for us).


It'll be great to see how that works out, then. I take it this means that
for Canopy, you've decided that statically linking everything is NOT the
way to go. Which is a good data point to have.

Thanks for the update.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-16 Thread Chris Barker
On Sat, May 16, 2015 at 4:13 AM, Paul Moore p.f.mo...@gmail.com wrote:

  Though it's a lot harder to provide a build environment than just the
 lib to
  link too .. Im going to have to think more about that...

 It seems to me that the end user doesn't really have a problem here
 (pip install matplotlib works fine for me using the existing wheel).


Sure -- but that's because Matthew Brett has done a lot of work to make
that happen.

 It's the package maintainers (who have to build the binaries) that
 have the issue because everyone ends up doing the same work over and
 over, building dependencies.


Exactly -- It would be nice if the ecosystem made that easier.


 So rather than trying to address the hard
 problem of dynamic linking, maybe a simpler solution is to set up a
 PyPI-like hosting solution for static libraries of C dependencies?

 It could be as simple as a github project that contained a directory
 for each dependency,


I started that here:

https://github.com/PythonCHB/mac-builds

but haven't kept it up. And Matthew Brett has done most of the work here:

https://github.com/MacPython

not sure how he's sharing the static libs -- but it could be done.

 With a setuptools build plugin you could even just specify your libraries
 in setup.py, and have the plugin download the lib files automatically at
 build time.


Actually, that's a pretty cool idea! You'd need a place to host them --
GitHub is no longer hosting downloads, are they? Though you could probably
use GitHub Pages... (or something else)
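
Something like this is what I imagine such a plugin doing -- a minimal,
untested sketch (the hosting URL, lib, and package names are all made up,
and a real plugin would need per-platform logic):

    # setup.py -- sketch of a build_ext subclass that fetches a pre-built
    # static lib before compiling, then points the extensions at it.
    import os
    import urllib.request  # Python 3; urllib2 on 2.x
    from setuptools import setup, Extension
    from setuptools.command.build_ext import build_ext

    LIB_URL = "https://example.com/static-libs/libpng-1.6.17.a"  # hypothetical

    class fetch_libs_build_ext(build_ext):
        def run(self):
            lib_dir = os.path.join(self.build_temp, "libs")
            if not os.path.isdir(lib_dir):
                os.makedirs(lib_dir)
            lib_path = os.path.join(lib_dir, "libpng.a")
            if not os.path.exists(lib_path):
                urllib.request.urlretrieve(LIB_URL, lib_path)
            # point every extension at the downloaded lib
            for ext in self.extensions:
                ext.library_dirs.append(lib_dir)
            build_ext.run(self)

    setup(
        name="mypkg",  # hypothetical
        version="0.1",
        ext_modules=[Extension("mypkg._png", ["src/_png.c"], libraries=["png"])],
        cmdclass={"build_ext": fetch_libs_build_ext},
    )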


 People add libraries to the
 archive simply by posting pull requests. Maybe the project maintainer
 maintains the actual binaries by running the builds separately and
 publishing them separately, or maybe PRs include binaries


or you use a CI system to build them. Something like this is being done by
a bunch of folks for conda/binstar:

https://github.com/ioos/conda-recipes

is just one example.

 PS The above is described as if it's single-platform, mostly because I
 only tend to think about these issues from a Windows POV, but it
 shouldn't be hard to extend it to multi-platform.


Indeed -- the MacWheels projects are, of course, single-platform, but could
be extended. Though at the end of the day, there isn't much to share
between building libs on different platforms (unless you are using a
cross-platform build tool -- which is why I was trying out gattai for my stuff).

The conda stuff is multi-platform, though in fact you have to write a
separate build script for each platform -- it doesn't really provide
anything to help with that part.

But while these efforts are moving towards removing the need for every
package maintainer to build the deps -- we are now duplicating the effort
of trying to remove duplication of effort :-) -- but maybe just waiting for
something to gain momentum and rise to the top is the answer.

-Chris



Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-16 Thread Chris Barker
On Sat, May 16, 2015 at 10:12 AM, Nick Coghlan ncogh...@gmail.com wrote:

  Maybe, but it's a problem to be solved, and the Linux distros more or
 less
  solve it for us, but OS-X and Windows have no such system built in (OS-X
  does have Brew and macports)

 Windows 10 has Chocalatey and OneGet:

 * https://chocolatey.org/
 *
 http://blogs.msdn.com/b/garretts/archive/2015/01/27/oneget-and-the-windows-10-preview.aspx


Cool -- though I don't think we want the official python to depend on a
third party system, and OneGet won't be available for most users for a
LONG time...

The fact that OS-X users have to choose between fink, macports, homebrew, or
roll-your-own is a MAJOR source of pain for supporting the OS-X community.
More than one way to do it is not the goal.

conda and nix then fill the niche for language independent packaging
 at the user level rather than the system level.


yup -- conda is, indeed, pretty cool.

 I think there is a bit of fuzz here -- cPython, at least, uses the
 operating system provided C/C++ dynamic linking system -- it's not a
 totally independent thing.

 I'm specifically referring to the *declaration* of dependencies here.


sure -- that's my point about the current missing link -- setuptools,
pip, etc, can only declare python-package-level dependencies, not
binary-level dependencies.

My idea is to bundle up a shared lib in a python package -- then, if you
declare a dependency on that package, you've handled the dep issue. The
trick is that a particular binary wheel depends on that other binary wheel
-- rather than the whole package depending on it. (That is, on Linux it
would have no dependency; on OS-X it would -- but then only for the wheel
built for a non-macports build, etc.)

I think we could hack around this by monkey-patching the wheel after it is
built, so it may be worth playing with to see how it works before proposing
any changes to the ecosystem.
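
For the package-level half of this, PEP 508 environment markers can already
express a platform-conditional dependency -- a sketch ("libpng-bundle" is a
made-up package, and note that markers can only select on platform, not on
which binary variant of a wheel is being installed, which is exactly the
missing piece):

    # setup.py fragment -- assumes a setuptools recent enough to support
    # PEP 508 markers in install_requires
    from setuptools import setup

    setup(
        name="mypkg",  # hypothetical
        version="0.1",
        install_requires=[
            # hypothetical package that ships the shared lib
            'libpng-bundle; sys_platform == "darwin"',
        ],
    )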

 And if you are using something like conda you don't need pip or wheels
 anyway!

 Correct, just as if you're relying solely on Linux system packages,
 you don't need pip or wheels. Aside from the fact that conda is
 cross-platform, the main difference between the conda community and a
 Linux distro is in the *kind* of software we're likely to have already
 done the integration work for.


Sure. But the cross-platform thing is BIG -- we NEED pip and wheel because
rpm, or deb, or ... are all platform and distro dependent -- we want a way
for package maintainers to support a broad audience without having to deal
with 12 different package systems.

The key to understanding the difference in the respective roles of pip
 and conda is realising that there are *two* basic distribution
 scenarios that we want to be able to cover (I go into this in more
 detail in
 https://www.python.org/dev/peps/pep-0426/#development-distribution-and-deployment-of-python-software
 ):


hmm -- sure, they are different, but is it impossible to support both with
one system?


 * software developer/publisher - software integrator/service operator
 (or data analyst)
 * software developer/publisher - software integrator - service
 operator (or data analyst)

...

 On the consumption side, though, the nature of the PyPA tooling as a
 platform-independent software publication toolchain means that if you
 want to consume the PyPA formats directly, you need to be prepared to
 do your own integration work.


Exactly! And while Linux system admins can do their own system integration
work, everyday users (and many Windows sys admins) can't, and we shouldn't
expect them to.

And, in fact, the PyPA tooling does support the more casual user much of
the time -- for example, I'm in the third quarter of a Python certification
class -- Intro, Web development, Advanced topics -- and only halfway
through the third class did I run into any problems with sticking with the
PyPA tools.

(except for pychecker -- not being on PyPI :-( )

Many public web service developers are
 entirely happy with that deal, but most system administrators and data
 analysts trying to deal with components written in multiple
 programming languages aren't.


Exactly -- but it's not because the audience is different in their role --
it's because different users need different python packages. The PyPA tools
support pure-python great -- and compiled extensions without deps pretty
well -- but there is a bit of a gap with extensions that require other deps.

It's a 90% (95%) solution... It'd be nice to get it to a 99% solution.

Where it really gets ugly is where you need stuff that has nothing to do
with python -- say a Julia run-time, or ...

Anaconda is there to support that: their philosophy is that if you are
trying to do full-on data analysis with python, you are likely to need
stuff strictly beyond the python ecosystem -- your own Fortran code, numba
(which requires LLVM), etc.

Maybe they are right -- but there is still a heck of a lot of stuff that
you can do and stay 

Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-16 Thread Chris Barker
On Sat, May 16, 2015 at 11:54 AM, Paul Moore p.f.mo...@gmail.com wrote:

  could you clarify a bit -- I thought that this could, at least, put a
 dir on
  the search path that was specific to that python context. So it would
  require cooperation among all the packages being used at once, but not
 get
  tangled up with the rest of the system. but maybe I'm wrong here -- I
 have
  no idea what the heck I'm doing with this!

 Suppose Python adds C:\PythonXY\SharedDLLs to %PATH%. Suppose there's
 a libpng.dll in there, for matplotlib.


I think we all agree that %PATH% is NOT the option! That is the key source
of dll hell on Windows.

I was referring to the SetDllDirectory API. I don't think that gets picked
up by other processes.

from:

https://msdn.microsoft.com/en-us/library/windows/desktop/ms686203%28v=vs.85%29.aspx

It looks like you can add a path, at run time, that gets searched for dlls
before the rest of the system locations. And this does not affect any other
applications. But you'd need to make sure this got run before any of the
affected packages were loaded -- which is probably what David meant by
needing to control the python binary.
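
For the record, you can get at that API from Python via ctypes -- a
Windows-only sketch (the directory is made up), which would have to run
before any of the affected extensions are imported:

    import ctypes
    import os
    import sys

    # hypothetical shared-DLL location inside the Python install
    shared = os.path.join(sys.prefix, "SharedDLLs")

    # Prepends one directory to this process's DLL search path; other
    # processes are unaffected.  Note David's caveat: calling this again
    # *replaces* the directory -- there is only ever one slot.
    ctypes.windll.kernel32.SetDllDirectoryW(shared)

    # ...only now import extension modules that link against DLLs in there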

-Chris




Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-14 Thread Chris Barker
On Thu, May 14, 2015 at 4:41 PM, Robert Collins robe...@robertcollins.net
wrote:

  anyway) you now have three copies of the same lib (but maybe different
  versions) all linked into your executable. Maybe there is no downside to
  that (I haven't had a problem yet), but it seems like a bad way to do it!

 If they are exchanging data structures, it will break at some point.
 Consider libpng; say that you have a handle to the native C struct for
 it in PIL, and you pass the wrapping Python object for it to
 Matplotlib but the struct changed between the version embedded in PIL
 and that in Matplotlib. Boom.

 If you communicate purely via Python objects that get remarshalled
 within each lib its safe (perhaps heavy on the footprint, but safe).


As far as I know, no one tries to pass, say, libpng structure pointers
around between different python packages. You are right, that would be
pretty insane! The best you can do is pass python buffer objects around so
you are not copying data where you don't need to.

But maybe there is a use-case for passing a native lib data structure
around, in which case, yes, you'd really want the lib versions to match. I
suppose if I were to do this, I'd do a run-time check on the lib version
number... not sure how else you could be safe in Python-land.
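
i.e. something along these lines -- sketch only, with made-up module names,
assuming each wrapper exports the lib version it was compiled against:

    import mypkg_a  # hypothetical wrappers around the same C lib
    import mypkg_b

    # refuse to pass native structures between mismatched builds
    if mypkg_a.LIBPNG_VERSION != mypkg_b.LIBPNG_VERSION:
        raise ImportError(
            "mypkg_a and mypkg_b were built against different libpng "
            "versions: %s vs %s"
            % (mypkg_a.LIBPNG_VERSION, mypkg_b.LIBPNG_VERSION)
        )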

So maybe the only real downside is some wasted disk space and memory, which
are pretty cheap these days -- but I still don't like it ;-)

But the linker/run time/whatever can keep track of which version of a given
function is called where?

-CHB


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-14 Thread Chris Barker

  I'm confused -- you don't want a system to be able to install ONE
 version
  of a lib that various python packages can all link to? That's really the
  key use-case for me



 Are we talking about Python libraries accessed via Python APIs, or
 linking to external dependencies not written in Python (including
 linking directly to C libraries shipped with a Python library)?


I, at least, am talking about the latter. For a concrete example: libpng
might be needed by PIL, wxPython, Matplotlib, and who knows
what else. At this point, if you want to build a package of any of these,
you need to statically link it into each of them, or distribute shared libs
with each package -- if you are using them all together (which I do,
anyway) you now have three copies of the same lib (but maybe different
versions) all linked into your executable. Maybe there is no downside to
that (I haven't had a problem yet), but it seems like a bad way to do it!

It's the latter I consider to be out of scope for a language specific
 packaging system


Maybe, but it's a problem to be solved, and the Linux distros more or less
solve it for us, but OS-X and Windows have no such system built in (OS-X
does have Brew and macports)


 - Python packaging dependencies are designed to
 describe inter-component dependencies based on the Python import
 system, not dependencies based on the operating system provided C/C++
 dynamic linking system.


I think there is a bit of fuzz here -- cPython, at least, uses the
operating system provided C/C++ dynamic linking system -- it's not a
totally independent thing.

 If folks are after the latter, then they want a language independent
 package system, like conda, nix, or the system package manager in a Linux
 distribution.


And I am, indeed, focusing on conda lately for this reason -- but not all
my users want to use a whole new system, they just want to pip install
and have it work. And if you are using something like conda you don't need
pip or wheels anyway!


 I'm arguing against supporting direct C level dependencies between
 packages that rely on dynamic linking to find each other rather than
 going through the Python import system,


Maybe there is a middle ground. For instance, I have a complex wrapper system
around a bunch of C++ code. There are maybe 6 or 7 modules that all need to
link against that C++ code. On OS-X (and I think Linux, I haven't been
doing those builds), we can statically link all the C++ into one python
module -- then, as long as that python module is imported before the
others, they will all work, and all use that same already loaded version of
that library.

(this doesn't work so nicely on Windows, unfortunately, so there, we build
a dll, and have all the extensions link to it, then put the dll somewhere
it gets found -- a little fuzzy on those details)

So option (1) for something like libpng is to have a compiled python module
that is little but something that can be linked to libpng, so that it can
be found and loaded by cPython on import, and any other modules can then
expect it to be there. This is a big old kludge, but I think it could be done
with little change to anything in Python or wheel, or... but it would
require changes to how each package that uses that lib sets itself up and
checks for and installs dependencies -- maybe not really possible. And it
would be better if dependencies could be platform independent, which I'm
not sure is supported now.

Option (2) would be to extend python's import mechanism a bit to allow it
to do a "raw link in this arbitrary lib" action, so the lib would not have
to be wrapped in a python module -- I don't know how possible that is, or
if it would be worth it.
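
Option (1) can at least be prototyped today with ctypes -- a sketch of the
__init__.py of a hypothetical "libpng_shim" package that ships the lib and
loads it process-wide:

    # libpng_shim/__init__.py (hypothetical package)
    import ctypes
    import os

    _here = os.path.dirname(os.path.abspath(__file__))
    _libname = "libpng16.dylib"  # .so on Linux, .dll on Windows

    # RTLD_GLOBAL makes the symbols visible to extension modules loaded
    # later in the process (POSIX behaviour; on Windows the mode flag is
    # ignored and DLLs are resolved by name -- the harder case above).
    _libpng = ctypes.CDLL(os.path.join(_here, _libname),
                          mode=ctypes.RTLD_GLOBAL)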


  (Another way of looking at this: if a tool can manage the
 Python runtime in addition to Python modules, it's a full-blown
 arbitrary software distribution platform, not just a Python package
 manager).


Sure, but if it's ALSO a Python package manager, then why not? i.e. conda --
if we all used conda, we wouldn't need pip+wheel.


 Defining cross-platform ABIs (cf. http://bugs.python.org/issue23966)


This is a mess that you need to deal with for ANY binary package -- that's
why we don't distribute binary wheels on PyPI for Linux, yes?


 I'm
 firmly of the opinion that trying to solve both sets of problems with
 a single tool will produce a result that doesn't work as well for
 *either* use case as separate tools can.


I'm going to point to conda again -- it solves both problems, and it's
better to use it for all your packages than mingling it with pip (though
you CAN mingle it with pip...). So if we say pip and friends are not going
to do that, then we are saying: we don't support a substantial class of
packages -- and then I wonder what the point is of supporting binary packages
at all?

P.S. The ABI definition problem is at least somewhat manageable for
 Windows and Mac OS X desktop/laptop environments


Ah -- here is a key point -- because of 

Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-15 Thread Chris Barker
On Fri, May 15, 2015 at 1:44 PM, Paul Moore p.f.mo...@gmail.com wrote:

  Is there any point? or is the current approach of statically linking all
  third party libs the way to go?

 If someone can make it work, that would be good. But (a) nobody is
 actually offering to develop and maintain such a solution,


Well, it's on my list -- but it has been for a while, so I'm trying to
gauge whether it's worth putting at the top of my things-to-do-for-python
list. It's not at the top now ;-)


 and (b)
 it's not particularly clear how *much* of a benefit there would be
 (space savings aren't that important, ease of upgrade is fine as long
 as everything can be upgraded at once, etc...)


Hmm -- that may be a trick, though not an uncommon one in python package
dependencies -- it may be hard to have more than one version of a given lib
installed.

 If so, then is there any chance of getting folks to conform to this
 standard
  for PyPi hosted binary packages anyway? i.e. the curation problem.

 If it exists, and if there's a benefit, people will use it.


OK -- that's encouraging...


  Personally, I'm on the fence here -- I really want newbies to be able to
  simply pip install as many packages as possible and get a good result
 when
  they do it.

 Static linking gives that on Windows FWIW. (And maybe also on OSX?)
 This is a key point, though - the goal shouldn't be use dynamic
 linking but rather make the user experience as easy as possible. It
 may even be that the best approach (dynamic or static) differs
 depending on platform.


True -- though we also have another problem -- that static linking solution
is actually a big pain for package maintainers -- building and linking the
dependencies the right way is a pain -- and now everyone that uses a given
lib has to figure out how to do it. Giving folks a dynamic lib they can use
would make it easier for them to build their packages -- a nice benefit
there.

Though it's a lot harder to provide a build environment than just the lib
to link to... I'm going to have to think more about that...


  On the other hand, I've found that conda better supports this right now,
 so
  it's easier for me to simply use that for my tools.

 And that's an entirely reasonable position. The only problem (if
 indeed it is a problem) is that by having two different solutions
 (pip/wheel and conda) splits the developer resource, which means that
 neither approach moves forward as fast as a combined approach does.


That's not the only problem -- the current split between the (more than
one) scientific python distributions, and the community of folks using
python.org and PyPI, creates a bit of a mess for newbies.

I'm reviving this conversation because I just spent a class lecture in a
python class on numpy/scipy -- these students have been using a python
install for months, using virtualenv, pip installing whatever they need, etc.

And now, to use another lib, they have to go through machinations, maybe
even installing an entire additional python. This is not good. And I've had
to help more than one student untangle a mess of Apple Python, python.org
python, homebrew, and/or Anaconda -- for someone that doesn't really get
python packaging, never mind PATHs, and .bashrc vs .bash_profile, etc., it's
an unholy mess.

There should be one-- and preferably only one --obvious way to do it. --
HA!


But that's OK if the two solutions are addressing different needs


The needs aren't really that different, however. Oh well.

Anyway, it seems like if I can find some time to prototype what I have in
mind, there may be some room to make it official if it works out. If anyone
else wants to help -- let me know!

-Chris




Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-17 Thread Chris Barker
On Sun, May 17, 2015 at 5:12 PM, Robert Collins robe...@robertcollins.net
wrote:

   But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ???
  
   I see pip has handling the dependency resolution, and finding and
 downloading of packages part of the problem -- conda does those already.
  
   So what would using pip inside a conda build script buy you that using
 setuptools does not?
 
  Indirection via pip injects the usage of setuptools even for plain
 distutils projects, and generates
 https://www.python.org/dev/peps/pep-0376/ compliant metadata by default.
 
  However, looking at the current packaging policy, I think I
 misremembered the situation - it looks like we *discussed* recommending
 indirection via pip  attaining PEP 376 compliance, but haven't actually
 moved forward with the idea yet. That makes sense, since pursuing it would
 have been gated on ensurepip, and the Python 3 migration has been higher
 priority recently.

 That glue is actually very shallow...I think we should rip it out of pip
 and perhaps put it in setuptools. It's about building, not installing.


+1 -- and rip out setuptools installing of dependencies, while we're at it
:-)

(OK, I know we can't do that...)

But is the upshot that using pip to install won't buy anything over
setuptools right now? (except for packages that aren't already using
setuptools...)

-Chris





Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-17 Thread Chris Barker
Trying to keep this brief, because the odds of my finding time to do much
with this are slim..

 I'm not proposing that we drop it -- just that we push pip and wheel a
bit farther to broaden the supported user-base.

 I can't stop you working on something I consider a deep rabbithole,

No -- but I do appreciate your assessment of how deep that hole is -- you
certainly have a whole lot more background with all this than I do -- I
could well be being very naive here.

 but why not just recommend the use of conda, and only publish sdists on
 PyPI? conda needs more users and contributors seeking better integration
 with the PyPA tooling, and minimising the non-productive competition.

I essentially wear two hats here:

1) I produce software built on top of the scientific python stack, and I
want my users to have an easy experience with installing and running my
code. For that -- I am going the conda route. I'm not there yet, but close
to being able to say:

a) install Anaconda
b) add my binstar channel to your conda environment
c) conda install my_package

The complication here is that we also have a web front end for our
computational code, and it makes heavy use of all sorts of web-oriented
packages that are not supported by Anaconda or, for the most part, the
conda community (binstar). My solution is to make conda packages myself of
those and put them in my binstar channel. The other option is to pip
install those packages, but then you get pretty tangled up in dependencies
and conda environments vs virtual environments, etc...

2) Hat two is an instructor for the University of Washington Continuing
Education Program's Python Certification. In that program, we do very
little with the scipy stack, but have an entire course on web development.
And the instructor of that class, quite rightly, pushes the standard of
practice for web developers: heavy use of virtualenv and pip.

Oh, and hat (3) is a long-time pythonista, who, among other things, has
been working for years to make python easier to use on the Mac for
folks that don't know or care what the unix command line is.

I guess the key thing here for me is that I don't see pushing conda to
budding web developers -- but what if web developers have the need for a
bit of the scipy stack? or???

We really don't have a good solution for those folks.

 The web development folks targeting Linux will generally be in a position
 to build from source (caching the resulting wheel file, or perhaps an
 entire container image).

Again, I'm not concerned about linux -- it's an ABI nightmare, so we really
don't want to go there, and its users are generally more sophisticated, so a
little building is not a big deal.


 It's also worth noting that one of my key intended use cases for metadata
 extensions is to publish platform specific external dependencies in the
 upstream project metadata, which would get us one step closer to fully
 automated repackaging into policy compliant redistributor packages.


Honestly, I don't follow this! -- but I'll keep an eye out for it - sounds
useful.


 The existence of tight ABI coupling between components both gives the
 scientific Python stack a lot of its power, *and* makes it almost as hard
 to distribute in binary form as native GUI applications.


I think harder, actually  :-)

* No one else seems to think it's worth trying to extend the PyPA
ecosystem a bit more to better support dynamic libs. (except _maybe_
Enthought?)

 I know Donald is keen to see this, and a lot of ideas become more feasible
 if(/when?) PyPI gets an integrated wheel build farm. At that point, we can
 use the centre of gravity approach by letting the build farm implicitly
 determine the standard version of particular external dependencies, even
 if we can't communicate those versions effectively in the metadata.

That's more what I'm thinking, yes.

 * I still think it can be done with minimal changes, and hacked in to do
the proof of concept

I'm still not clear on what it is. I've been pointing out how hard it is
to do this right in the general case, but I get the impression you're
actually more interested in the narrower case of defining a SciPy ABI
that encompasses selected third party binary dependencies.

I wouldn't say a "SciPy ABI" -- that, in a way, is already being done -- folks
are coordinating the official binaries of at least the core scipy stack
-- it's a pain -- no Windows wheels for numpy, for instance (though I think
they are close).
My interest is actually taking it beyond that -- honestly in my case there
are only a handful of libs that I'm aware of that get common use, for
instance libfreetype and libpng in wxPython, PIL, matplotlib, etc.

If I were only SciPy focused -- conda would be the way to go. That's part
of the problem I see -- there are split communities, but they DO overlap. I
think it's a disservice to punt these issues off to individual
sub-communities to address on their own.

 * But I'm not sure it's something that's going to 

Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-17 Thread Chris Barker
On Sun, May 17, 2015 at 12:05 AM, Nick Coghlan ncogh...@gmail.com wrote:

% pip install --upgrade pip
% pip install some_conda_package

 This gets the respective role of the two tools reversed - it's like my
 asking for pip install some_fedora_rpm to be made to work.


I agree here -- I was thinking there was some promise in a
conda_package_to_wheel converter, though. It would, of course, only work for
a subset of conda packages, but would be nice.

The trick is that conda packages for the hard-to-build python packages (the
ones we care about) often (always?) depend on conda packages for dynamic
libs, and pip+wheel have no support for that.

And this is a trick, because while I have some ideas for supporting
just-for-python dynamic libs, conda's are not just-for-python -- so that
might be hard to mash together.

Continuum has a bunch of smart people, though.

However, having conda use pip install in its build scripts so that
 it reliably generates pip compatible installation metadata would be a
 possibility worth discussing - that's what we've started doing in
 Fedora, so that runtime utilities like pkg_resources can work
 correctly.


Hmm -- that's something to look into -- you can put essentially anything
into a conda build script -- so this would be a matter of convention,
rather than tooling. (of course the conventions used by Continuum for the
official conda packages are the standard).

But I'm confused as to the roles of pip vs setuptools, vs wheel, vs ???

I see pip as handling the dependency resolution, and the finding and
downloading of packages, part of the problem -- conda does those already.

So what would using pip inside a conda build script buy you that using
setuptools does not?

And would this be the right incantation to put in a build script:

pip install --no-deps ./

(if you are in the package's main dir -- next to setup.py)

-Chris




Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-18 Thread Chris Barker
On Mon, May 18, 2015 at 11:21 AM, Paul Moore p.f.mo...@gmail.com wrote:

 On 18 May 2015 at 18:50, David Mertz dme...@continuum.io wrote:
% pip install conda
% conda install scientific_stuff
% conda install --sdist django_widget   # we know to look on PyPI

 But that doesn't give (Windows, mainly) users a solution for things
 that need a C compiler, but aren't provided as conda packages.


Conda provides (or can provide) a C compiler (some versions of gcc). It was
buggy last time I checked, but it's doable.


 My honest view is that unless conda is intending to replace pip and
 wheel totally, you cannot assume that people will be happy to use
 conda alongside pip (or indeed, use any pair of independent packaging
 tools together - people typically want one unified solution). And if
 the scientific community stops working towards providing wheels for
 people without compilers because you can use conda, there is going
 to be a proportion of the Python community that will lose out on some
 great tools as a result.


Exactly -- this idea that there are two (or more) non-overlapping
communities is pretty destructive.

-Chris





Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-18 Thread Chris Barker
A member of the conda dev team could answer this better than I, but I've
used it enough to _think_ I understand the basics:

On Mon, May 18, 2015 at 3:30 AM, Paul Moore p.f.mo...@gmail.com wrote:

 One way forward in terms of building wheels is to use any build
 process you like to do an isolated build (I think it's --root that
 distutils uses for this sort of thing) and then use distlib to build a
 wheel from the resulting directory structure (or do it by hand, it's
 not much more than a bit of directory rearrangement and zipping things
 up).

 That process can be automated any way you like - although ideally via
 something general, so projects don't have to reinvent the wheel every
 time.


Sure -- you can put virtually anything in a conda build script. What conda
build does is more or less:

* set up an isolated environment with some handy environment variables for
things like the python interpreter, etc.

* run your build script

* package up whatever got built.

If processes like conda then used wheels as their input for building
 packages, the wheels could *also* be published


I'm not sure it's any easier to build a wheel, then make a conda package
out of it, than to build a conda package and then make a wheel out of it.
Or have your build script build a wheel, and then independently build a
conda package.

In any case, the resulting wheel would depend on an environment like the
one set up by conda build -- and that is an environment with all the
dependencies installed -- which is where this gets ugly.

[remember, making the wheel itself is the easy part]
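
e.g., per the distlib docs, zipping up an already-built tree is roughly
this (untested sketch; the paths are placeholders, and it assumes the build
step left PEP 376 style metadata in the purelib dir):

    from distlib.wheel import Wheel

    w = Wheel()
    w.name = "mypkg"    # hypothetical
    w.version = "0.1"
    w.dirname = "dist"  # where the .whl file gets written

    # keys follow the install scheme: point them at whatever the
    # isolated build produced
    w.build({
        "purelib": "build/root/lib/python2.7/site-packages",
        "scripts": "build/root/bin",
    })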


 not least, does the
 way conda uses shared libraries make going via wheels impossible (or
 at least make the wheels unusable without conda's support for
 installing non-Python shared libraries)?


Pretty much, yes. conda provides a way to package up and manage arbitrary
stuff -- in this case, that would be non-python dependencies -- i.e. shared
libs.

So you can say that my_python_package depends on this_c_lib, and as long as
you, or someone else has made a conda package for this_c_lib, then all is
well.

But python, setuptools, pip, wheel, etc. don't have a way to handle that
shared lib as a dependency -- no standard place to put it, no way to
package it as a wheel, etc.

So the way to deal with this with wheels is to statically link everything.
But that's not how conda packages are built, so there is no way to leverage
conda here.

We need to remember what leveraging conda would buy us:

conda doesn't actually make it any easier to build anything -- you need a
platform-specific build script to build a conda package.

conda does provide a way to manage non-python dependencies -- but that
doesn't buy you anything unless you are using conda to manage your system
anyway.

conda DOES provide a community of people figuring out how to build complex
packages, and building them, and putting them up for public dissemination.

So the thing that leveraging conda can do is reduce the need for a lot of
duplicated effort. And that effort is almost entirely about those third
party libs -- after all, a compiled extension that has no dependencies is
easy to build and put on PyPI. (OK, there is still a bit of duplicated
effort in making the builds themselves on multiple platforms -- but with CI
systems, that's not huge)

An example:

I have a complex package that not only depends on all sorts of hard-to-build
python packages, but also has its own C++ code that depends on the netcdf4
library. Which, in turn, depends on the hdf5 lib, which depends on libcurl,
and zlib, and (I think) one or two others.

Making binary wheels of this requires me to figure out how to build all
those deps on at least two platforms (Windows being the nightmare, but OS-X
is not trivial either, if I want it to match the python.org build and
support older OS versions than I am running).

Then I could have a nice binary wheel that my users can pip install and
away they go. But:

1) They also need the Py_netCDF4 package, which may or may not be easy to
find. If not -- they need to go through all that build hell. Then they have
a package that is using a bunch of the same shared libs as mine -- and
hopefully no version conflicts...

2) my package is under development -- what I really want is for it to be
easy for my users to build from source, so they can keep it all up to date
from the repo. Now they need to get a development version of all those libs
up and running on their  machines -- a heavy lift for a lot of people.

So now -- "use Anaconda" is a pretty good solution -- it provides all the
libs I need, and someone else has figured out how to build them on the
platforms I care about.

But it would be nice if we could find a way for the standard python
toolchain could support this.

NOTE: one thing someone suggested on this list was to provide (outside of
PyPI + pip) a set of static libs all built and configured, so that I could
say: install these libs from some_other_place, then you can build and run my
code --

Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-18 Thread Chris Barker
On Mon, May 18, 2015 at 10:50 AM, David Mertz dme...@continuum.io wrote:

 This pertains more to the other thread I started, but I'm sort of becoming
 convinced--especially by Paul Moore's suggestion there--that the better
 approach is to grow conda (the tool) rather than shoehorn conda packages
 into pip.


I agree -- in some sense conda is pip+more, you couldn't do it without
growing pip (see the other thread...)


 So it might make sense to actually allow folks to push conda to budding
 web developers, if conda allowed installation (and environment management)
 of sdist packages on PyPI.  So perhaps it would be good if *this* worked:

   % pip install conda
   % conda install scientific_stuff
   % conda install --sdist django_widget   # we know to look on PyPI


so a point / question here:

you can, right now, run pip from inside a conda environment python, and for
the most part, it works -- certainly for sdists. I'm actually doing that a
lot, and so are others.

But it gets messy when you have two systems trying to handle dependencies
-- pip may not realize that conda has already installed something, and vice
versa. So it's really nicer to have one package manager.

But maybe all you really need to do is teach conda to understand pip
metadata, and/or make sure that conda writes pip-compatible metadata.

Then a user could do:

conda install some_package

and conda would look in all its normal places for some_package, and if it
didn't find it, it would try running pip install under the hood.

The user wouldn't know, or have to know, where the package came from
(though conda might want to add that to the meta-data for use come upgrade
time, etc.)
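
Behaviourally, something like this -- purely illustrative, this is not how
conda works today, and a real integration would have to reconcile the two
tools' metadata:

    import subprocess

    def install(package):
        # try conda's own repos first; fall back to pip / PyPI
        if subprocess.call(["conda", "install", "--yes", package]) != 0:
            subprocess.call(["pip", "install", package])

    install("some_package")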

In short -- make it easy for conda users to use pip / PyPI packages.

Note: there have been various threads about this on the Anaconda list
lately. The current plan is to have a community binstar channel that
mirrors as much of PyPI as possible. Until we have an automated way to grab
PyPI packages for conda -- this isn't a bad stop gap.

Also note that conda can often (but not always) build a conda package from
PyPI automagically -- someone could potentially run a service that does
that.

On Mon, May 18, 2015 at 3:17 AM, Paul Moore p.f.mo...@gmail.com wrote:

 Agreed. My personal use case is as a general programmer (mostly
 sysadmin and automation type of work) with some strong interest in
 business data analysis and a side interest in stats.

 For that sort of scenario, some of the scipy stack (specifically
 matplotlib and pandas and their dependencies) is really useful. But
 conda is *not* what I'd use for day to day work, so being able to
 install via pip is important to me.


What if "conda install" did work for virtually all PyPI packages? (one way
or the other) -- would you use and recommend Anaconda (or miniconda) then?


 It should be noted that installing
 via pip *is* possible - via some of the relevant projects having
 published wheels, and the rest being available via Christoph Gohlke's
 site either as wheels or as wininsts that I can convert. But that's
 not a seamless process, so it's not something I'd be too happy
 explaining to a colleague should I want to share the workload for that
 type of thing.


Right -- that could be made better right now -- or soon. Gohlke's packages
can't be simply put up on PyPI for licensing reasons (he's using the Intel
math libs). But some folks are working really hard on getting a numpy wheel
that will work virtually everywhere, and still give good performance for
numerics. From there, the core SciPy stack should follow (it's already on
PyPI for OS-X).

Which is a GREAT move in the right direction, but doesn't get us quite to
where PyPI can support the more complex packages.

-Chris



Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-15 Thread Chris Barker
On Fri, May 15, 2015 at 1:49 AM, Paul Moore p.f.mo...@gmail.com wrote:

 On 14 May 2015 at 19:01, Chris Barker chris.bar...@noaa.gov wrote:
  Ah -- here is the issue -- but I think we HAVE pretty much got what we
  need here -- at least for Windows and OS-X. It depends what you mean by
  curated, but it seems we have a (defacto?) policy for PyPI: binary wheels
  should be compatible with the python.org builds. So while each package
  wheel is supplied by the package maintainer one way or another, rather
  than by a central entity, it is more or less curated -- or at least
  standardized. And if you are going to put a binary wheel up, you need to
  make sure it matches -- and that is less than trivial for packages that
  require a third party dependency -- but building the lib statically and
  then linking it in is not inherently easier than doing a dynamic link.

 I think the issue is that, if we have 5 different packages that depend
 on (say) libpng, and we're using dynamic builds, then how do those
 packages declare that they need access to libpng.dll?


This is the missing link -- it is a binary build dependency, not a package
dependency -- so not so much that matplotlib-1.4.3 depends on libpng.x.y,
but that:


matplotlib-1.4.3-cp27-none-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl

depends on:

libpng-x.y

(all those binary parts will come from the platform)

That's what's missing now.

 And on Windows, where does the user put libpng.dll so that it gets picked
 up?


Well, here is the rub -- Windows dll hell really is hell -- but I think it
goes into the python dll search path (sorry, not on a
Windows box where I can really check this out right now). It can work -- I
know we have an in-house product that has multiple python modules sharing a
single dll somehow.



 And how
 does a non-expert user do this (put it in $DIRECTORY, update your
 PATH, blah blah blah doesn't work for the average user)?


That's why we may need to update the tooling to handle this -- I'm not
totally sure if the current wheel format can support this on Windows --
though it can on OS-X.

In particular, on Windows, note that the shared DLL must either be in
 the directory where the executable is located (which is fun when you
 have virtualenvs, embedded interpreters, etc), or on PATH (which has
 other implications - suppose I have an incompatible version of
 libpng.dll, from mingw, say, somewhere earlier on PATH).


that would be dll hell, yes.


 The problem isn't so much defining a standard ABI that shared DLLs
 need - as you say, that's a more or less solved problem on Windows -
 it's managing how those shared DLLs are made available to Python
 extensions. And *that* is what Unix package managers do for you, and
 Windows doesn't have a good solution for (other than bundle all the
 dependent DLLs with the app, or suffer DLL hell).


exactly -- but if we consider the python install to be the app, rather
than an individual python bundle, then we _may_ be OK.

PS For a fun exercise, it might be interesting to try breaking conda -


Windows really is simply broken [1] in this regard -- so I'm quite sure you
could break conda -- but it does seem to do a pretty good job of not being
broken easily by common uses -- I can't say I know enough about Windows dll
finding or conda to know how...

Oh, and conda is actually broken in this regard on OS-X at this point -- if
you compile your own extension in an anaconda environment, it will find a
shared lib at compile time that it won't find at run time. The conda
install process fixes these up, but that's a pain during development --
i.e. you don't want to have to actually install the package with conda to
run a test each time you re-build the dll... (or even change a bit of python
code...)

But in short -- I'm pretty sure there is a way, on all systems, to have a
standard way to build extension modules, combined with a standard way to
install shared libs, so that a lib can be shared among multiple packages.
So the question remains:

Is there any point? or is the current approach of statically linking all
third party libs the way to go?

If so, then is there any chance of getting folks to conform to this
standard for PyPi hosted binary packages anyway? i.e. the curation problem.

Personally, I'm on the fence here -- I really want newbies to be able to
simply pip install as many packages as possible and get a good result
when they do it.

On the other hand, I've found that conda better supports this right now, so
it's easier for me to simply use that for my tools.


-Chris


[1] My take on dll hell:

a) it's inherently difficult -- which is why Linux provides a system
package manager.

b) however, Windows really does make it MORE difficult than it has to be:
  i) it looks first next to the executable
  ii) it also looks on the PATH (rather than a separate DLL_PATH)
  Combine these two, and you have some folks dropping dlls next

Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-14 Thread Chris Barker
On Tue, Apr 14, 2015 at 8:41 AM, Nick Coghlan ncogh...@gmail.com wrote:

 The main two language independent solutions I've identified for this
 general user level package management problem in the Fedora
 Environments  Stacks context
 (
 https://fedoraproject.org/wiki/Env_and_Stacks/Projects/UserLevelPackageManagement
 )
 are conda (http://conda.pydata.org/) and Nix
 (https://nixos.org/nix/about.html),


Cool -- I hadn't seen nix before.


 From a Python upstream perspective, Nix falls a long way behind conda
 due to the fact that Nix currently relies on Cygwin for Windows
 support -


The other thing that's nice about conda is that while it was designed for
the general case, it has a lot of python-specific features. Being a Python
guy -- I like that ;-) -- it may not work nearly as well for Ruby or what
have you -- I wouldn't know.


  The point where I draw the line is supporting *dynamic*
 linking between modules -


I'm confused -- you don't want a system to be able to install ONE version
of a lib that various python packages can all link to? That's really the
key use-case for me


 that's the capability I view as defining the
 boundary between enabling an add-on ecosystem for a programming
 language runtime and providing a comprehensive software development
 platform :)


Well, with its target audience being scientific programmers, conda IS
trying to give you a comprehensive software development platform.

We're talking about Python here -- it's a development tool. It turns out
that for scientific development, pure python is simply not enough -- hence
the need for conda and friends.

I guess this is what it comes down to -- I'm all for adding a few features
to wheel -- it would be nice to be able to pip install most of what I, and
people like me, need. But maybe it's not possible -- you can solve the
shared lib problem, and the scripts problem, and maybe the menu entries
problem, but eventually, you end up with "I want to use numba" -- and then
you need LLVM, etc. -- and pretty soon you are building a tool that
provides a comprehensive software development platform. ;-)


-CHB



Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-14 Thread Chris Barker

 If there’s a plugin that understands this extension
  installed, let it do something before you actually move the files into
  place”. This lets Wheels themselves still be declarative and moves the
  responsibility of implementing these bits into their own PyPI projects
  that can be versioned and independently upgraded and such. We’d probably
  need some method of marking an extension as “critical” (e.g. bail out and
  don’t install this Wheel if you don’t have something that knows how to
 handle
  it) and then non critical extensions just get ignored if we don’t know
  how to handle it.


Could an extension be -- "run this arbitrary Python script"?

We've got a full featured scripting language (with batteries included!) --
isn't that all the extension you need?

Or is this about security? We don't want to let a package do virtually
anything on install?

-CHB





 Right, this is the intent of the Required extension handling
 feature:
 https://www.python.org/dev/peps/pep-0426/#required-extension-handling

 If a package flags an extension as installer_must_handle, then
 attempts to install that package are supposed to fail if the installer
 doesn't recognise the extension. Otherwise, installers are free to
 ignore extensions they don't understand.

 So meta-installers like canopy could add their own extensions to their
 generated wheel files, flag those extensions as required, and other
 installers would correctly reject those wheels as unsupported.

 Cheers,
 Nick.

 --
 Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig






Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-15 Thread Chris Barker
On Tue, Apr 14, 2015 at 8:57 PM, Kevin Horn kevin.h...@gmail.com wrote:

 Personally, I'm not a fan of auto-installing,



 I'm with Paul on this one.  It seems to me that auto-installing the
 extension would destroy most of the advantages of distributing the
 extensions separately.


Exactly -- I actually tossed that one out there because I wanted to know
what folks were thinking, but also as a bit of bait ;-) -- we've got a
conflict here:

1) These possible extensions are potentially dangerous, etc, and should be
well reviewed and not just tossed in there.

2) People (and I'm one of them) really, really want pip install to just
work. (or conda install or enpkg, or...). If it's going to just work,
then it needs to find and install the extensions auto-magically, and then
we're really not very far from running arbitrary code...

Would that be that different from the fact that installing a given package
automatically installs all sorts of other packages -- and most of us don't
give that a good review before running install...

(I just was showing off Sphinx to a class last night -- quite amazing what
gets brought in with a pip install of sphinx (including pytz -- I have no
idea why). But at the end of the day, I don't care much either. I'm
trusting that the Sphinx folks aren't doing something ridiculous or
dangerous.)

Which brings us back to the "review of extensions" thing -- I think it's
less about the end user checking it out and making a decision about it, but
about the package builder doing that. I have a package I want to be easy to
install on Windows -- so I go look for an extension that does the Start
Menu, etc. Indeed, that kind of thing should be part of pip and/or
wheel, but it would probably be more successful if it were done as third
party extensions -- perhaps over the years, the ones that rise to the top
of usefulness can become standards.

-Chris



Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-16 Thread Chris Barker
On Wed, Apr 15, 2015 at 2:23 PM, Paul Moore p.f.mo...@gmail.com wrote:

 In the PEP, there's a concept of optional vs required extensions.
 See https://www.python.org/dev/peps/pep-0426/#required-extension-handling.
 This is crucial - I've no problem if a particular extension is used by
 a project, as long as it's optional. I won't install it, so it's fine.
 It seems to me that pip *has* to ignore missing optional extensions,
 for this reason. Of course, that introduces the converse problem,
 which is how would people who might want that extension to be
 activated, know that a project used it?


Exactly -- we do want pip install to just work...


  But I worry that some people may have a more liberal definition
 of required than I do.


They probably do -- if they want things to just work

We have the same problem with optional dependencies.

For instance, for iPython to work, you don't need much, but if you want the
ipython notebook to work, you need tornado, zeromq, who knows what else.
But people want it to just work -- and just work by default, so you want
all that optional stuff to go in by default.
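
For the dependency half, at least, setuptools extras can already express
this -- though they're opt-in rather than on-by-default. A sketch with
made-up names:

    # setup.py fragment for a hypothetical ipython-like project
    from setuptools import setup

    setup(
        name="myshell",
        version="0.1",
        install_requires=["traitlets"],       # the bare-bones core
        extras_require={
            # pulled in only by: pip install "myshell[notebook]"
            "notebook": ["tornado", "pyzmq"],
        },
    )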

I expect this is the same with wheel installer extensions. To use your
example: people want to do

pip install sphinx

and then have the sphinx-quickstart utility ready to go, by default. So
scripts need to be installed by default.

The trade-off between convenience and control/security is tough.


 Based on the above, it's possibly valid to allow required extensions
 to be auto-installed. It *is* a vector for unexpected code execution,
 but maybe that's OK.


If even required extensions aren't auto installed, then we can just toss
out the whole idea of automatic dependency management. (which I personally
wouldn't mind, actually, but I'm weird that way)

But maybe we need some real use cases to talk about -- I agree with
others in this thread that the Start menu isn't a good example.

-Chris






Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-13 Thread Chris Barker
NOTE: I don't work for any of the companies involved -- just a somewhat
frustrated user... And someone that has been trying for years to make
things easier for OS-X users.

I’m not sure what (3) means exactly. What is a “normal” Python, do you
 modify Python in a way that breaks the ABI but which isn’t reflected in the
 standard ABI tag?


 It could be multiple things. The most obvious one is that, generally,
 cross-platform python distributions will try to be relocatable (i.e. the
 whole installation can be moved and still work). This means they require
 python itself to be built a special way. Strictly speaking, it is not an
 ABI issue, but the result is the same though: you can't use libraries from
 anaconda or canopy on top of a normal python


But why not? -- at least for Anaconda, it's because those libraries likely
have non-python dependencies, which are expected to be installed in a
particular way. And really, this is not particular to Anaconda/Canopy at
all. Python itself has no answer for this issue, and eggs and wheels don't
help. Well, maybe kinda sorta they do, but in a clunky/ugly way: in order
to build a binary wheel with non-python dependencies (let's say something
like libjpeg, for instance), you need to either:
 - assume that libjpeg is installed in a standard place -- really no
solution at all (at least outside of linux)
 - statically link it
 - ship the dynamic lib with the package
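
To make the last two options concrete, here's a minimal setup.py sketch
(the paths are hypothetical, and this is an illustration, not a
recommendation):

from setuptools import setup, Extension

ext = Extension(
    "mypkg._jpeg",
    sources=["src/_jpeg.c"],
    # static link: pass the archive directly, so the linker can't
    # quietly substitute the dynamic lib
    extra_objects=["vendor/libjpeg.a"],
    # OR ship libjpeg.dylib next to the module (OS-X) and let the
    # loader resolve it relative to the extension at import time:
    # extra_link_args=["-Wl,-rpath,@loader_path"],
)

setup(name="mypkg", version="0.1", packages=["mypkg"], ext_modules=[ext])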

For the most part, the accepted solution for OS-X has been to statically
link, but:

 - it's a pain to do. The gnu toolchain really likes to use dynamic
linking, and building a static lib that will run on a
maybe-older-than-the-build-system machine is pretty tricky.

 - now we end up with multiple copies of the same lib in the python
install. There are a handful of libs that are used a LOT. Maybe there is no
real downside -- disk space and memory are cheap these days, but it sure
feels ugly. And I have yet to feel comfortable with having multiple
versions of the same lib linked into one python instance -- I can't say
I've seen a problem, but it makes me nervous.

On Windows, the choices are the same, except that: It is so much harder to
build many of the standard open source libs that package authors are more
likely to do it for folks, and you do get the occasional dll hell issues.

I had a plan to make some binary wheels for OS-X that were not really
python packages, but actually just bundled up libs, so that other wheels
could depend on them. OS-X does allow linking to relative paths, so this
should have been doable, but I never got anyone else to agree this was a
good idea, and I never found the round tuits anyway. And it doesn't really
fit into the PyPI, pip, wheel, etc. philosophy to have dependencies that are
platform dependent and even worse, build-dependent.

Meanwhile, conda was chugging along and getting a lot of momentum in the
Scientific community. And the core thing here is that conda was designed
from the ground up to support essentially anything. This means it supports
python packages that depend on non-python packages, but also supports
packages that have nothing to do with python (Perl, command line tools,
what have you...)

So I have been focusing on conda lately.

Which brings me back to the question: should the python tools (i.e. wheel)
be extended to support more use-cases, specifically non-python
dependencies? Or do we just figure that that's a problem better solved by
projects with a larger scope (i.e. rpm, deb, conda, canopy).

I'm on the fence here. I mostly care about Python, and I think we're pretty
darn close with allowing wheel to support the non-python dependencies,
which would allow us all to simply pip install pretty much anything --
that would be cool. But maybe it's a bit of a slippery slope, and if we go
there, we'll end up re-writing conda.

BTW, while you can't generally install a conda package in/for another
python, you can generally install a wheel in a conda python. There are a
few issues with pip/setuptools trying to resolve dependencies while not
knowing about conda packages, but it does mostly work.

Not sure that helped the discussion -- but I've been wrestling with this
for a while, so thought I'd get my thoughts out there.


-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Beyond wheels 1.0: helping downstream, FHS and more

2015-04-14 Thread Chris Barker
On Tue, Apr 14, 2015 at 9:46 AM, Paul Moore p.f.mo...@gmail.com wrote:

  Could an extension be -- "run this arbitrary Python script"?

 The main point (as I see it) of an extension is that it's
 distributed independently of the packages that use it. So you get to
 decide to use an extension (and by inference audit it if you want)
 *before* it gets run as part of an installation.


OK, I think this is getting clearer to me now -- an Extension is an (I
suppose arbitrary) block of python code, but what goes into the wheel is
not the code, but rather a declarative configuration for the extension (a
sketch of what that might look like is below). Then at install time, the
actual code that runs is separate from the wheel, which gives the end user
greater control, plus these nifty features:


 Extensions get peer
 review by the community, and bad ones get weeded out,



  the independent review and quality control



  there's also portability. And code quality. And
 consistency.


And I'll add that this would promote code re-use and DRY.
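
To make the "declarative configuration" idea above concrete, here is a
purely illustrative sketch of what might live in the wheel's metadata --
loosely modeled on the draft PEP 426/459 "extensions" mapping, with a
made-up schema:

# what ships in the wheel: data naming an extension plus its config;
# the extension *code* is installed separately and interprets this
metadata = {
    "name": "sphinx",
    "version": "1.3",
    "extensions": {
        "python.commands": {  # extension name; the entry is illustrative
            "wrap_console": {"sphinx-quickstart": "sphinx.quickstart:main"},
        },
    },
}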

I'd be much happier installing a project that used a well-known start
 menu manager extension


So where would that code live? And how would it be managed? I'm thinking:
 - a package on PyPI like anything else
 - a specification in install_requires
 - pip auto-installs it (if not already there) when the user goes to
install the wheel.

Is that the idea?

Of course, if the project I want to install makes using the extension
 mandatory for the install to work, I still don't have a real choice -
 I accept the extension or I can't use the code I want -


well, you can't easily auto-install it anyway -- you could still do a
source install, presumably.

but there's an
 extra level of transparency involved. And hopefully most extensions
 will be optional, in practice.


There's a bit to think about in the API/UI here. If an
installation extension is used by a package, and it's specified in
install_requires, then it's going to get auto-magically installed and used
with a regular old pip install. If we are worried about code review and
users being in control of what extensions they use, then how do we make it
obvious that a given extension is in use, but optional, and how to turn it
off if you want?

-CHB



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-19 Thread Chris Barker
On Tue, May 19, 2015 at 4:27 AM, Oscar Benjamin oscar.j.benja...@gmail.com
wrote:


 Surely the best way to manage non-Python shared libs is by
 exposing them as extension modules which can be packaged up on PyPI.
 Then you have dependency resolution for pip, you don't need to worry
 about the OS-specific shared library loading details and ABI
 information can be stored as metadata in the module. It would even be
 possible to load multiple versions or ABIs of the same library as
 differently named Python modules IIUC.


yes, that's what I "proposed" earlier in this thread. I put "proposed" in
quotes because it was buried in the discussion, and not the least bit
fleshed out, but that was the point.

As a case in point numpy packages up a load of C code and wraps a
 BLAS/Lapack library. Many other extension modules are written which
 can all take advantage of the non-Python shared libraries that embody
 numpy via its C API.


Though to be fair -- managing that has been a challenge -- numpy may be
built with different versions of BLAS, and there is no way to really know
what you might be getting... but I think this is solved with the curated
packages approach.


 Is there some reason that this is not considered a good solution?


Well, I've had some issues with getting the linking worked out right so
that modules could use each other's libraries, at least on Windows -- but
that can probably be worked out.

The missing piece in terms of the PyPA infrastructure is that there is no
way to specify a binary dependency in the metadata: we can specify that
this python package depends on some other python package, but not that this
particular binary wheel depends on some other binary wheel. For example,
the same python package (say something like PIL) might use the system libs
when built on Linux, some of the system libs when built for OS-X, and need
all the third party libs when built for Windows. So the OS-X wheel has
different dependencies than the Windows wheel, which has different
dependencies than, say, a conda package of the same lib. Then there are
wheels that might be built to use the homebrew libs on OS-X -- it gets
messy!
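
Just to illustrate the gap, here's the kind of per-platform binary
dependency declaration the metadata has no spelling for today -- the field
name and the tags are entirely made up:

# hypothetical -- no such metadata field exists; shown only to make the
# "different builds have different binary deps" problem concrete
binary_requires = {
    "win_amd64": ["libjpeg-bin", "zlib-bin", "libtiff-bin"],  # ship it all
    "macosx_10_6_intel": ["libjpeg-bin"],  # some libs come with the OS
    "linux_x86_64": [],                    # use the system packages
}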

But that can be hacked around, so that we could give it a try and see how
it works.

The other issue is social: this would really only be a benefit if a wide
variety of packages shared the same libs -- but each of those packages is
maintained by different individuals and communities. So it's hard to know if
it would get used. I could put up a libpng wheel, for instance, and who
knows if the Pillow folks would have any interest in using it? Or the
matplotlib folks, or... And this would be particularly difficult when the
solution was hacked together...

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-19 Thread Chris Barker
On Mon, May 18, 2015 at 8:24 PM, Vincent Povirk madewokh...@gmail.com
wrote:

  But maybe all you really need to do is teach conda to understand pip
  meta-data, and/or make sure that conda write pip-compatible meta data.

 Forgive me, I'm trying to follow as someone who is working with PyPI
 but hasn't really used conda or pip. Does a conda environment contain
 its own site-packages directory,


If python was installed by conda, yes. I get a bit confused here: for one,
I have only used conda with Anaconda, and Anaconda starts you off with a
python environment (for one thing, conda itself is written in Python, so
you need that...).

So if one were to start from scratch with conda, I'm not entirely sure what
you would get. But I _think_ you could run conda with some_random_python,
then use it to set up a conda environment with its own python...

But for the current discussion, yes, a conda environment has its own
site-packages, etc. -- its own complete python install.

and does pip correctly install
 packages to that directory?


Yes, it does.


 If so, I expect supporting PEP 376 would
 help with this.


yes, I think that would help, though conda is about more than python, so it
would still need to manage dependencies and metadata its own way. But I
would think it could avoid duplicating the effort for python packages. I
can't speak for the conda developers, though.
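
For reference, the pip side of this already works off what PEP 376
records: anything that leaves a conforming .dist-info (or .egg-info)
behind shows up to the usual tools. A minimal sketch:

# list every distribution visible on sys.path -- however it was
# installed -- as long as it left PEP 376-style metadata behind
import pkg_resources

for dist in sorted(pkg_resources.working_set, key=lambda d: d.project_name):
    print(dist.project_name, dist.version)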

It doesn't help either package manager install dependencies from
 outside their repos, it just means that pip will work if the user
 installs dependencies from conda first.


and if we can get vice versa to work as well -- things would be easier.


 To be able to install
 dependencies, either conda needs to know enough about PyPI to find a
 package's dependencies itself (and at that point, I wonder how much
 value pip adds compared to 'wheel'),


good point -- and conda does know a fair bit about PyPI already -- there is
a

conda skeleton pypi

command that goes and looks on PyPI for a package, and builds a conda build
script for it automagically -- and it works without modification much of
the time, including dependencies.

So much of the logic is there.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-19 Thread Chris Barker
On Tue, May 19, 2015 at 9:15 AM, David Cournapeau courn...@gmail.com
wrote:

 Honestly, I still haven't seen a solid explanation of why (at least on
 Windows) static linking isn't a viable option.


well - it does get us pretty far


 Because some libraries simply don't work as static libraries, or are too
 big (MKL comes to mind). Also, we have been historically using static libs
 for our eggs at Enthought on windows, and it has been a nightmare to
 support. It just does not scale when you have 100s of packages.


there is also the issue of semi-developers -- I want people to be able to
easily build my package, which depends on a bunch of libs that I really want
to be the same versions. I suppose I could deliver the static libs
themselves, along with the headers, etc., but that does get ugly.


 But really, once wheels support arbitrary file locations, this becomes
 fairly easy at the packaging level: the remaining issue is one of ABI /
 standards, but that's mostly a non technical issue.


yup -- social issues are the big one.


 Gohlke has 100s of packages using wheels,


Doesn't he ship the dlls with the packages, even when not using static
libs -- so that multiple packages may include the same dll? Which is almost
the same as a static lib.


 and we ourselves at Enthought have close to 500 packages for windows, all
 packaged as eggs, maybe 30-40 % of which are not python but libs, C/C++
 programs, etc... It is more about agreeing about a common way of doing
 things than a real technical limitation.


good to know -- I suspected as much but haven't tried it yet.

If someone were to
 create and publish a Python compatible static .lib file for the
 various hard-to-build dependencies, extensions could specify that you
 link with it in setup.py, and all the person building the wheel has to
 do is download the needed libraries for the build.


OK, but from a social perspective, this is unlikely to happen (it hasn't
yet!) without some official support on PyPI, with pip, or ???

So even if  static is the way to go -- there is an infrastructure need.

If there's a technical reason why dynamic linking at runtime is better
 than static linking (sufficiently better that it justifies all the
 effort needed to resolve the issues involved),


I'm not sure the issues are that much greater -- we could build binary
wheels that hold dynamic libs, and put them up on PyPI -- then other
package maintainers would need to actually use those -- almost the same
thing as getting a variety of package maintainers to use this mythical
repository of static libs.

(and by the way, gcc makes it remarkably hard to force a static build)

Another issue I've run into is nested static libs -- you have to change
your setup.py in special ways for that to work:

libnetcdf depends on libhdf5, which depends on libcurl and libz and...

getting all that statically linked into my extension is tricky.
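
One of the "special ways" I mean: hand the linker the .a files directly
(so it can't fall back to the dynamic libs) and list them in dependency
order -- a sketch with hypothetical paths:

from setuptools import setup, Extension

ext = Extension(
    "mypkg._netcdf",
    sources=["src/_netcdf.c"],
    # passing archives via extra_objects forces static linking without
    # any -Wl,-Bstatic gymnastics; order matters for the GNU linker --
    # each archive must come before the archives it depends on
    extra_objects=[
        "vendor/libnetcdf.a",
        "vendor/libhdf5_hl.a",
        "vendor/libhdf5.a",
        "vendor/libcurl.a",
        "vendor/libz.a",
    ],
)

setup(name="mypkg", version="0.1", packages=["mypkg"], ext_modules=[ext])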

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-16 Thread Chris Barker
On Sat, May 16, 2015 at 4:16 PM, Donald Stufft don...@stufft.io wrote:

 On Sat, May 16, 2015 at 3:03 PM, Donald Stufft don...@stufft.io wrote:

 There are a few other benefits, but that’s not anything that are inherent
 in the two different approaches, it’s just things that conda has that pip
 is planning on getting,


 Huh? I'm confused -- didn't we just have a big thread about how pip+wheel
 probably ISN'T going to handle shared libs -- that those are exactly what
 conda packages do provide -- aside from R and Erlang, anyway :-)

 but it's not the packages in this case that we need -- it's the
 environment -- and I can't see how pip is going to provide a conda
 environment….


 I never said pip was going to provide an environment, I said the main
 benefit conda has over pip, which pip will most likely not get in any
 reasonable time frame, is that it handles things which are not Python
 packages.


well, I got a bit distracted by Erlang and R -- i.e. things that have
nothing to do with python packages.

libxml, on the other hand, is a lib that one might want to use with a
python package -- so a bit more apropos here.

But my confusion was about "things that conda has that pip is planning on
getting" -- what are those things? Any of the stuff that conda has that's
really useful -- like handling shared libs -- pip is NOT getting, yes?


 A shared library is not a Python package so I’m not sure what this message
 is even saying? ``pip install lxml-from-conda`` is just going to flat out
 break because pip won’t install the libxml2 shared library.


exactly -- if you're going to install a shared lib, you need somewhere to
put it -- and that's what a conda environment provides.

Trying not to go around in circles, but python _could_ provide a standard
place in which to put shared libs -- and then pip _could_ provide a way to
manage them. That would require dealing with that whole binary API problem,
so we probably won't do it. I'm not sure what the point of contention is
here:

I think it would be useful to have a way to manage shared libs solely for
python packages to use -- and it would be useful for that to be part of
the standard python ecosystem. Others may not think it would be useful
enough to be worth the pain in the neck it would be.
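
To make "a standard place" slightly less abstract: if Python blessed, say,
a per-environment lib directory (it does not today -- this is entirely
hypothetical), loading from it explicitly might look like:

# entirely hypothetical convention: pip-managed shared libs living in
# <sys.prefix>/lib -- nothing in Python or pip defines this today
import ctypes
import os
import sys

libpath = os.path.join(sys.prefix, "lib", "libxml2.so.2")
if os.path.exists(libpath):
    libxml2 = ctypes.CDLL(libpath)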

And that's what the nifty conda packages Continuum (and others) have built
could provide -- shared libs that are built in a way compatible with
a given python binary. After all, pure python packages are no problem, compiled
python packages without any dependencies are little problem. The hard part
is those darn third party libs.

conda also provides a way to manage all sorts of other stuff that has
nothing to do with python, but I'm guessing that's not what Continuum
would like to contribute to PyPI.

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-16 Thread Chris Barker
On Sat, May 16, 2015 at 3:03 PM, Donald Stufft don...@stufft.io wrote:

 There are a few other benefits, but that’s not anything that are inherent
 in the two different approaches, it’s just things that conda has that pip
 is planning on getting,


Huh? I'm confused -- didn't we just have a big thread about how pip+wheel
probably ISN'T going to handle shared libs -- that those are exactly what
conda packages do provide -- aside from R and Erlang, anyway :-)

but it's not the packages in this case that we need -- it's the environment
-- and I can't see how pip is going to provide a conda environment

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Help required for setup.py

2015-05-20 Thread Chris Barker
On Tue, May 19, 2015 at 4:12 PM, salil GK gksa...@gmail.com wrote:

   I will provide more details about what I need to achieve

 I need to create a package for a tool that I create. Actually the tool
 that I created is a wrapper over ovftool which is provided by VMWare.
 ovftool install binary is provided as a bundle hence there is no package
 installed in the system ( `dpkg -l`  will not list ovftool package ).
 ovftool will be installed in /usr/bin/ location.

While creating the package I need to check if ovftool is available in
 the system and the version is 4.1.0. If it is not compatible, I need to
 fail the package installation with proper message. So how do I write
 setup.py for achieving the same.


You can put arbitrary python code in setup.py, so before you call setup()
in the file, put something like:

import subprocess

try:
    version = subprocess.check_output(['/usr/bin/ovftool', '--version'])
except (subprocess.CalledProcessError, OSError):
    # CalledProcessError: ovftool ran but exited non-zero;
    # OSError: /usr/bin/ovftool doesn't exist at all
    print("ovftool is not properly installed")
    raise
if not is_this_the_right_version(version):  # your version check goes here
    raise ValueError("ovftool is not the right version")


of course, you'd probably want better error messages, etc, but hopefully
you get the idea.

-CHB







 Thanks
 Salil

 On 19 May 2015 at 07:54, salil GK gksa...@gmail.com wrote:

 Hello

I was trying to create my package for distribution. I have a
 requirement that I need to check if one particular command is available in
 the system ( this command is not installed through a package - but a bundle
 is installed to get the command in the system ). I am using Ubuntu 14.04

 Thanks in advance
 Salil



 ___
 Distutils-SIG maillist  -  Distutils-SIG@python.org
 https://mail.python.org/mailman/listinfo/distutils-sig




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Chris Barker
On Wed, May 20, 2015 at 6:30 AM, Daniel Holth dho...@gmail.com wrote:

 The package includes its build recipe in info/recipe


very cool -- I hadn't seen that -- I'll go take a look at some packages and
see what I can find.

-CHB

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Chris Barker

 The package includes its build recipe in info/recipe


 very cool -- I hadn't seen that -- I'll go take a look at some packages
 and see what I can find.


Darn -- the recipe is not there in most (all?) of the packages that came
from Anaconda -- probably due to the legacy issues David referred to.

And since a conda package is just a tar archive, you can presumably build
them in ways other than with a conda build recipe.

By the way -- libxml is one example of one without a recipe...
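
(Easy to check, since a conda package is just a bzip2'd tarball -- a quick
sketch, with a made-up filename:)

# peek inside a downloaded conda package to see whether the build
# recipe (info/recipe/...) was included; the filename is hypothetical
import tarfile

with tarfile.open("libxml2-2.9.2-0.tar.bz2", "r:bz2") as tf:
    recipe = [n for n in tf.getnames() if n.startswith("info/recipe")]
    print(recipe if recipe else "no recipe included")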

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Chris Barker
On Wed, May 20, 2015 at 1:04 AM, Paul Moore p.f.mo...@gmail.com wrote:

  https://github.com/menpo/conda-recipes/tree/master/libxml2
 
  don't know anything about it.

 OK, I'm still misunderstanding something, I think. As far as I can
 see, all that does is copy a published binary and repack it. There's
 no build instructions in there.


indeed -- that is one way to build a conda package, as you well know!

maybe no one has done a proper build-from-scratch recipe for that one --
or maybe Continuum has, and we'll find out about it from David

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-19 Thread Chris Barker
On Tue, May 19, 2015 at 3:09 PM, Paul Moore p.f.mo...@gmail.com wrote:


 So, for example the process for building the pyyaml package available
 via conda is private?


well, I haven't been able to find them... I don't know if Continuum keeps
them private on purpose, or just hasn't happened to publish them.


 That seems like a
 rather striking downside to conda that I wasn't aware of.


We need to be careful here about what is what:

conda -- a fully open-source package management system
Anaconda -- a python-and-other-stuff distribution produced by Continuum.

How Continuum does or doesn't publish the recipes it used to build
 Anaconda doesn't really have anything to do with conda-the-technology.

On Tue, May 19, 2015 at 3:23 PM, David Mertz dme...@continuum.io wrote:

 It is certainly not our intention at Continuum to keep build recipes
 private.



  I'll add it to my TODO list to work on making sure that those are better
 updated and maintained at https://github.com/conda/conda-recipes.


That would be great!


I will note that most recipes seem to consist of either 'python setup.py
 install' or './configure; make; make install'.


sure -- but those aren't the ones we want ;-)


   So there is quite likely actually little significant work that has
 failed to have been published.  But I'm not sure of pyyaml off the top of
 my head, and how that is built.


see if you can find the wxPython one, while you are at it :-) -- though I
suspect that was built from the official executable, rather than re-built
from scratch.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig

