Re: [Distutils] Working environment

2006-03-11 Thread Jim Fulton
Ian Bicking wrote:
 So, coming back to the idea of a working environment, an isolated and 
 more-or-less self-contained environment for holding installed packages. 
   Sorry if this is a little scattered.  I'm just summarizing my thoughts 
 and the open issues I see, in no particular order (because I'm not sure 
 what order to approach these things in).
 
 I'm assuming such an environment will be encapsulated in a single 
 directory, looking something like:
 
 env/
    bin/
    lib/python2.4/
    src/
    conf/
 
 The conf/ directory doesn't really relate to much of this specifically, 
 but in many situations it would be useful.  Depending on the situation, 
 other subdirectories may exist.

OK, I'll pretend that it doesn't exist. ;) Actually, I'll have a bit to say
about configuration below.

 Each of the scripts in bin/ should know what their working environment 
 is.  This is slightly tricky, depending on what that means.  If it is a 
 totally isolated environment -- no site-packages on sys.path -- then I 
 feel like the script wrappers have to be shell scripts, to invoke Python 
 with -S (which is hard to do portably on the #! line).

I'll note that this isn't important to me at all.  I'm not opposed
to allowing it, but I don't need it.

I never use the python that comes with my system. I always
build Python from source and I can therefore keep that install
clean, complete, and relatively predictable.  (I am often
annoyed by Python's *implicit* decision of which extension
modules to install. Hopefully, someday, the presence of a
packaging system will enable saner approaches to optional
extensions.) In fact, I usually have multiple versions of
Python available.  I will then have many working environments
that use these custom Pythons. (The custom Pythons are not part
of any working environment.)

Similarly, in production environments, we install custom builds
of Python that our applications use.

...

 lib/python2.4/ is for packages. 

Minor note: this needs to be flexible.  I'd be more inclined to go
with something shallower and simpler, like just lib,


  I'm almost inclined to say that
 --single-version-externally-managed makes sense on some level, with a 
 record kept in some standard place (lib/python2.4/install-record.txt?) 
 -- but I'm basically indifferent.  I at least don't see a need for 
 multiple simultaneous versions in this setup, and multiple versions do 
 lead to confusion.  Version metadata is still nice, of course.

Increasingly, projects I work on involve multiple Python applications
with each application requiring different sets of packages and, sometimes,
different package versions.  Basically, I want a single version for each
application, but multiple versions in the environment.  What appeals to me
is a compromise between the single-version and multi-version install.

With a single-version install, you end up with an easy-install.pth
that names the packages (and package versions) being used.  This has the
advantage of being deterministic.  You always get the packages listed
there.  This affects all applications running in the environment.

In a multi-version install, multiple versions are stored in the
environment.  Different applications can use different versions.
When an application is run, a root package is specified in a script
(the package containing the entry point) and pkg_resources tries to find
a suitable combination of packages that satisfy the closure of the root
package's dependencies.  This process is somewhat non-deterministic.
Installation of a new package could change the set of packages used
beyond that individual package, causing implicit and unintended upgrades
that you referred to in your talk at PyCon.

What I want (or think I want :) is per-application (per-script) .pth
files.  When I install a script, I want the packages needed by the script
to be determined at that time and written to a script-specific .pth file.
So, for example, if I install a script foo (foo.exe, whatever), I also
get foo.pth and have the generated foo script prepend the contents of that
file to sys.path on startup.  I don't mind if that file is required
to be in the same directory as the script.  I also wouldn't mind if the
path entries were simply embedded in the script.  If I
want to upgrade the packages being used, I simply update the scripts.
There are various ways that this could be made easy.  Even better,
the upgrade process could optionally make backups of the .pth files
or scripts, allowing easy rollbacks of upgrades.

I think that this approach combines the benefits of single- and multi-version
install.  Different applications in an environment can require different
package versions.  The choice of packages for an application is determined at
discrete explicit install times, rather than at run time.

...


 Installation as a whole is an open issue.  Putting in env/setup.cfg with 
 the setting specific to that working environment works to a degree -- 
 

[Distutils] working-env.py

2006-03-11 Thread Ian Bicking
I wrote a script last night that implements some of the ideas of a 
working environment, similar in goals to virtual-python.py, but it 
doesn't actually work like that. 
http://svn.colorstudy.com/home/ianb/working-env.py

When you run python working-env.py env you get:

env/
   bin/
     activate   (something you source in your shell to activate this environment)
   conf/
   src/
   lib/python2.4/
     distutils/__init__.py
     distutils/distutils.cfg
     setuptools/__init__.py
     setuptools.pth
     site.py


You activate the environment by adding env/lib/python2.4 to your 
PYTHONPATH, and maybe adding env/bin/ to your PATH.

The custom site.py accomplishes most of the work, I guess.  It's just 
like the normal site.py, except it doesn't add site-packages (I should 
probably make an option so you can set it up either way). 
distutils/__init__.py is Ronald Oussoren's suggestion for monkeypatching 
distutils.  It both manages to get the working environment's 
distutils.cfg to be used (comes for free, since distutils.dist looks 
alongside distutils.__file__), and it substitutes __WORKING__ for the 
actual working directory.
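The __WORKING__ substitution could work along these lines.  This is a guess at the mechanism, not working-env.py's actual code; the helper name and the options-dict shape are invented, and a real implementation would hook into how distutils.dist parses its config files.

```python
# Sketch: after distutils reads its config files, replace the
# __WORKING__ placeholder in every option value with the environment
# root, so distutils.cfg can hold environment-relative paths.
def substitute_working(options, working_dir):
    """options: {section: {name: value}} as parsed from distutils.cfg.

    Returns a new mapping with __WORKING__ expanded; the input is
    left untouched.
    """
    out = {}
    for section, opts in options.items():
        out[section] = {
            name: value.replace('__WORKING__', working_dir)
            for name, value in opts.items()
        }
    return out
```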

The setuptools monkeypatch changes how scripts are generated.  But I 
don't like how it works.  It changes scripts to look more like:

   #!/usr/bin/python -S
   import sys, os
   join, dirname = os.path.join, os.path.dirname
   lib_dir = join(dirname(dirname(__file__)),
                  'lib', 'python%s.%s' % tuple(sys.version_info[:2]))
   sys.path.insert(0, lib_dir)
   import site
   ... normal stuff ...

Monkeypatching distutils is fine, because it's not going anywhere, but 
monkeypatching setuptools is not going to be so reliable.  But I guess 
we just need to figure out exactly what we want to do, then move those 
changes to setuptools.  Also, distutils should probably be monkeypatched 
directly in some fashion, so that distutils install also installs 
scripts with this header.

A goal I have -- that is pretty much accomplished so far -- is that the 
working environment be movable, and all paths be relative.  But 
setuptools creates some absolute directory names in .pth files, I believe.

-- 
Ian Bicking  |  [EMAIL PROTECTED]  |  http://blog.ianbicking.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Working environment

2006-03-11 Thread Ian Bicking
Jim Fulton wrote:
 Each of the scripts in bin/ should know what their working environment 
 is.  This is slightly tricky, depending on what that means.  If it is 
 a totally isolated environment -- no site-packages on sys.path -- then 
 I feel like the script wrappers have to be shell scripts, to invoke 
 Python with -S (which is hard to do portably on the #! line).
 
 I'll note that this isn't important to me at all.  I'm not opposed
 to allowing it, but I don't need it.

I can go both ways.  I think this should be configurable when you set up 
the working environment, hopefully someplace where it is easy to change.

 lib/python2.4/ is for packages. 
 
 Minor note: this needs to be flexible.  I'd be more inclined to go
 with something shallower and simpler, like just lib,

Why?  Top-level packages aren't portable, since .pyc files aren't 
portable.  Eggs are portable, since they contain the Python version.

   I'm almost inclined to say that
 --single-version-externally-managed makes sense on some level, with a 
 record kept in some standard place (lib/python2.4/install-record.txt?) 
 -- but I'm basically indifferent.  I at least don't see a need for 
 multiple simultaneous versions in this setup, and multiple versions do 
 lead to confusion.  Version metadata is still nice, of course.
 
 Increasingly, projects I work on involve multiple Python applications
 with each application requiring different sets of packages and, sometimes,
 different package versions.  Basically, I want a single version for each
 application, but multiple versions in the environment.  What appeals to me
 is a compromise between the single-version and multi-version install.

It's probably not important to get rid of an existing feature of 
setuptools.  The problems I have are better served with better tool 
support.  Mostly I get a huge number of old and expired eggs sitting 
around, and there needs to be a collection process, probably a process 
that is run on every installation.  I guess the collection would start 
from all the packages listed in .pth files, and maybe a collection of 
scripts, and then anything those packages require.  And anything left 
over is garbage.

In part I'm personally moving to using setup.py develop installs for 
more software, and that's more naturally single-version (though I 
suppose it doesn't have to be).

 Installation as a whole is an open issue.  Putting in env/setup.cfg 
 with the setting specific to that working environment works to a 
 degree -- easy_install will pick it up if invoked from there.  But 
 that doesn't work with setup.py develop, or setup.py install, or some 
 other scenarios. 
 
 I don't follow this.  It seems to work for us, at least for
 setup.py develop.  The main lamosity is depending on the current working
 directory.

I don't want users to have to give particular options to manage the 
installation.  I want them to activate a specific environment, and for 
everything to just work.  working-env.py mostly works like this.

 This brings me to the topic of configuration.  Today, I write wrapper
 scripts by hand.  I may have some application like Zope, or ZEO,
 or our test runner, that is implemented by an entry point in a module.
 Then there's a wrapper script that imports the module and calls the 
 entry point.
 The wrapper script is written (manually or with some custom installation
 script) to include the path to be used and configuration data,
 which may be the location of a configuration file.  I really like
 the fact that easy_install will generate wrapper scripts for me, but
 I really need more control over how these scripts are generated to
 include *both* path and configuration information.

I'm not sure what to think of this.  I don't think of it as a script. 
It's like a specific invocation of the script.  A shell script.  Maybe 
we can improve on shell scripts, but I think it's a different idea than 
the script alone.



-- 
Ian Bicking  |  [EMAIL PROTECTED]  |  http://blog.ianbicking.org


Re: [Distutils] Jython support?

2006-03-11 Thread Phillip J. Eby
At 01:40 PM 3/11/2006 -0500, Brad Clements wrote:
I would like to use setuptools, eggs with entry points, and so forth
with Jython.

Is this possible?

Not as far as I know.


If not, what's needed?

A release of Jython that's equivalent to Python 2.3 or better, including 
PEP 302 support and the distutils.



Re: [Distutils] working-env.py

2006-03-11 Thread Phillip J. Eby
At 03:22 PM 3/11/2006 -0600, Ian Bicking wrote:
#!/usr/bin/python -S
import sys, os
join, dirname = os.path.join, os.path.dirname
lib_dir = join(dirname(dirname(__file__)),
               'lib', 'python%s.%s' % tuple(sys.version_info[:2]))
sys.path.insert(0, lib_dir)
import site
... normal stuff ...

FYI, I plan to write a proposal next week for script path freezing, along 
the lines of what I talked with you and Jim Fulton at PyCon about.  The 
basic idea is that there'll be a '.pylibs' file alongside the script (e.g. 
'foo.pylibs' alongside a 'foo' script) that lists what should be at the 
front of sys.path.

There are still some details to be worked out, but the basic idea is that 
this would happen in such a way as to ensure that precisely the libraries 
needed by the script would be first on sys.path, so that it would inherit 
default versions from the environment for plugins or dynamic dependencies only.



Re: [Distutils] working-env.py

2006-03-11 Thread Ian Bicking
Phillip J. Eby wrote:
 At 03:22 PM 3/11/2006 -0600, Ian Bicking wrote:
#!/usr/bin/python -S
import sys, os
join, dirname = os.path.join, os.path.dirname
lib_dir = join(dirname(dirname(__file__)),
               'lib', 'python%s.%s' % tuple(sys.version_info[:2]))
sys.path.insert(0, lib_dir)
import site
... normal stuff ...
 
 FYI, I plan to write a proposal next week for script path freezing, 
 along the lines of what I talked with you and Jim Fulton at PyCon 
 about.  The basic idea is that there'll be a '.pylibs' file alongside 
 the script (e.g. 'foo.pylibs' alongside a 'foo' script) that lists what 
 should be at the front of sys.path.

Will this happen before the import of site.py?  I would very much like 
it to.


-- 
Ian Bicking  |  [EMAIL PROTECTED]  |  http://blog.ianbicking.org


Re: [Distutils] Working environment

2006-03-11 Thread Jim Fulton
Ian Bicking wrote:
 Jim Fulton wrote:
 
...
 lib/python2.4/ is for packages. 


 Minor note: this needs to be flexible.  I'd be more inclined to go
 with something shallower and simpler, like just lib,
 
 
 Why?  Top-level packages aren't portable, since .pyc files aren't 
 portable.  Eggs are portable, since they contain the Python version.

I have no idea what you are saying or how it relates to whether or not
packages go in lib/python2.4 or lib.

...

 This brings me to the topic of configuration.  Today, I write wrapper
 scripts by hand,  I may have some application like Zope, or ZEO
 or our test runner that is implemented by an entry point in a module.
 Then there's a wrapper script that imports the module and calls the 
 entry point.
 The wrapper script is written (manually or with some custom installation
 script) to include the path to be used and configuration data,
 which may be the location of a configuration file.  I really like
 the fact that easy_install will generate wrapper scripts for me, but
 I really need more control over how these scripts are generated to
 include *both* path and configuration information.
 
 
 I'm not sure what to think of this.  I don't think of it as a script. 
 It's like a specific invocation of the script.  A shell script.  Maybe
 we can improve on shell scripts, but I think it's a different idea than 
 the script alone.

What "it" are you talking about?

Jim

-- 
Jim Fulton   mailto:[EMAIL PROTECTED]   Python Powered!
CTO  (540) 361-1714http://www.python.org
Zope Corporation http://www.zope.com   http://www.zope.org


Re: [Distutils] Working environment

2006-03-11 Thread Ian Bicking
Jim Fulton wrote:
 Ian Bicking wrote:
 Jim Fulton wrote:

 ...
 lib/python2.4/ is for packages. 


 Minor note: this needs to be flexible.  I'd be more inclined to go
 with something shallower and simpler, like just lib,


 Why?  Top-level packages aren't portable, since .pyc files aren't 
 portable.  Eggs are portable, since they contain the Python version.
 
 I have no idea what you are saying or how it relates to whether or not
 packages go in lib/python2.4 or lib.

lib/foo/__init__.pyc is a file that is specific to a version of Python.
lib/python2.4/foo/__init__.pyc removes any possibility of conflict.
Though I suppose it is arguable that a working environment should only
support one major version of Python.

 This brings me to the topic of configuration.  Today, I write wrapper
 scripts by hand,  I may have some application like Zope, or ZEO
 or our test runner that is implemented by an entry point in a module.
 Then there's a wrapper script that imports the module and calls the 
 entry point.
 The wrapper script is written (manually or with some custom installation
 script) to include the path to be used and configuration data,
 which may be the location of a configuration file.  I really like
 the fact that easy_install will generate wrapper scripts for me, but
 I really need more control over how these scripts are generated to
 include *both* path and configuration information.


 I'm not sure what to think of this.  I don't think of it as a script. 
 It's like a specific invocation of the script.  A shell script.  Maybe
 we can improve on shell scripts, but I think it's a different idea 
 than the script alone.
 
 What "it" are you talking about?

This script+config invocation.

-- 
Ian Bicking  |  [EMAIL PROTECTED]  |  http://blog.ianbicking.org