On Sat, Nov 29, 2014 at 8:37 PM, Nathaniel Smith <n...@pobox.com> wrote:
[...]
> The python-ideas thread did also consider several methods of
> implementing strategy (c), but they're messy enough that I left them
> out here. The problem is that somehow we have to execute code to
> create the new subtype *before* we have an entry in sys.modules for
> the package that contains the code for the subtype. So one option
> would be to add a new rule, that if a file pkgname/__new__.py exists,
> then this is executed first and is required to set up
> sys.modules["pkgname"] before we exec pkgname/__init__.py. So
> pkgname/__new__.py might look like:
>
>     import sys
>     from pkgname._metamodule import MyModuleSubtype
>     sys.modules[__name__] = MyModuleSubtype(__name__, docstring)

> This runs into a lot of problems though. To start with, the 'from
> pkgname._metamodule ...' line is an infinite loop, b/c this is the
> code used to create sys.modules["pkgname"].

As Greg Ewing said – you don't want to import from the package whose
metamodule you're defining. You'd want to do as little work as
possible in __new__.py.

I'd use something like this:

    import types

    class __metamodule__(types.ModuleType):
        def __call__(self):
            return self.main()

where Python would get the attribute __metamodule__ from __new__.py,
and use `__metamodule__(name, doc)` as the namespace in which to
execute __init__.py.
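As a rough sketch of what that would enable (the class name and the
`main` attribute here are illustrative, not part of any proposal):

```python
import types

class CallableModule(types.ModuleType):
    """A ModuleType subclass along the lines of the __metamodule__ sketch."""
    def __call__(self):
        return self.main()

# Roughly what `import mypkg; mypkg()` would amount to under this scheme:
mod = CallableModule("mypkg", "hypothetical docstring")
mod.main = lambda: "hello from mypkg"   # as if __init__.py had defined main()
result = mod()                          # __call__ delegates to mod.main()
```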

> It's not clear where the
> globals dict for executing __new__.py comes from (who defines
> __name__? Currently that's done by ModuleType.__init__).

Well, it could still be in __metamodule__.__init__().

> It only works for packages, not modules.

I don't see a need for this treatment for modules in a package – if
you want `from mypkg import callme`, you can make "callme" a function
rather than a callable module. If you *also* want `from mypkg.callme
import something_else`, I say you should split "callme" into two
differently named things; names are cheap inside a package.
If really needed, modules in a package can use an import hook defined
in the package, or be converted to subpackages.
Single-module projects would be left out, yes – but those can simply
be converted to packages.

> The need to provide the docstring here,
> before __init__.py is even read, is weird.

Does it have to be before __init__.py is read? Can't __init__.py be
compiled beforehand, to get __doc__, and only *run* in the new
namespace?
(Or should __new__.py define import hooks that say how __init__.py
should be loaded/compiled? I don't see a case for that.)
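For instance, the docstring can be pulled out of the source without
executing anything, using the stdlib `ast` module (the source string
below is a made-up stand-in for a real __init__.py):

```python
import ast

# Made-up stand-in for the contents of pkgname/__init__.py:
init_source = '"""Package docstring."""\n\nvalue = 1\n'

# Parse (don't run) the source; the module docstring is available
# from the AST before any code in the file executes.
doc = ast.get_docstring(ast.parse(init_source))
```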

> It adds extra stat() calls to every package lookup.

Fair.

> And, the biggest showstopper IMHO: AFAICT
> it's impossible to write a polyfill to support this code on old python
> versions, so it's useless to any package which needs to keep
> compatibility with 2.7 (or even 3.4). Sure, you can backport the whole
> import system like importlib2, but telling everyone that they need to
> replace every 'import numpy' with 'import importlib2; import numpy' is
> a total non-starter.

I'm probably missing something obvious, but where would this not work?
- As the first thing it does, __init__.py imports the polyfill and
calls polyfill(__name__)
- The polyfill, if running non-recursively* under old Python:
-- compiles __init__.py
-- imports __new__.py to get __metamodule__
-- instantiates metamodule with name, and docstring from compiled code
-- * remembers the instance, to check for recursion later
-- puts it in sys.modules
-- execs __init__ in it
- afterwards the original __init__.py execution continues, filling up
a now-unused module's namespace
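A rough sketch of those steps, under some simplifying assumptions:
MetaModule stands in for the __metamodule__ that would really be
imported from __new__.py, and the __init__ source is passed as a
string where a real polyfill would read pkgname/__init__.py from disk:

```python
import ast
import sys
import types

class MetaModule(types.ModuleType):
    """Hypothetical metamodule type, standing in for __metamodule__."""
    def __call__(self):
        return self.main()

_seen = set()  # remembers which names were handled, to detect recursion

def polyfill(name, init_source):
    if name in _seen:                 # running recursively: nothing to do
        return sys.modules[name]
    _seen.add(name)
    code = compile(init_source, "__init__.py", "exec")  # compile __init__.py
    doc = ast.get_docstring(ast.parse(init_source))     # docstring from source
    mod = MetaModule(name, doc)                         # instantiate metamodule
    sys.modules[name] = mod                             # put it in sys.modules
    exec(code, mod.__dict__)                            # exec __init__ in it
    return mod

# Simulated package whose __init__.py defines main():
pkg = polyfill("demo_pkg", 'def main():\n    return "called"\n')
```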
_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev