Re: [Python-Dev] PEP520 and absence of __definition_order__

2016-09-10 Thread Russell Keith-Magee
On Sun, Sep 11, 2016 at 11:05 AM, Nick Coghlan  wrote:

> On 11 September 2016 at 07:26, Guido van Rossum  wrote:
> > On Sat, Sep 10, 2016 at 10:57 AM, Nick Coghlan 
> wrote:
> >> On 11 September 2016 at 03:08, Guido van Rossum 
> wrote:
> >>> So I'm happy to continue thinking about this, but I expect this is not
> >>> such a big deal as you fear. Anyway, let's see if someone comes up
> >>> with a more convincing argument by beta 2!
> >>
> >> For CPython specifically, I don't have anything more convincing than
> >> Ethan's Enum example (where the way the metaclass works means most of
> >> the interesting attributes don't live directly in the class dict, they
> >> live in private data structures stored in the class dict, making
> >> "list(MyEnum.__dict__)" inherently uninteresting, regardless of
> >> whether it's ordered or not).
> >
> > But that would only matter if we also defined a helper utility that
> > used __definition_order__. I expect that the implementation of Enum
> > could be simplified somewhat in Python 3.6 since it can trust that the
> > namespace passed into __new__ is ordered (so it doesn't have to switch
> > it to an OrderedDict in __prepare__, perhaps).
> >
> > In any case the most likely way to use __definition_order__ in general
> > was always to filter its contents through some other condition (e.g.
> > "isn't a method and doesn't start with underscore") -- you can do the
> > same with keys(). Classes that want to provide a custom list of
> > "interesting" attributes can provide that using whatever class method
> > or attribute they want -- it's just easier to keep those attributes
> > ordered because the namespace is always ordered.
>
> For example, it's already possible to expose order information via
> __dir__; consumers of the information just have to bypass the implicit
> sorting applied by the dir() builtin:
>
>   >>> class Example:
>   ... def __dir__(self):
>   ... return "first second third fourth".split()
>   ...
>   >>> dir(Example())
>   ['first', 'fourth', 'second', 'third']
>   >>> Example().__dir__()
>   ['first', 'second', 'third', 'fourth']
>
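A minimal sketch of the filtering approach Guido describes, assuming CPython 3.6+ (where the class body executes in an ordered namespace, so vars(cls) preserves definition order); the class and names here are illustrative:

```python
# Sketch only - assumes CPython 3.6+, where class bodies are evaluated
# in an ordered namespace, so vars(cls) preserves definition order.
class Example:
    first = 1
    second = 2
    third = 3

    def method(self):
        pass

# Filter the ordered class dict: skip dunders and methods, keep the
# "interesting" attributes in definition order.
interesting = [name for name, value in vars(Example).items()
               if not name.startswith('__') and not callable(value)]
print(interesting)  # ['first', 'second', 'third']
```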
> You've persuaded me that omitting __definition_order__ is the right
> thing to do for now, so the last thing I'm going to do is to
> explicitly double check with the creators of a few interesting
> alternate implementations (MicroPython, VOC for JVM environments,
> Batavia for JavaScript environments) to see if this may cause them
> problems in officially implementing 3.6 (we know PyPy will be OK,
> since they did it first).
>
> VOC & Batavia *should* be OK (worst case, they return
> collections.OrderedDict from __prepare__ and also use it for __dict__
> attributes), but I'm less certain about MicroPython (since I don't
> know enough about how its current dict implementation works to know
> whether or not they'll be able to make the same change PyPy and
> CPython did).
>
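The OrderedDict fallback mentioned for VOC and Batavia can be sketched as a metaclass. This is an illustrative sketch only, not code from any of those projects; the names are hypothetical:

```python
# Sketch of the fallback: an implementation without ordered dicts can
# still provide an ordered class-body namespace via __prepare__, and
# record definition order explicitly.
from collections import OrderedDict

class OrderedMeta(type):
    @classmethod
    def __prepare__(mcs, name, bases, **kwds):
        # Hand out an OrderedDict for the class body to execute in.
        return OrderedDict()

    def __new__(mcs, name, bases, namespace, **kwds):
        cls = super().__new__(mcs, name, bases, dict(namespace))
        # Record body-level names in the order they were defined.
        cls._definition_order = [k for k in namespace
                                 if not k.startswith('__')]
        return cls

class Demo(metaclass=OrderedMeta):
    zebra = 1
    apple = 2

print(Demo._definition_order)  # ['zebra', 'apple']
```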

From the perspective of VOC and Batavia: As Nick notes, there may be some
changes needed to use OrderedDict (or a native analog) in a couple of places,
but other than that, it doesn’t strike me as a change that will pose any
significant difficulty.

Yours,
Russ Magee %-)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Bug in build system for cross-platform builds

2016-03-11 Thread Russell Keith-Magee
On Sat, Mar 12, 2016 at 6:38 AM, Martin Panter <vadmium...@gmail.com> wrote:

> Hi Russell. Sorry for the minor ~1 month delay in replying :)
>
> I have been doing some experimenting to see what is involved in
> cross-compiling Python (Native host = Linux, target = Windows via
> mingw and some patches). So I have a slightly better understanding of
> the problem than before.
>
> On 16 February 2016 at 01:41, Russell Keith-Magee
> <russ...@keith-magee.com> wrote:
> > In order to build for a host platform, you have to compile for a local
> > platform first - for example, to compile an iOS ARM64 binary, you have to
> > compile for OS X x86_64 first. This gives you a local platform version of
> > Python you can use when building the iOS version.
> >
> > Early in the Makefile, the variable PYTHON_FOR_BUILD is set. This points at
> > the CPU-local version of Python that can be invoked, which is used for
> > module builds, and for compiling the standard library source code. This is
> > set by —host and —build flags to configure, plus the use of CC and LDFLAGS
> > environment variables to point at the compiler and libraries for the
> > platform you’re compiling for, and a PATH variable that provides the local
> > platform’s version of Python.
>
> So far I haven’t succeeded with my MinGW cross build and am
> temporarily giving up due to incompatibilities. But my attempts looked
> a bit like this:
>
> make clean  # Work around confusion with existing in-source build
> mkdir native
> (cd native/ && ../configure)
> make -C native/ Parser/pgen
> mkdir mingw
> (cd mingw/ && ../configure --host=i486-mingw32 --build=x86)
> make -C mingw/ PGEN=../native/Parser/pgen
>
> Actually it was not as smooth as the above commands, because pgen
> tends to get overwritten with a cross-compiled version. Perhaps we
> could add a PGEN_FOR_BUILD override, like HOSTPGEN in the patch used at
> <https://wayback.archive.org/web/20160131224915/http://randomsplat.com/id5-cross-compiling-python-for-embedded-linux.html>.
>
> That might fix the pgen problem, but _freeze_importlib still remains. I
> suppose the same thing might be possible for _freeze_importlib as well…

> > There are two places where special handling is required: the compilation and
> > execution of the parser generator, and _freeze_importlib. In both cases, the
> > tool needs to be compiled for the local platform, and then executed.
> > Historically (i.e., Py3.4 and earlier), this has been done by spawning a
> > child MAKE to compile the tool; this runs the compilation phase with the
> > local CPU environment, before returning to the master makefile and executing
> > the tool. By spawning the child MAKE, you get a “clean” environment, so the
> > tool is built natively. However, as I understand it, it causes problems with
> > parallel builds due to race conditions on build rules. The change in
> > Python3.5 simplified the rule so that child MAKE calls weren’t used, but
> > that means that pgen and _freeze_importlib are compiled for ARM64, so they
> > won’t run on the local platform.
>
> You suggest that the child Make command happened to compile pgen etc
> natively, rather than with the cross compiler. But my understanding is
> that when you invoke $(MAKE), all the environment variables, configure
> settings, etc, including the cross compiler, would be inherited by the
> child.
>
> Would it be more correct to say instead that in 3.4 you did a separate
> native build step, precompiling pgen and _freeze_importlib for the
> native build host? Then you hoped that the child Make was _not_
> invoked in the cross-compilation stage and your precompiled
> executables would not be rebuilt?
>

Yes - as far as I can make out (with my admittedly hazy understanding),
that appears to be what is going on. Although it’s not that I “hoped” the
build wouldn’t happen on the second pass - it was the behavior that was
previously relied on, and that was altered.


> > As best as I can work out, the solution is to:
> >
> > (1) Include the parser generator and _freeze_importlib as part of the
> > artefacts of local platform. That way, you could use the version of pgen and
> > _freeze_importlib that was compiled as part of the local platform build. At
> > present, pgen and _freeze_importlib are used during the build process, but
> > aren’t preserved at the end of the build; or
>
> I don’t understand. After I run Make, it looks like I get working
> executables leftover at Programs/_freeze_importlib and Parser/pgen. Do
> you mean to install these programs with “make install” or something?
>

Re: [Python-Dev] Bug in build system for cross-platform builds

2016-02-15 Thread Russell Keith-Magee
On Tue, Feb 16, 2016 at 5:22 AM, Martin Panter <vadmium...@gmail.com> wrote:

> On 15 February 2016 at 08:24, Russell Keith-Magee
> <russ...@keith-magee.com> wrote:
> > Hi all,
> >
> > I’ve been working on developing Python builds for mobile platforms, and I’m
> > looking for some help resolving a bug in Python’s build system.
> >
> > The problem affects cross-platform builds - builds where you are compiling
> > python for a CPU architecture other than the one on the machine that is
> > doing the compilation. This requirement stems from supporting mobile
> > platforms (iOS, Android etc) where you compile on your laptop, then ship the
> > compiled binary to the device.
> >
> > In the Python 3.5 dev cycle, Issue 22359 [1] was addressed, fixing parallel
> > builds. However, as a side effect, this patch broke (as far as I can tell)
> > *all* cross platform builds. This was reported in issue 22625 [2].
> >
> > Since that time, the problem has gotten slightly worse; the addition of
> > changeset 95566 [3] and 95854 [4] has cemented the problem. I’ve been able
> > to hack together a fix that enables me to get a set of binaries, but the
> > patch is essentially reverting 22359, and making some (very dubious)
> > assumptions about the order in which things are built.
> >
> > Autoconf et al aren’t my strong suit; I was hoping someone might be able to
> > help me resolve this issue.
>
> Would you mind answering my question in
> <https://bugs.python.org/issue22625#msg247652>? In particular, how did
> cross-compiling previously work before these changes. AFAIK Python
> builds a preliminary Python executable which is executed on the host
> to complete the final build. So how do you differentiate between host
> and target compilers etc?
>

In order to build for a host platform, you have to compile for a local
platform first - for example, to compile an iOS ARM64 binary, you have to
compile for OS X x86_64 first. This gives you a local platform version of
Python you can use when building the iOS version.

Early in the Makefile, the variable PYTHON_FOR_BUILD is set. This points at
the CPU-local version of Python that can be invoked, which is used for
module builds, and for compiling the standard library source code. This is
set by —host and —build flags to configure, plus the use of CC and LDFLAGS
environment variables to point at the compiler and libraries for the
platform you’re compiling for, and a PATH variable that provides the local
platform’s version of Python.
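Sketched as shell commands, the two-stage process described above might look like the following. The toolchain triples, compiler names, and SDK paths are illustrative placeholders, not the exact flags any particular project uses:

```shell
# Stage 1: build a native Python for the build machine.
./configure --prefix="$PWD/native-install"
make && make install

# Stage 2: configure for the target, pointing CC/LDFLAGS at the cross
# toolchain and PATH at the native Python built above.
# (Triples and paths below are illustrative placeholders.)
PATH="$PWD/native-install/bin:$PATH" \
CC="arm64-apple-ios-clang" \
LDFLAGS="-L/path/to/ios/sdk/usr/lib" \
./configure --build=x86_64-apple-darwin --host=arm64-apple-ios
make
```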

There are two places where special handling is required: the compilation
and execution of the parser generator, and _freeze_importlib. In both
cases, the tool needs to be compiled for the local platform, and then
executed. Historically (i.e., Py3.4 and earlier), this has been done by
spawning a child MAKE to compile the tool; this runs the compilation phase
with the local CPU environment, before returning to the master makefile and
executing the tool. By spawning the child MAKE, you get a “clean”
environment, so the tool is built natively. However, as I understand it, it
causes problems with parallel builds due to race conditions on build rules.
The change in Python3.5 simplified the rule so that child MAKE calls
weren’t used, but that means that pgen and _freeze_importlib are compiled
for ARM64, so they won’t run on the local platform.

As best as I can work out, the solution is to:

(1) Include the parser generator and _freeze_importlib as part of the
artefacts of local platform. That way, you could use the version of pgen
and _freeze_importlib that was compiled as part of the local platform
build. At present, pgen and _freeze_importlib are used during the build
process, but aren’t preserved at the end of the build; or

(2) Include some concept of the “local compiler” in the build process,
which can be used to compile pgen and _freeze_importlib; or

There might be other approaches that will work; as I said, build systems
aren’t my strength.
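As a purely hypothetical illustration of option (2): autoconf-based projects conventionally distinguish a build-machine compiler (often named CC_FOR_BUILD) from the target compiler. The rule below is a sketch of that convention, not CPython's actual Makefile, and tool.c / build_tool are made-up names:

```shell
# Hypothetical sketch: keep a second compiler for tools that must run
# on the build machine during a cross build.
CC=arm64-apple-ios-clang     # cross compiler, produces target code
CC_FOR_BUILD=cc              # native compiler, produces build-machine code

# A generator tool is compiled natively, then executed during the build:
$CC_FOR_BUILD -o build_tool tool.c
./build_tool > generated_source.c

# Target objects are compiled with the cross compiler:
$CC -c generated_source.c
```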

Yours,
Russ Magee %-)


[Python-Dev] Bug in build system for cross-platform builds

2016-02-15 Thread Russell Keith-Magee
Hi all,

I’ve been working on developing Python builds for mobile platforms, and I’m
looking for some help resolving a bug in Python’s build system.

The problem affects cross-platform builds - builds where you are compiling
python for a CPU architecture other than the one on the machine that is
doing the compilation. This requirement stems from supporting mobile
platforms (iOS, Android etc) where you compile on your laptop, then ship
the compiled binary to the device.

In the Python 3.5 dev cycle, Issue 22359 [1] was addressed, fixing parallel
builds. However, as a side effect, this patch broke (as far as I can tell)
*all* cross platform builds. This was reported in issue 22625 [2].

Since that time, the problem has gotten slightly worse; the addition of
changeset 95566 [3] and 95854 [4] has cemented the problem. I’ve been able
to hack together a fix that enables me to get a set of binaries, but the
patch is essentially reverting 22359, and making some (very dubious)
assumptions about the order in which things are built.

Autoconf et al aren’t my strong suit; I was hoping someone might be able to
help me resolve this issue.

Yours,
Russ Magee %-)

[1] http://bugs.python.org/issue22359
[2] http://bugs.python.org/issue22625
[3] https://hg.python.org/cpython/rev/565b96093ec8
[4] https://hg.python.org/cpython/rev/02e3bf65b2f8