On 10/18/2011 02:41 PM Armin Rigo wrote:
> Hi,
>
> On Tue, Oct 18, 2011 at 14:19, Stefan Behnel <stefan...@behnel.de> wrote:
>> The other situation is where PyPy does its own thing and supports some
>> NumPy code that happens to run faster than in CPython, while other code
>> does not work at all, with the possibility to replace it in a PyPy
>> specific way.
>
> I think you are disregarding what 8 years of the PyPy project should
> have made obvious. Yes, some code will not run at all on PyPy at
> first, and that amount of code is going to be reduced over time. But
> what we want is to avoid a community split, so we are never, ever,
> going to add and advertise PyPy-only ways to write programs.
>
> A bientôt,
> Armin.

Just the same, I think PyPy could be allowed to have an "import that" ;-)

I think I read somewhere that PyPy's ambitions were not just to serve official Python with a fast implementation, but possibly to support other language development too. Is that still true?

If one wanted to use PyPy's great infrastructure to implement a new language, what would be the pypythonic bootstrapping path towards a self-hosting new language?
BTW, would self-hosting be seen as a disruptive forking goal?

If the new language were just a tweak on Python, what would be the attitude towards starting with a source-to-source preprocessor followed by invoking PyPy on the result?

Just as an example (without looking at grammar problems for the moment), what if I wanted to transform just assignments, such that x := expr meant x = expr as usual, except that it produced source mods telling PyPy that it could subsequently assume x had an unchanging type (which it might be able to infer anyway in most cases, but it would also make programmer intent human-perceptible)? In a similar vein, x ::= expr could mean that x's value (and type) would be frozen after the first evaluation.
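
To make the intent concrete, here is a rough sketch of that preprocessing step (my illustrative code only; transform_line and the "pypy-hint" comment convention are invented names, not anything PyPy recognizes):

    import re

    # matches "name := expr" or "name ::= expr"; "::=" must be tried before ":="
    ASSIGN = re.compile(r"^(\s*)(\w+)\s*(::=|:=)\s*(.+)$")

    def transform_line(line):
        m = ASSIGN.match(line)
        if not m:
            return line
        indent, name, op, expr = m.groups()
        intent = "frozen" if op == "::=" else "type-stable"
        # lower to plain Python, carrying the intent where a later pass could see it
        return "%s%s = %s  # pypy-hint: %s %s" % (indent, name, expr, intent, name)

So transform_line("x := f(1)") would yield "x = f(1)  # pypy-hint: type-stable x".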

This raises the question of how I would best tell PyPy this meta-information about x with legal manual edits now. Assertions?
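
For example, an assertion is at least a legal edit today, whatever the JIT makes of it (a sketch; whether PyPy exploits it beyond its normal type guards is exactly my question):

    def hot_loop(items):
        total = 0
        for x in items:
            # legal manual edit now: records (and checks) the type intent;
            # the tracing JIT already guards on observed types, so this may
            # be documentation more than directive
            assert isinstance(x, int)
            total += x
        return total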

Could you see PyPy accepting command-line options with information about specific variables, functions, modules, etc.? (Such options could of course be collected in a file referenced from a command-line option.) It is then a short step to create some linkage between the source files and the meta-data files, e.g. by file name extensions, the way .py and .pyc name the same file. Maybe .pym? If so, PyPy could look for .pym the way it looks for .pyc at the appropriate time.
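
A .pym file could stay trivially simple. A hypothetical reader, with the file format, names and hints all invented here for illustration:

    import os

    def load_meta(py_path):
        """Return a {name: hint} dict from the matching .pym file, if any.

        A foo.pym next to foo.py might hold lines like
            x: type-stable
            PI: frozen
        """
        meta_path = os.path.splitext(py_path)[0] + ".pym"
        hints = {}
        if os.path.exists(meta_path):
            with open(meta_path) as f:
                for line in f:
                    line = line.strip()
                    if line and not line.startswith("#"):
                        name, _, hint = line.partition(":")
                        hints[name.strip()] = hint.strip()
        return hints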

Going further, what about preprocessing with a statement decorator, using e.g.

    @@decomodule.fun
    statement # with suite

to mean that the preprocessor, at read time, should import decomodule in a special environment and pass the statement (with its suite) to decomodule.fun for source transformation, with the returned source substituted in its place? BTW, during the course of preprocessing, the "special environment" for importing statement-decorating modules could persist, and state from early decorator calls could affect subsequent ones.
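
To sketch the mechanics of that (illustrative code only, not a worked-out proposal; preprocess and _DECO_ENV are my names):

    import importlib

    _DECO_ENV = {}  # the persistent "special environment" for decorator modules

    def preprocess(lines):
        """Expand @@module.func statement decorators in a list of source lines."""
        out, i = [], 0
        while i < len(lines):
            line = lines[i]
            if line.lstrip().startswith("@@"):
                modname, _, funcname = line.strip()[2:].partition(".")
                if modname not in _DECO_ENV:
                    _DECO_ENV[modname] = importlib.import_module(modname)
                indent = len(line) - len(line.lstrip())
                # collect the decorated statement plus its deeper-indented suite
                # (assumes a statement actually follows the decorator line)
                j = i + 1
                block = [lines[j]]
                j += 1
                while j < len(lines) and (not lines[j].strip() or
                        len(lines[j]) - len(lines[j].lstrip()) > indent):
                    block.append(lines[j])
                    j += 1
                # the decorator function maps source text to replacement source
                new_src = getattr(_DECO_ENV[modname], funcname)("".join(block))
                out.extend(new_src.splitlines(True))
                i = j
            else:
                out.append(line)
                i += 1
        return out

Because _DECO_ENV lives across calls, state left behind by an early decorator (say, attributes set on its module) is visible to later ones, which is the persistence I mean above.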

Well, this would be Python with macros in a sense, so it would presumably be too disruptive to be considered for any pythonic Python. OTOH, it might be an interesting path for some variations to converge under one sort-of-pythonic (given the Python decorator as precedent) source-transformation markup methodology. (I'm just looking for reaction to the general idea, not specific syntax problems.)

Regards,
Bengt Richter