On 12/13/19 12:27 AM, Steven D'Aprano wrote:
> On Wed, Dec 11, 2019 at 10:30:15PM -0500, Richard Damon wrote:
>
>> I will say that in my many years of
>> programming experience I can't think of any great cases where a language,
>> as part of the language definition, limited the 'size' of a program to
>> good effect,
> Good thing that's not what the PEP proposes then :-)
>
> The PEP talks about CPython implementation, not hard limits on all 
> Python compilers/interpreters.

Not the way I read the PEP.

It talks about changing the language.

Under 'Enforcement' it admits it can't 'force' other implementations, but
it says other implementations should generate the same errors unless doing
so hurts performance; so it is a language change, not just a set of limits
for a given implementation.

As I said originally, I have no problem with the idea of creating a
variant implementation of CPython with this sort of documented limits to
demonstrate that it does provide a real benefit. We would then be in a
good place to determine the real costs and benefits, and one of the
branches might just wither and die because it isn't useful enough anymore.

>
>> and generally such limits relegate a language into being
>> seen as a 'toy language'. 
> The designers of C recognised that, in practice, compilers will have 
> limits. Rather than demanding "NO ARBITRARY LIMITS!!!" they specified 
> *minimum* levels for compliant compilers. Often quite low limits.
>
> Actual compilers often impose their own limits:
>
> https://gcc.gnu.org/onlinedocs/cpp/Implementation-limits.html
>
> https://www.cs.auckland.ac.nz/references/unix/digital/AQTLTBTE/DOCU_012.HTM
Yes, as I said above, there is a big difference between an
implementation of a language documenting some of its limits and the
language definition itself limiting what the language can do.
>
> So if Mark's proposal relegates Python to a "toy language", we'll be in 
> good company with other "toys" that have implementation, or even 
> language, limits:
>
>
> https://stackoverflow.com/questions/5689798/why-does-java-limit-the-size-of-a-method-to-65535-byte

There is a big difference between limiting the size of a single method
and limiting the total number of classes allowed in a program. The first
can largely be worked around by refactoring the method into multiple
smaller methods; the latter can't be.
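To make the distinction concrete, here is a minimal sketch (the class
factory and the table count are made up for illustration) of why a cap on
the total number of classes can't be refactored away: a program that
builds classes dynamically, as ORMs and code generators do, scales its
class count with its input, not with how any one method is written.

    # Illustration only: the number of classes tracks the data, so no
    # amount of splitting methods changes the total.
    def make_record_class(table_name, column_names):
        """Build a simple record class for one table, as a small ORM might."""
        namespace = {name: None for name in column_names}
        return type(f"Record_{table_name}", (object,), namespace)

    # One class per table: with enough tables (or generated test cases,
    # plugins, protocol messages, ...) the program approaches any fixed cap.
    tables = {f"table_{i}": ("id", "value") for i in range(1000)}
    record_classes = {name: make_record_class(name, cols)
                      for name, cols in tables.items()}
    print(len(record_classes), "classes created")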

> https://web.archive.org/web/20160304023522/http://programmers.stackexchange.com/questions/108699/why-does-javas-collection-size-return-an-int
Yes, Java made a decision early in its life cycle to lock itself into
fixed-size types, and it is now living with the consequences of that choice.
> https://www.sqlite.org/limits.html
These are a very different type of limit. They are compile-time defines
that the programmer can change to establish what the various limits are.
They can be increased or decreased as desired by the programmer (with
natural upper limits based on the size of certain fundamental types).
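For instance, here is a sketch of how those limits look from Python. It
assumes the sqlite3 module's getlimit()/setlimit() wrappers, which only
exist in newer CPython releases; the hard ceilings themselves remain the
SQLITE_MAX_* compile-time defines.

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Read the current per-connection ceiling on SQL statement length.
    print("SQL length limit:", conn.getlimit(sqlite3.SQLITE_LIMIT_SQL_LENGTH))

    # Lower it for this connection; setlimit() returns the previous value
    # and will never raise a limit above the compile-time maximum.
    previous = conn.setlimit(sqlite3.SQLITE_LIMIT_SQL_LENGTH, 10_000)
    print("was", previous, "now",
          conn.getlimit(sqlite3.SQLITE_LIMIT_SQL_LENGTH))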
>
>
> (If you read only one of those five links, please read the last.)
>
>
>> The biggest issue is that computers are
>> growing more powerful every day, and programs follow in getting bigger,
>> so any limit that we think of as more than sufficient soon becomes too
>> small (No one will need more than 640k of RAM).
> The beauty of this proposal is that since it's an implementation limit, 
> not a law of nature, if and when computers get more powerful and 
> machines routinely have multiple zettabyte memories *wink* we can always 
> update the implementation.
>
> I'm not entirely being facetious here. There's a serious point. Unlike 
> languages like C and Java, where changes have to go through a long, 
> slow, difficult process, we have a much more agile process. If the PEP 
> is accepted, that doesn't mean we're locked into that decision for life. 
> Relaxing limits in the future doesn't break backwards compatibility.
>
> "What if computers get more powerful? Our limits will be obsolete!!!" 
> Naturally. Do you still expect to be using Python 3.9 in ten years time 
> with your fancy new uber-hyper-quantum ten thousand gigabyte computer? 
> Probably not. As hardware grows, and our needs grow, so can the 
> hypothetical limits.
>
> What is valuable are *concrete*, actual (not theoretical) examples of 
> where Mark's proposed limits are too low, so that we can get a realistic 
> view of where the potential tradeoffs lie:
>
> * lose this much functionality (code that did run, or might have run, 
>   but that won't run under the PEP)
>
> * in order to gain this much in safety, maintainability, efficiency.
>
>
> And before people jump down my throat again, I've already said -- on 
> multiple occasions -- that the onus is on Mark to demonstrate the 
> plausibility of any such gains.
>
>
> Thank you for reading :-)
>
As I said, the proposal as listed on Python.org is a language change,
not a proposal for an implementation that has a set of limits.

Doing the former really needs much more evidence to make it a
reasonable course of action. The latter, if really done as an
independent fork, probably doesn't even need a PEP: because
Python is open source, taking the existing code base, cloning it, and
making the changes is something anyone is allowed to do. It just won't
be the 'official CPython distribution'. It does mean that the proposer
needs to do (or find someone to do) the work.

What might make sense through a PEP would be to see if it would be
reasonable to make a common-code-base variant, which would work if
most of the code base can be left untouched and there are just isolated
chunks of code that need to vary (structure definitions and limited
setup code).

My first feeling is that it is enough of a change, with unproven
results, that it doesn't really make sense to commit to it for the main
release without doing a test version to see what gains can be achieved;
hence the idea of a 'second' version with the limits built in.


Maybe part of the issue is that many people think Python IS CPython and
CPython is Python, and thus a proposal to change CPython seems like a
change to the base Python specification. I would also
suggest that the presence of the 'Reference Implementation' section in
the PEP format suggests that what is expected for things like this is
that the proposer fork the current implementation, make at least a
sample of the proposed changes, and provide that as a demonstration of
the advantages of the proposal.

Perhaps one big disadvantage of how this PEP is written is that it
presumes we first add restrictions on the various limits starting
in Python 3.9, but they are phased in, so we can't actually depend on
them being present until 3.12 (several years later), which leaves us very
limited in what performance we can gain for years. Making it a fork means
we can apply the restrictions NOW: if a given program doesn't work under
the limits, it just needs to be run under the unlimited version of
Python, which will still be available. We can also make a direct
apples-to-apples comparison, for programs that do run, of the improvements
we gain from the limits.
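As a sketch of what such a comparison could look like (the
'python3-limited' executable name and the benchmark script are
hypothetical stand-ins for a build of such a fork and a workload that
runs under both interpreters):

    import subprocess
    import time

    INTERPRETERS = ["python3", "python3-limited"]   # stock vs. limited fork
    BENCHMARK = "my_benchmark.py"                   # any script both can run

    for exe in INTERPRETERS:
        start = time.perf_counter()
        result = subprocess.run([exe, BENCHMARK])
        elapsed = time.perf_counter() - start
        status = "ok" if result.returncode == 0 else f"failed ({result.returncode})"
        print(f"{exe}: {elapsed:.2f}s ({status})")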


-- 
Richard Damon