On Fri, Apr 4, 2014 at 3:38 AM, Mark H Harris <harrismh...@gmail.com> wrote:
>    'Useful' must always be taken in context, and also contextually evaluated
> with an on-going methodology which constantly informs 'usefulness' on a
> continuum. I admire and encourage the core devs, in their pursuit of
> excellence. Asking 'what is the practical use case?' is essential. Not
> always is the answer complete.
>    On the python unicode continuum, version (3) is more useful than
> version (2). ( this is of course relative and debatable, so the statement
> is rhetorical ) The commitment and dedicated effort to move forward with a
> unicode default is not only commendable, but also admits to the
> 'usefulness' of same. It's not that version 2 was useless; it's just that
> version 3 is so much more useful that people are 'using' it and dedicating
> their resources to moving forward with python3.
>    This is similar to the decimal module. Of course it had limited
> usefulness in versions (2) through 3.2, but now, in python 3.3+, the
> decimal module is truly useful! Why? Glad you asked... because it is now
> fast enough for use cases previously reserved for floats. I found limited
> usefulness for decimal prior to 3.3, but now we're finding decimal so
> useful that some of us are wanting decimal to be the default. ( all of
> this is also relative and debatable )
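
For reference, the 3.3 speedup described above comes from decimal gaining
a C implementation in CPython 3.3. Here's a minimal sketch to compare
float and Decimal arithmetic yourself (assuming Python 3.3+; the actual
timings will vary by machine and build):

    # Minimal sketch: compare float vs Decimal multiplication throughput.
    # Assumes Python 3.3+, where decimal is backed by the C _decimal module.
    import timeit

    float_time = timeit.timeit(
        "x * y", setup="x, y = 1.2, 3.4", number=1000000)
    decimal_time = timeit.timeit(
        "x * y",
        setup="from decimal import Decimal; "
              "x, y = Decimal('1.2'), Decimal('3.4')",
        number=1000000)
    print("float:   %.3fs" % float_time)
    print("Decimal: %.3fs" % decimal_time)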

So your definition of "useful" for the Decimal module is "fast", and
your definition of "useful" for Unicode is "mandated into use".
Neither of those is how any dictionary I know defines the word, and
you're not even consistent: you said Unicode became useful at 3.0,
which - AFAIK - didn't improve its performance at all, while 3.3 did
(PEP 393).
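
To make the PEP 393 point concrete: since 3.3, CPython stores each string
with 1, 2, or 4 bytes per code point depending on its contents. A quick
sketch to see it (assuming CPython 3.3+; exact byte counts vary a little
by version and platform):

    # Minimal sketch of PEP 393's flexible string representation.
    # CPython 3.3+ picks 1, 2, or 4 bytes per code point per string.
    import sys

    ascii_text  = "a" * 100             # Latin-1 range: 1 byte/code point
    bmp_text    = "\u0416" * 100        # Cyrillic, BMP: 2 bytes/code point
    astral_text = "\U0001F600" * 100    # emoji, astral: 4 bytes/code point

    for text in (ascii_text, bmp_text, astral_text):
        print(len(text), sys.getsizeof(text))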

Here's one definition: "able to be used for a practical purpose or in
several ways". It says nothing about performance. Python is
useful in that I am able to wield it to solve my problems. I don't
care that it's slower than C; in fact, a lot of the problems I solve
with Python are interactive, and run to completion faster than I can
notice them. If I use decimal.Decimal or fractions.Fraction in my
code, it is not because it's fast or slow or anything, it is because
that type matches what I want to do with it. Those types are useful to
me because there are situations in which they match my problem. While
I am interested in seeing a Decimal literal syntax in Python, and I
would support a shift to have "1.2" evaluate as a Decimal (but not
soon - it'd break backward compat *hugely*), I do not by any means
believe that Decimal will only become useful when it's the default
non-integer type in source code.
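
To illustrate that "the type matches my problem" point with nothing
beyond the standard library:

    # Minimal sketch: binary floats round decimal fractions; Decimal and
    # Fraction represent them exactly, so equality behaves as expected.
    from decimal import Decimal
    from fractions import Fraction

    print(0.1 + 0.2 == 0.3)                                      # False
    print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))     # True
    print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True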

ChrisA
