On Sun, Jun 29, 2008 at 12:36 PM, Antoine Pitrou <[EMAIL PROTECTED]> wrote:
>
> Indeed. On the other hand it already works properly for ints and floats,
> so perhaps Decimal shouldn't refuse unicode digits like it currently
> does:

Maybe.  The IBM standard doesn't seem to say whether other Unicode
digits should be accepted or not.
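For reference, here's the behaviour Antoine is describing (a quick
interactive check; output shown as of a recent CPython 3.x):

    >>> int('\u0661\u0662\u0663')    # ARABIC-INDIC digits one, two, three
    123
    >>> float('\u0661\u0662\u0663')
    123.0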

Is there a quick way to convert a general Unicode digit to its
ASCII equivalent?  Having to run str(int(c)) on each numeric character
sounds painful, and the Decimal constructor doesn't need to get
any slower than it already is.
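One way to avoid going through str(int(c)) is unicodedata.decimal(),
which maps any character carrying a Unicode decimal value straight to
the corresponding int.  A sketch (the helper name to_ascii_digits is
made up here; characters without a decimal value, such as the sign and
decimal point, are simply passed through):

    import unicodedata

    def to_ascii_digits(s):
        # Replace each character that has a Unicode decimal value with
        # the matching ASCII digit; leave everything else alone.
        out = []
        for c in s:
            d = unicodedata.decimal(c, None)
            out.append(str(d) if d is not None else c)
        return ''.join(out)

    >>> to_ascii_digits('\u0661\u0662\u0663.\u0665')
    '123.5'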

In any case, this potential problem with decimal has now been
identified and is easy to deal with.  I'm more worried, perhaps
needlessly, about what other unidentified problems might be
lurking deep in the standard library.  Any use of '\d', '\w', '\s',
etc. could be a problem.
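To make the '\d' concern concrete: in Python 3 the re module matches
Unicode categories by default, so '\d' matches any character in
category Nd, not just 0-9.  A quick check (re.ASCII, which restricts
matching to ASCII, is the opt-out):

    >>> import re
    >>> bool(re.match(r'\d+', '\u0661\u0662\u0663'))             # Unicode digits match
    True
    >>> bool(re.match(r'\d+', '\u0661\u0662\u0663', re.ASCII))   # ASCII-only mode
    False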

Mark