Martin v. Löwis wrote:
>> For binary representations, we already have the struct module to handle
>> the parsing, but for byte sequences with embedded ASCII digits it's
>> reasonably common practice to use strings along with the respective type
>> constructors.
>
> Sure, but why can't you write
>
>     foo = int(bar[start:stop].decode("ascii"))
>
> then? Explicit is better than implicit.
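To make the quoted suggestion concrete, here is a minimal sketch of the explicit-decode idiom; `bar` and the slice bounds are illustrative placeholders, not values from the thread:

```python
# A bytes payload with ASCII digits embedded between binary framing.
bar = b"\x02\x00" + b"12345" + b"\xff\xff"
start, stop = 2, 7

# Decode the digit span explicitly before converting, so the ASCII
# assumption is visible at the call site rather than implicit in int().
foo = int(bar[start:stop].decode("ascii"))
print(foo)  # 12345
```

A non-ASCII byte in the slice raises `UnicodeDecodeError` at the `decode` step, which is the point: the encoding check happens explicitly.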
Yeah, this thread has convinced me that it would be better to start
rejecting bytes in int() and float() as well rather than implicitly
assuming an ASCII encoding.

If we decide the fast path for ASCII is still important (e.g. to solve
3.0's current speed problems in decimal), then it would be better to add
separate methods to int to expose the old 2.x str->int and int->str
optimisations (e.g. an int.from_ascii class method and an int.to_ascii
instance method).

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---------------------------------------------------------------
http://www.boredomandlaziness.org
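The proposed int.from_ascii / int.to_ascii methods do not exist; as a rough illustration of their intended semantics, they could behave like these hypothetical module-level helpers (names and behaviour are my assumptions, not part of the proposal text):

```python
def int_from_ascii(data: bytes) -> int:
    # Sketch of the proposed int.from_ascii class method: accept only
    # ASCII digit bytes, making the encoding assumption explicit.
    return int(data.decode("ascii"))

def int_to_ascii(value: int) -> bytes:
    # Sketch of the proposed int.to_ascii instance method: the inverse
    # direction, producing the ASCII digit representation as bytes.
    return str(value).encode("ascii")

print(int_from_ascii(b"12345"))  # 12345
print(int_to_ascii(12345))       # b'12345'
```

The real methods would presumably bypass the str round-trip entirely to recover the 2.x fast path; the helpers above only pin down the input/output contract.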