On Mon, Aug 10, 2015 at 9:58 PM, Thomas 'PointedEars' Lahn <[email protected]> wrote:
> Chris Angelico wrote:
>
>> There's another thing you absolutely have to know when you do web
>> development, and that's i18n. This is why I don't recommend Node.js
>> for server-side work - because Python's Unicode support is better than
>> JS's.
>
> Your posting shows again that your knowledge of "JS" is full of gaps at
> best, so you should refrain from making bold statements about "it" and
> making design decisions and recommendations based on that.
Do please enlighten me! Tell me how Node changes the underlying ECMAScript specification and gives true Unicode support. In particular, I would expect the length of a string to be based on either code points or combining character sequences, and indexing (including slicing) to be based on the same thing. It should not be possible to split a character during iteration over a string. (Whether you iterate over "é" as a single character or as two (U+0065 U+0301) is a matter of debate, and I'd accept both answers as correct.)
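For concreteness, here's a quick Python 3 sketch of the behaviour I'm describing (a sketch assuming CPython 3.3+, where strings are sequences of code points). Length, indexing and iteration all work on code points, and the "é" question comes down to normalization rather than to the string type:

import unicodedata

snake = "\U0001F40D"        # U+1F40D SNAKE, outside the BMP
print(len(snake))           # 1 -- one code point (JS's .length reports 2 UTF-16 units)
print(list(snake))          # ['🐍'] -- iteration yields whole code points, never half a surrogate pair
print(snake[0] == snake)    # True -- indexing and slicing can't split the character

composed = "\u00e9"         # U+00E9 LATIN SMALL LETTER E WITH ACUTE
decomposed = "e\u0301"      # U+0065 + U+0301 COMBINING ACUTE ACCENT
print(len(composed), len(decomposed))                        # 1 2
print(unicodedata.normalize("NFC", decomposed) == composed)  # True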
ChrisA
--
https://mail.python.org/mailman/listinfo/python-list