(re-sent from hopefully correct address)

On Aug 20, 2013, at 7:09, Anne van Kesteren <[email protected]> wrote:

> On Tue, Aug 20, 2013 at 12:30 AM, Ryosuke Niwa <[email protected]> wrote:
>> Can the specification be changed to use the number of composed character 
>> sequences instead of the code-unit length?
> 
> In a way I guess that's nice, but it also seems confusing that given
> 
> data:text/html,<input type=text maxlength=1>
> 
> pasting in U+0041 U+030A would give a string that's longer than 1 from
> JavaScript's perspective. I don't think there's any place in the
> platform where we measure string length other than by number of code
> units at the moment.
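To make the discrepancy Anne describes concrete, here is a sketch using Intl.Segmenter (an API added to the platform well after this thread, used here purely for illustration): the pasted string U+0041 U+030A is a single composed character sequence, but two UTF-16 code units, so its JavaScript length exceeds a maxlength of 1.

```javascript
// "A" followed by COMBINING RING ABOVE — renders as one "character" (Å)
// but occupies two UTF-16 code units.
const s = "\u0041\u030A";

console.log(s.length); // 2 — code units, which is what maxlength counts per spec

// Intl.Segmenter counts grapheme clusters (composed character sequences).
const segmenter = new Intl.Segmenter(undefined, { granularity: "grapheme" });
const graphemeCount = [...segmenter.segment(s)].length;
console.log(graphemeCount); // 1 — the count a user would expect
```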

FWIW, this is tracked for WebKit as 
<https://bugs.webkit.org/show_bug.cgi?id=120030>.

I agree with Darin's comment that the standard should weigh end-user 
concepts more heavily here. WebKit has had this more humane behavior for many 
years, so we know that it's compatible with the Web, and there is no need to 
chase the lowest common denominator.

Additionally, there are features in the platform that already work with 
Unicode grapheme clusters perfectly well, and I think these are closely 
connected to maxLength. Notably, editing functionality understands grapheme 
clusters: moving the caret right or left advances the selection by one 
"character", and so forth. Web sites frequently perform some editing on the 
text as you type it.
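The caret behavior described above can be sketched as follows. This is a hypothetical illustration, not how any engine is actually implemented: nextCaretPosition is an invented helper, and Intl.Segmenter (a much later API) stands in for the engine's internal grapheme-boundary logic.

```javascript
// Hypothetical grapheme-aware caret movement: given a caret offset in
// code units, return the offset after the next grapheme cluster, so the
// caret never lands inside a composed character sequence.
function nextCaretPosition(text, caret) {
  const segmenter = new Intl.Segmenter(undefined, { granularity: "grapheme" });
  for (const { index, segment } of segmenter.segment(text)) {
    if (index >= caret) return index + segment.length;
  }
  return text.length;
}

const text = "A\u030AB"; // "Å" (two code units) followed by "B"
console.log(nextCaretPosition(text, 0)); // 2 — skips the whole cluster, not just "A"
console.log(nextCaretPosition(text, 2)); // 3 — advances past "B"
```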

- WBR, Alexey Proskuryakov
