Jonathan S. Shapiro wrote:
> On Tue, Mar 9, 2010 at 6:04 PM, Aleksi Nurmi <[email protected]> wrote:
>
>> 2010/3/10 Jonathan S. Shapiro <[email protected]>:
>>
>>> Do people think that is a sensible position?
>>>
>> Honestly, I don't see a lot of arguments in favor of the 16-bit char,
>> there. :-) There's the interop thing, and well... a 16-bit char has no
>> other use: it doesn't represent anything meaningful, it's just a
>> uint16. To satisfy interop requirements, adding a separate type for
>> 16-bit code units seems by far the most sensible thing to do, and I
>> don't see any real downsides. Interoperation between BitC and CTS
>> isn't going to be straightforward in any case.
>>
> Actually, that was my initial reaction, but it does have the
> consequence that it pushes me into rebuilding the text library early.
> That's something we need to do, but it would be nice to do it
> incrementally.
Not sure if this matters, but there's at least one magic property of
[MSCorlib]System.String which I think also applies to the JVM's String:
string literals (which have type System.String) are guaranteed to be
interned by the runtime, and so can be compared via eq (and the
String.Intern() method is also mildly but similarly magic).
It seems to me that interoperability is a compelling reason to use the
runtime-provided strings, appropriately wrapped and tamed. Otherwise
you'll end up allocating and copying strings all over the place at the
BitC <--> {CLI, JVM} interface.
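To make the cost concrete, here's a rough Java sketch of the two options
at the boundary (the class names are hypothetical and purely
illustrative, not anything from the actual BitC runtime):

    // Option 1: wrap the runtime's String; crossing the boundary costs
    // only a reference copy, and interning/eq behavior is preserved.
    final class WrappedString {
        final String underlying;              // the runtime-provided string
        WrappedString(String s) { underlying = s; }
    }

    // Option 2: a BitC-native representation that copies the code units,
    // so every crossing allocates and copies (in both directions).
    final class CopiedString {
        final char[] units;                   // fresh allocation per conversion
        CopiedString(String s) { units = s.toCharArray(); }
        String toRuntimeString() { return new String(units); }  // copies again
    }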