On Mon, Dec 20, 2004 at 03:59:42AM -0800, [EMAIL PROTECTED] wrote:
> This seems to go wrong with some character encodings. I think this is
> probably because I receive the data as a byte array and the conversion
> to a string is not intelligent enough. For example if the input data is
> UTF-8, I suspect I am converting each byte of a multi-byte character
> sequence into a UTF-16 character.

Yes, nsIScriptableInputStream sucks like that...

Note that nsIScriptableInputStream gives you what it merely calls a
string, which is a problem if the data contains a null byte (e.g.
UTF-16), because the string will be truncated there...

> So is there a scriptable way of converting a byte array in a known
> encoding to a correctly-decoded Javascript string? Or is there another
> way I can register as a listener to get Unicode data?

http://lxr.mozilla.org/seamonkey/source/intl/uconv/idl/nsIScriptableUConv.idl

Note that the byte array methods got added after Mozilla 1.7.x; they
are only in 1.8a5. Since nsIScriptableInputStream.read gives you a
string though, that should be ok(-ish) unless you use UTF-16, UCS-2 or
UTF-32.
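
To make this concrete, here's a rough sketch (chrome-privileged JS,
assuming you already hold an nsIInputStream in a variable I'm calling
"stream") of running nsIScriptableInputStream.read's output through
nsIScriptableUnicodeConverter to get a correctly decoded JS string.
This only works for encodings without embedded null bytes, e.g. UTF-8,
for the truncation reason above:

    // Sketch, not tested: read raw bytes (one JS char per byte)...
    var sis = Components.classes["@mozilla.org/scriptableinputstream;1"]
                  .createInstance(Components.interfaces.nsIScriptableInputStream);
    sis.init(stream);                         // "stream" is your nsIInputStream
    var rawBytes = sis.read(sis.available()); // byte-string, not real Unicode

    // ...then decode them with the charset you know the data is in.
    var conv = Components.classes["@mozilla.org/intl/scriptableunicodeconverter"]
                   .createInstance(Components.interfaces.nsIScriptableUnicodeConverter);
    conv.charset = "UTF-8";
    var text = conv.ConvertToUnicode(rawBytes); // proper JS Unicode string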

If you want to read a byte array instead of a string, you can use
nsIBinaryInputStream.readByteArray.
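
Something like this (again a chrome-JS sketch with an assumed "stream"
variable, and it needs the 1.8a5 byte-array methods mentioned above)
sidesteps the null-byte truncation entirely, so it works even for
UTF-16/UCS-2/UTF-32:

    // Sketch, not tested: read raw octets without going through a string...
    var bis = Components.classes["@mozilla.org/binaryinputstream;1"]
                  .createInstance(Components.interfaces.nsIBinaryInputStream);
    bis.setInputStream(stream);                  // "stream" is your nsIInputStream
    var bytes = bis.readByteArray(bis.available()); // array of octets

    // ...then decode the byte array directly (Mozilla 1.8a5+ only).
    var conv = Components.classes["@mozilla.org/intl/scriptableunicodeconverter"]
                   .createInstance(Components.interfaces.nsIScriptableUnicodeConverter);
    conv.charset = "UTF-16";
    var text = conv.convertFromByteArray(bytes, bytes.length);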

-biesi
-- 
_______________________________________________
Mozilla-netlib mailing list
[EMAIL PROTECTED]
http://mail.mozilla.org/listinfo/mozilla-netlib