Hi All,

I want to create a CFString using the function CFStringCreateWithBytes.

CFStringRef CFStringCreateWithBytes (
   CFAllocatorRef alloc,
   const UInt8 *bytes,
   CFIndex numBytes,
   CFStringEncoding encoding,
   Boolean isExternalRepresentation
);

I suspect the "encoding" parameter refers to the encoding of the source string.

My source buffer containing the string can be encoded as UTF-16LE or UTF-16BE.
I don't want a BOM in the resulting CFString, and the source buffer does not
contain one either.

So, I would create it like this:

CFStringRef str = CFStringCreateWithBytes(NULL, buffer, length,
                                          encoding, false);

where "encoding" corresponds to either UTF-16LE or UTF-16-BE of the source. 
That is, parameter encoding equals either kCFStringEncodingUTF16LE or 
kCFStringEncodingUTF16BE respectively.
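
To make that concrete, suppose the buffer holds "Hi" as BOM-less UTF-16LE
(the byte values and names below are just illustrative):

#include <CoreFoundation/CoreFoundation.h>

int main(void)
{
    /* Illustrative only: "Hi" as two UTF-16LE code units, no BOM. */
    const UInt8 utf16leBytes[] = { 0x48, 0x00, 0x69, 0x00 };

    CFStringRef str = CFStringCreateWithBytes(kCFAllocatorDefault,
                                              utf16leBytes,
                                              sizeof(utf16leBytes),
                                              kCFStringEncodingUTF16LE,
                                              false /* bytes carry no BOM */);
    if (str != NULL) {
        CFShow(str);      /* prints the string for a quick sanity check */
        CFRelease(str);
    }
    return 0;
}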

The documentation does not say which source encoding lets the CFString be
initialized most efficiently. I would guess this is UTF-16LE on Intel
machines.

So, the question is: which one would be the most efficient, and how can I
figure this out at compile time or at runtime?
(I know how to determine the endianness of the machine.)
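
For reference, the runtime selection I have in mind looks like this (a
sketch; it assumes, unverified, that a host-order buffer is the cheapest
for CFString to ingest):

#include <CoreFoundation/CoreFoundation.h>

/* Pick the UTF-16 variant matching the host byte order at runtime. */
static CFStringEncoding HostOrderUTF16Encoding(void)
{
    switch (CFByteOrderGetCurrent()) {
        case CFByteOrderLittleEndian:
            return kCFStringEncodingUTF16LE;
        case CFByteOrderBigEndian:
            return kCFStringEncodingUTF16BE;
        default:
            /* Byte order not detectable; fall back to the unsuffixed constant. */
            return kCFStringEncodingUTF16;
    }
}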

And what happens if I just specify kCFStringEncodingUTF16? Is the source
encoding then assumed to be in host endianness, or UTF-16BE as the Unicode
Standard suggests?
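
I suppose I could check that empirically with something like this (an
untested sketch: it stores U+0041 'A' in host byte order with no BOM and
looks at how it comes back):

#include <CoreFoundation/CoreFoundation.h>
#include <stdio.h>

int main(void)
{
    UniChar hostOrderA = 0x0041;   /* 'A', stored in host byte order, no BOM */

    CFStringRef str = CFStringCreateWithBytes(NULL,
                                              (const UInt8 *)&hostOrderA,
                                              sizeof(hostOrderA),
                                              kCFStringEncodingUTF16,
                                              false);
    if (str != NULL) {
        /* U+0041 back means BOM-less UTF16 was read in host order;
           U+4100 (on a little-endian machine) means it was read as big-endian. */
        printf("decoded: U+%04X\n", (unsigned)CFStringGetCharacterAtIndex(str, 0));
        CFRelease(str);
    }
    return 0;
}

But maybe someone here already knows the answer.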

Thanks for help!


Regards
Andreas
 

