I filed a Radar, and Apple engineering confirmed the inherent limitation of
the 32-bit runtime:
“Objective-C internally treats BOOL as int, so we can't do anything nicer.”
On Saturday, August 16, 2014 3:56:45 PM UTC-4, Mike Fikes wrote:
I am accessing a Boolean field on a JavaScript object, and if the host is
64-bit JavaScriptCore I get back either a true or false ClojureScript
value. If it is instead a 32-bit JavaScriptCore host, I get back either a
0 or a 1.
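Not part of the original message, but the symptom above can be sketched in a few lines of plain JavaScript. `asBool` is a hypothetical helper, not something from ClojureScript or JavaScriptCore; it just shows one way to normalize a value that may arrive as a number (0/1) on 32-bit hosts or as a real boolean on 64-bit hosts:

```javascript
// On the 32-bit runtime, a BOOL-backed property can surface in JS as a
// number (0 or 1) rather than a true boolean. Double negation covers
// both cases uniformly.
function asBool(v) {
  return !!v; // 0 -> false, 1 -> true; true/false pass through unchanged
}

console.log(asBool(1), asBool(0), asBool(true)); // true false true
console.log(typeof 1, typeof true);              // "number" "boolean"
```

A caller that needs a genuine boolean can run every bridged value through a helper like this rather than trusting `typeof`.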
I'm betting this is caused by the Objective-C runtime. Under the 32-bit
runtime, Objective-C's BOOL is a typedef that desugars to unsigned
char. Under the 64-bit runtime, BOOL instead desugars to _Bool, a distinct
boolean type rather than a character typedef. This behavior, and some of
the problems it causes, are fairly well documented.
Thanks Max! The unsigned char / _Bool type distinction is probably the root
cause.
I did a little digging and found that exported BOOL properties are _supposed_
to be handled correctly. JSExport.h indicates
BOOL: values are converted consistently with valueWithBool/toBool.
where