I'm betting this is caused by the Objective-C runtime. Under the 32-bit
runtime, Objective-C's BOOL is a typedef for signed char, so YES and NO
are really just the one-byte integers 1 and 0. Under the 64-bit runtime,
BOOL is instead a typedef for C99's _Bool, a genuine boolean type that
the runtime can tell apart from the integer types. This behavior, and
some of the problems it can cause, is documented here:
http://arigrant.com/blog/2014/1/31/a-tiny-bit-of-objective-c-bools

The 32-bit ObjC runtime (and therefore JavaScriptCore when embedded in a
32-bit host) is presumably unable to distinguish boolean values from
numbers, and so treats ObjC YES and NO as the numbers 1 and 0 when
converting ObjC objects to JS values. On 64-bit runtimes the distinction
can be made, so JSCore correctly converts YES and NO to JavaScript true
and false instead.
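
For example, assuming a bridged ObjC object named device with a
BOOL-typed property named enabled (both names made up for illustration),
the difference should be visible straight from a ClojureScript REPL:

    (goog/typeOf (.-enabled device))
    ;; => "number"  on a 32-bit host (YES arrives as the JS number 1)
    ;; => "boolean" on a 64-bit host (YES arrives as JS true)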

Seems like explicitly treating 0 as falsey when reading boolean fields from 
ObjC objects is about all you can do.
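
A minimal sketch of such a helper, in ClojureScript (the name objc-bool,
and the device/enabled example again, are mine, not from any library):

    (defn objc-bool
      "Coerce a value read from a bridged ObjC object to a real boolean:
      0, false, and nil count as false; anything else counts as true."
      [v]
      (boolean (and v (not= 0 v))))

    (objc-bool (.-enabled device))
    ;; => true under both runtimes, whether YES arrived as 1 or as true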
