On Saturday, August 16, 2014 3:56:45 PM UTC-4, Mike Fikes wrote:
> I am accessing a Boolean field on a JavaScript object and if the host is 
> 64-bit JavaScriptCore I will get back either a true or false ClojureScript 
> value. If it is instead a 32-bit JavaScriptCore host, I will get back either 
> a 0 or 1 ClojureScript value.
> 
> The context of all of this is JavaScriptCore embedded in an iOS app, but I'm 
> hoping that perhaps someone with ClojureScript driving Safari may have 
> encountered this. Alternatively, perhaps I've uncovered a bug somewhere in 
> the stack.
> 
> More specifics of the environment where I'm encountering this:
> 
> The JavaScript object is actually an Objective-C object injected into the 
> JavaScriptCore execution context, where the Objective-C object has an 
> exported BOOL property:
> 
>   @protocol MyObject <JSExport>
>   @property BOOL foo;
>   @end
> 
>   @interface MyObject : NSObject<MyObject>
>   ...
>   @end
> 
> I access this property's value using the (.-foo my-object) syntax, and if I 
> print the value in the REPL I will see true or false if the iOS app is 
> running on a 64-bit device or in the 64-bit simulator, and I will see 0 or 1 
> if the iOS app is running on a 32-bit device or in the 32-bit simulator.
> 
> Trying to dig deeper, I can see the following when in the 64-bit host:
> 
>   (type (.-foo my-object))
>   #<function Boolean() {
>       [native code]
>   }>
>   => nil
> 
> and the following when in the 32-bit host:
> 
>   (type (.-foo my-object))
>   #<function Number() {
>       [native code]
>   }>
>   => nil
> 
> Of course, without fixing the underlying issue, I must handle the return 
> values specially, as 0 is not falsey.
> 
> My hope is that someone will say, "Ahh yeah, we've seen that with 32-bit 
> Safari, and here's what we did to handle it." Otherwise, I'm gonna dig deeper.
> 
> Thanks!

I'm betting this is caused by the Objective-C runtime. Under the 32-bit 
runtime, Objective-C's BOOL is a typedef for signed char, so it is really just 
a one-byte integer. Under the 64-bit runtime, BOOL is instead a typedef for 
C's _Bool (bool) – a genuine boolean type rather than a character type. This 
behavior, and some of the problems it can cause, is documented here: 
http://arigrant.com/blog/2014/1/31/a-tiny-bit-of-objective-c-bools

The 32-bit ObjC runtime (and, therefore, JavaScriptCore when used with a 32-bit 
host) is probably unable to distinguish boolean values from numbers, and 
therefore treats ObjC YES and NO as 1 and 0 respectively when converting ObjC 
objects to JS ones. On 64-bit runtimes, a distinction can be made, so JSCore 
correctly converts ObjC YES and NO to JavaScript true and false instead.

Seems like explicitly treating 0 as falsey when reading boolean fields from 
ObjC objects is about all you can do.
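Until the underlying bridging changes, a small coercion helper at the interop
boundary is probably the cleanest way to do that. This is just a sketch, and
jsc-bool is a made-up name:

```clojure
(defn jsc-bool
  "Coerce a value read from a JSExport BOOL property into a real
  ClojureScript boolean. Handles both the 64-bit host (true/false)
  and the 32-bit host (1/0). The explicit check is needed because
  0 is truthy in ClojureScript."
  [v]
  (if (number? v)
    (not (zero? v))
    (boolean v)))

;; (jsc-bool (.-foo my-object)) yields true or false on either host
```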

--- 
You received this message because you are subscribed to the Google Groups 
"ClojureScript" group.
Visit this group at http://groups.google.com/group/clojurescript.
