Thomas Powell created SPARK-17608:

             Summary: Long type has incorrect serialization/deserialization
                 Key: SPARK-17608
             Project: Spark
          Issue Type: Bug
          Components: SparkR
    Affects Versions: 2.0.0
            Reporter: Thomas Powell

I am hitting issues when using {{dapply}} on a data frame that contains a 
{{bigint}} in its schema. When such a data frame is converted to a SparkR data 
frame, a {{bigint}} column gets converted to the R {{numeric}} type.

However, when the data is serialized back, the R {{numeric}} type gets 
converted to a {{double}}.

The two directions therefore aren't compatible. If I use the same schema with 
{{dapply}} (and just an identity function), I get type collisions because the 
output type is a {{double}} but the schema expects a {{bigint}}.
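A minimal reproduction sketch of the round trip described above (this is an assumed example, not code from the original report; the column name {{x}} and the use of {{sparkR.session()}} are illustrative, following the Spark 2.0 SparkR API):

{code}
# Assumes a running Spark installation; sparkR.session() starts the session.
library(SparkR)
sparkR.session()

schema <- structType(structField("x", "bigint"))
df <- createDataFrame(data.frame(x = 1), schema)

# Collecting shows the bigint column arriving in R as numeric (a double):
str(collect(df))

# An identity dapply with the same schema should be a no-op, but the R
# numeric is serialized back as a double, which collides with the
# declared bigint output type.
collect(dapply(df, function(d) { d }, schema))
{code}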

This message was sent by Atlassian JIRA
