[ https://issues.apache.org/jira/browse/SPARK-11167?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14970304#comment-14970304 ]

Felix Cheung commented on SPARK-11167:
--------------------------------------

Shouldn't it try to infer the tightest type that could accommodate the 
heterogeneous values? This is the strategy used by the spark-csv package, for 
example; I'm sure there are others:

https://github.com/databricks/spark-csv/blob/master/src/main/scala/com/databricks/spark/csv/util/InferSchema.scala
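
For illustration, here is a minimal sketch in R of what such widening could 
look like over SparkR's inferred type strings (widen_type is hypothetical and 
not part of SparkR's API; spark-csv does the equivalent over Catalyst 
DataTypes):

{code}
# Hypothetical sketch: widen two inferred SparkR type strings to the
# tightest type that accommodates both values.
widen_type <- function(t1, t2) {
  if (identical(t1, t2)) return(t1)
  numeric_order <- c("integer", "double")
  if (t1 %in% numeric_order && t2 %in% numeric_order) {
    # e.g. integer + double -> double
    return(numeric_order[max(match(t1, numeric_order), match(t2, numeric_order))])
  }
  # No tighter common type: fall back to string
  "string"
}

Reduce(widen_type, c("integer", "double"))  # [1] "double"
Reduce(widen_type, c("double", "string"))   # [1] "string"
{code}

Folding this over every element's inferred type would yield "array<string>" 
for list(a=1, b="a") instead of "array<double>".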


> Incorrect type resolution on heterogeneous data structures
> ----------------------------------------------------------
>
>                 Key: SPARK-11167
>                 URL: https://issues.apache.org/jira/browse/SPARK-11167
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.6.0
>            Reporter: Maciej Szymkiewicz
>
> If a structure contains heterogeneous values, infer_type incorrectly assigns 
> the type of the first encountered element as the type of the whole structure. 
> This problem affects both lists:
> {code}
> SparkR:::infer_type(list(a=1, b="a"))
> ## [1] "array<double>"
> SparkR:::infer_type(list(a="a", b=1))
> ## [1] "array<string>"
> {code}
> and environments:
> {code}
> SparkR:::infer_type(as.environment(list(a=1, b="a")))
> ## [1] "map<string,double>"
> SparkR:::infer_type(as.environment(list(a="a", b=1)))
> ## [1] "map<string,string>"
> {code}
> This results in errors during data collection and other operations on 
> DataFrames:
> {code}
> ldf <- data.frame(row.names=1:2)
> ldf$foo <- list(list("1", 2), list(3, 4))
> sdf <- createDataFrame(sqlContext, ldf)
> collect(sdf)
> ## 15/10/17 17:58:57 ERROR Executor: Exception in task 0.0 in stage 9.0 (TID 9)
> ## scala.MatchError: 2.0 (of class java.lang.Double)
> ## ...
> {code}
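
A possible interim workaround (a sketch, assuming the inferred type is taken 
from the first element as shown above): coerce the heterogeneous values to a 
common type yourself before calling createDataFrame, so the data matches the 
inferred schema:

{code}
# Hedged workaround sketch: make all list elements character so the
# inferred type (array<string>) matches every value.
ldf <- data.frame(row.names = 1:2)
ldf$foo <- list(list("1", "2"), list("3", "4"))
sdf <- createDataFrame(sqlContext, ldf)
collect(sdf)  # no MatchError: values agree with the inferred element type
{code}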


