Hello,

I've run into a problem with float type coercion in SparkR with a hiveContext.

> result <- sql(hiveContext, "SELECT offset, percentage from data limit 100")

> show(result)
DataFrame[offset:float, percentage:float]

> head(result)
Error in as.data.frame.default(x[[i]], optional = TRUE) :
    cannot coerce class "jobj" to a data.frame
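
A possible workaround, if the problem is specific to FloatType deserialization, is to cast the float columns to double inside the query itself so SparkR only ever sees DoubleType columns. This is a sketch, untested against your setup, and it assumes your table and column names from the query above:

```r
# Hypothetical workaround: cast float columns to double in SQL so the
# SparkR side never has to deserialize FloatType values.
result <- sql(hiveContext,
  "SELECT CAST(offset AS double) AS offset,
          CAST(percentage AS double) AS percentage
   FROM data LIMIT 100")
head(result)
```

If head() succeeds on the cast columns but fails on the raw float ones, that would narrow the problem down to float handling rather than the query itself.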


This looks similar to an existing issue (SPARK-2863, "Emulate Hive type
coercion in native reimplementations of Hive functions"), and possibly
has the same root cause: the native reimplementation of Hive behavior is
incomplete, and not only for the functions themselves.

Has anybody run into this issue before? And how can I test it more
precisely, in case it isn't actually a bug?
