Hello,

In Spark 1.3.0 I was able to cast a timestamp to a long using

df.withColumn("millis", $"eventTime".cast("long") * 1000)

However, the same statement fails in Spark 1.3.1 with the following
exception:

Exception in thread "main" org.apache.spark.sql.types.DataTypeException:
Unsupported dataType: long. If you have a struct and a field name of it has
any special characters, please use backticks (`) to quote that field name,
e.g. `x+y`. Please note that backtick itself is not supported in a field
name.
        at org.apache.spark.sql.types.DataTypeParser$class.toDataType(DataTypeParser.scala:95)
        at org.apache.spark.sql.types.DataTypeParser$$anon$1.toDataType(DataTypeParser.scala:107)
        at org.apache.spark.sql.types.DataTypeParser$.apply(DataTypeParser.scala:111)
        at org.apache.spark.sql.Column.cast(Column.scala:636)

Was there a change in the casting logic between these versions that could
explain this failure?
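
As a possible workaround, I'm guessing that passing the DataType object
directly (or the SQL type name "bigint") would bypass the string parser
that is throwing here. A sketch of what I mean, untested on 1.3.1:

import org.apache.spark.sql.types.LongType

// cast via the DataType object instead of the "long" string
df.withColumn("millis", $"eventTime".cast(LongType) * 1000)

// or, assuming the 1.3.1 parser still accepts SQL type names:
df.withColumn("millis", $"eventTime".cast("bigint") * 1000)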

Thanks.

Justin
