Re: casting timestamp into long fails in Spark 1.3.1

2015-04-30 Thread Michael Armbrust
This looks like a bug.  Mind opening a JIRA?

On Thu, Apr 30, 2015 at 3:49 PM, Justin Yip wrote:

> After some trial and error, passing a DataType object instead of a string name solves the problem:
>
> df.withColumn("millis", $"eventTime".cast(org.apache.spark.sql.types.LongType) * 1000)
>
> Justin


Re: casting timestamp into long fails in Spark 1.3.1

2015-04-30 Thread Justin Yip
After some trial and error, passing a DataType object instead of a string name solves the problem:

df.withColumn("millis", $"eventTime".cast(org.apache.spark.sql.types.LongType) * 1000)

Justin
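
For context, here is a minimal end-to-end sketch of the workaround as it might look in spark-shell, where sc and sqlContext are predefined; the sample timestamp value and column name are illustrative, not from the original report:

import java.sql.Timestamp
import org.apache.spark.sql.types.LongType
import sqlContext.implicits._

// Illustrative one-row DataFrame with a TimestampType column named eventTime.
val df = sc.parallelize(Seq(Tuple1(new Timestamp(1430436060000L)))).toDF("eventTime")

// Passing the DataType object directly sidesteps the string type-name parser,
// which is what rejects "long" in 1.3.1 (see the stack trace below).
val withMillis = df.withColumn("millis", $"eventTime".cast(LongType) * 1000)
withMillis.show()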

On Thu, Apr 30, 2015 at 3:41 PM, Justin Yip wrote:

> Hello,
>
> In Spark 1.3.0, I was able to cast a timestamp into a long using:
>
> df.withColumn("millis", $"eventTime".cast("long") * 1000)
>
> However, the same statement fails in Spark 1.3.1 with the following exception:
>
> Exception in thread "main" org.apache.spark.sql.types.DataTypeException:
> Unsupported dataType: long. If you have a struct and a field name of it has
> any special characters, please use backticks (`) to quote that field name,
> e.g. `x+y`. Please note that backtick itself is not supported in a field
> name.
>
> at org.apache.spark.sql.types.DataTypeParser$class.toDataType(DataTypeParser.scala:95)
> at org.apache.spark.sql.types.DataTypeParser$$anon$1.toDataType(DataTypeParser.scala:107)
> at org.apache.spark.sql.types.DataTypeParser$.apply(DataTypeParser.scala:111)
> at org.apache.spark.sql.Column.cast(Column.scala:636)
>
> Was there a change in the casting logic between 1.3.0 and 1.3.1 that could explain this failure?
>
> Thanks.
>
> Justin
>


casting timestamp into long fails in Spark 1.3.1

2015-04-30 Thread Justin Yip
Hello,

In Spark 1.3.0, I was able to cast a timestamp into a long using:

df.withColumn("millis", $"eventTime".cast("long") * 1000)

However, the same statement fails in Spark 1.3.1 with the following exception:

Exception in thread "main" org.apache.spark.sql.types.DataTypeException:
Unsupported dataType: long. If you have a struct and a field name of it has
any special characters, please use backticks (`) to quote that field name,
e.g. `x+y`. Please note that backtick itself is not supported in a field
name.

at org.apache.spark.sql.types.DataTypeParser$class.toDataType(DataTypeParser.scala:95)
at org.apache.spark.sql.types.DataTypeParser$$anon$1.toDataType(DataTypeParser.scala:107)
at org.apache.spark.sql.types.DataTypeParser$.apply(DataTypeParser.scala:111)
at org.apache.spark.sql.Column.cast(Column.scala:636)

Was there a change in the casting logic between 1.3.0 and 1.3.1 that could explain this failure?

Thanks.

Justin
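
The stack trace points at DataTypeParser, which resolves string type names. A hedged aside, not verified against the 1.3.1 source: that parser appears to follow SQL naming conventions, so the SQL name "bigint" (which maps to LongType) may be accepted where the Scala-style name "long" is not:

// Unverified alternative: "bigint" is the SQL name for LongType and may be
// accepted by the 1.3.1 string parser even though "long" is rejected.
df.withColumn("millis", $"eventTime".cast("bigint") * 1000)

Either way, passing the DataType object (as in the workaround above) avoids the string parser entirely.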



