Hi Yin

It works well with HiveContext.
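For anyone hitting the same error, a minimal sketch of the workaround (Spark 1.3-era API; `sc` is an existing SparkContext and `my_table` is an illustrative table name, not from the thread):

```scala
import org.apache.spark.sql.hive.HiveContext

// HiveContext uses the HiveQL parser, which understands BIGINT;
// the SQLContext parser in Spark 1.3 does not (see SPARK-6146).
val hiveContext = new HiveContext(sc)

// The CAST that fails under SQLContext parses fine here.
val df = hiveContext.sql("SELECT CAST(column AS BIGINT) AS column_big FROM my_table")
```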

Thanks!!!

Regards.
Miguel Angel.



On Fri, Mar 13, 2015 at 3:18 PM, Yin Huai <yh...@databricks.com> wrote:

> Are you using SQLContext? Right now, the parser in the SQLContext is quite
> limited on the data type keywords that it handles (see here
> <https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SqlParser.scala#L391>)
> and unfortunately "bigint" is not handled in it right now. We will add
> other data types in there (
> https://issues.apache.org/jira/browse/SPARK-6146 is used to track it).
> Can you try HiveContext for now?
>
> On Fri, Mar 13, 2015 at 4:48 AM, Masf <masfwo...@gmail.com> wrote:
>
>> Hi.
>>
>> I have a query in Spark SQL and I cannot convert a value to BIGINT:
>> CAST(column AS BIGINT) or
>> CAST(0 AS BIGINT)
>>
>> The output is:
>> Exception in thread "main" java.lang.RuntimeException: [34.62] failure:
>> ``DECIMAL'' expected but identifier BIGINT found
>>
>> Thanks!!
>> Regards.
>> Miguel Ángel
>>
>
>


-- 


Regards.
Miguel Ángel
