Hi Jorge,

You can use a Spark UDF to write a function that extracts the year from an
epoch timestamp. Please refer to the blog post [1] for an example UDF
implementation.
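For reference, the core conversion such a UDF would wrap can be sketched in
plain Java (class and method names here are illustrative, not part of the DAS
UDF API; it assumes the epoch value is in milliseconds, matching the
`payload_fecha/1000` cast in your Hive script):

```java
import java.time.Instant;
import java.time.ZoneId;

public class YearFromEpoch {

    // Returns the calendar year (UTC) for an epoch value in milliseconds,
    // equivalent to Hive's
    // year(from_unixtime(cast(payload_fecha/1000 as BIGINT))).
    public static int yearFromEpochMillis(long epochMillis) {
        return Instant.ofEpochMilli(epochMillis)
                      .atZone(ZoneId.of("UTC"))
                      .getYear();
    }

    public static void main(String[] args) {
        // An epoch-millisecond value that falls in September 2015 (UTC).
        System.out.println(yearFromEpochMillis(1441929600000L)); // prints 2015
    }
}
```

Registering a method like this as a UDF would let you call it directly from
your Spark SQL query; see the blog post [1] for the registration steps.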

@Niranda, is there any other way to achieve the same?

[1]
http://thanu912.blogspot.com/2015/08/using-user-defined-function-udf-in.html

On Fri, Sep 11, 2015 at 4:07 PM, Jorge <[email protected]> wrote:

> the question in SO:
>
> http://stackoverflow.com/questions/32531169/extract-year-from-timestamp-with-spark-sql-in-wso2-das
>
> Jorge.
>
>
> 2015-09-10 12:32 GMT-04:00 Jorge <[email protected]>:
>
>> Hi folks,
>>
>> In my Hive scripts, when I want to extract the year from a timestamp, I
>> use this:
>>
>> year(from_unixtime(cast(payload_fecha/1000 as BIGINT),'yyyy-MM-dd
>> HH:mm:ss.SSS' )) as year
>>
>> Now I'm testing the new DAS snapshot and I want to do the same, but I
>> cannot use *from_unixtime*.
>> How can I achieve this in Spark SQL?
>>
>> Regards,
>>                Jorge.
>>
>
>
> _______________________________________________
> Dev mailing list
> [email protected]
> http://wso2.org/cgi-bin/mailman/listinfo/dev
>
>


-- 
Gimantha Bandara
Software Engineer
WSO2. Inc : http://wso2.com
Mobile : +94714961919
