Hi Jorge and Gimantha,
UDFs are the way to go. Please go through the blog post Gimantha has
posted.
We will be updating the docs on UDFs in DAS soon.
Regards
On Sat, Sep 12, 2015 at 10:23 AM, Gimantha Bandara
wrote:
> Hi Jorge,
>
> You can use Spark UDFs to write a function
The question on SO:
http://stackoverflow.com/questions/32531169/extract-year-from-timestamp-with-spark-sql-in-wso2-das
Jorge.
2015-09-10 12:32 GMT-04:00 Jorge:
> Hi folks,
>
> In my hive scripts if I want to extract the year from timestamp I used
> this:
Hi Jorge,
You can use Spark UDFs to write a function to extract the year from epoch
time. Please refer to the blog post [1] for an example implementation of UDFs.
@Niranda, is there any other way to achieve the same?
[1]
http://thanu912.blogspot.com/2015/08/using-user-defined-function-udf-in.html
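Since the DAS docs on UDFs are still pending, here is a minimal sketch of what the body of such a UDF could look like in Java. The class name `TimeUDF`, the method name `extractYear`, and the choice of UTC are my assumptions, not anything from the blog post; registering the class with DAS should follow the steps described there.

```java
import java.time.Instant;
import java.time.ZoneOffset;

// Hypothetical UDF class: the names TimeUDF/extractYear and the UTC time
// zone are assumptions for illustration. The class still needs to be
// registered with DAS as described in the blog post above.
public class TimeUDF {

    // Takes an epoch timestamp in milliseconds (e.g. payload_fecha) and
    // returns the calendar year, evaluated in UTC.
    public Integer extractYear(Long epochMillis) {
        return Instant.ofEpochMilli(epochMillis)
                      .atZone(ZoneOffset.UTC)
                      .getYear();
    }
}
```

Once registered, it could presumably be called from a Spark SQL query along the lines of `SELECT extractYear(payload_fecha) AS year FROM ...` (table elided; the exact query shape is an assumption).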
On 2015-09-10, Jorge wrote:
Hi folks,
In my Hive scripts, when I wanted to extract the year from a timestamp I used this:

year(from_unixtime(cast(payload_fecha/1000 as BIGINT), 'yyyy-MM-dd
HH:mm:ss.SSS')) as year

Now I am testing the new DAS snapshot and I want to do the same, but I cannot
use *from_unixtime*.
So how can I do the same?