Hi folks,

In my Hive scripts, when I want to extract the year from a timestamp, I use this:

year(from_unixtime(cast(payload_fecha/1000 as BIGINT), 'yyyy-MM-dd HH:mm:ss.SSS')) as year

Now I am testing the new DAS snapshot and I want to do the same, but I cannot
use *from_unixtime*.
How can I do the same thing in Spark SQL?
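For reference, here is a minimal sketch in plain Python of what the Hive expression computes, assuming payload_fecha holds epoch milliseconds (as the /1000 in the original suggests). The function name is hypothetical, just for illustration:

```python
from datetime import datetime, timezone

def year_from_epoch_millis(payload_fecha):
    """Extract the year from an epoch-milliseconds value, mirroring
    year(from_unixtime(cast(payload_fecha/1000 as BIGINT), ...)) in Hive.
    Uses UTC; Hive's from_unixtime applies the session time zone instead."""
    return datetime.fromtimestamp(payload_fecha / 1000, tz=timezone.utc).year

# 1420070400000 ms is 2015-01-01 00:00:00 UTC
print(year_from_epoch_millis(1420070400000))  # → 2015
```

Whatever Spark SQL function ends up being available in the DAS build, it should reproduce this division-by-1000 plus year-extraction logic.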

Regards,
               Jorge.
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev
