Hi Isuru,

Yes, it is not possible to query dates directly from spark-sql. I also had a requirement to get the current time when executing a query in Spark (similar to MySQL's NOW()) and to format a timestamp given in milliseconds into a date format.
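For parsing a date string like "Thu Sep 24 09:35:56 IST 2015" (the default java.util.Date.toString() format), the conversion logic inside such a UDF might look like the sketch below. The class and method names here are hypothetical, and the DAS-specific registration steps are the ones covered in the blog post; this only shows the parsing itself.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Locale;
import java.util.TimeZone;

public class DateUdf {

    // Input pattern matching java.util.Date.toString(),
    // e.g. "Thu Sep 24 09:35:56 IST 2015"
    private static final String IN_PATTERN = "EEE MMM dd HH:mm:ss zzz yyyy";

    // Output pattern that Spark SQL can treat as a timestamp string
    private static final String OUT_PATTERN = "yyyy-MM-dd HH:mm:ss";

    public static String toSqlTimestamp(String dateString) throws ParseException {
        SimpleDateFormat in = new SimpleDateFormat(IN_PATTERN, Locale.ENGLISH);
        SimpleDateFormat out = new SimpleDateFormat(OUT_PATTERN, Locale.ENGLISH);
        // Normalize the output to UTC so results don't depend on the
        // server's default time zone
        out.setTimeZone(TimeZone.getTimeZone("UTC"));
        return out.format(in.parse(dateString));
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(toSqlTimestamp("Thu Sep 24 09:35:56 IST 2015"));
    }
}
```

In plain Spark (outside DAS), a method like this could be wrapped in a UDF1<String, String> and registered with sqlContext.udf().register(...), so it can be called directly from spark-sql queries.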
You can write a User Defined Function (UDF) in spark-sql and use it in the DAS Spark environment as described in this blog [1].

[1] http://thanu912.blogspot.com/2015/08/using-user-defined-function-udf-in.html

Thanks.

On Thu, Sep 24, 2015 at 1:56 PM, Isuru Wijesinghe <[email protected]> wrote:

> Hi,
>
> I was trying to query process and task instance data using spark-sql,
> published from the BPMNDataPublisher to the DAS. Here I need to convert
> the date string (e.g. Thu Sep 24 09:35:56 IST 2015) into one of the
> datetime formats that Spark allows. I think it is not possible to query
> dates directly from spark-sql, so I'm thinking of writing a user-defined
> function for it.
>
> Is it possible to write a UDF in spark-SQL (like in Hive), and if so,
> how can I import it into DAS to run the script?
>
> (Please find attached an image of the sample data stored on the DAS
> side.)
>
> --
> Isuru Wijesinghe
> *Software Engineer*
> Mobile: 0710933706
> [email protected]
>
> _______________________________________________
> Dev mailing list
> [email protected]
> http://wso2.org/cgi-bin/mailman/listinfo/dev

--
Thanuja Uruththirakodeeswaran
Software Engineer
WSO2 Inc.; http://wso2.com
lean.enterprise.middleware
mobile: +94 774363167
