Re: May I ask a question about SparkSql

2016-04-08 Thread Kasinathan, Prabhu
Check this one: https://github.com/Intel-bigdata/spark-streamingsql. We tried
it and it was working with Spark 1.3.1. You can do ETL on a Spark Streaming
context using Spark SQL.
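
If you want to stay on stock Spark instead, a common pattern is to run Spark SQL
inside foreachRDD on each micro-batch. Below is a rough sketch only, assuming
Spark 1.5+ (for SQLContext.getOrCreate) and a socket source chosen purely for
illustration:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("streaming-sql-sketch")
val ssc = new StreamingContext(conf, Seconds(10))
// Illustrative source; substitute your real input stream
val lines = ssc.socketTextStream("localhost", 9999)

lines.foreachRDD { rdd =>
  // One SQLContext per JVM, reused across micro-batches
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._
  // Register the batch as a temp table and query it with SQL
  rdd.map(Tuple1(_)).toDF("line").registerTempTable("lines")
  sqlContext.sql("select count(*) as cnt from lines").show()
}

ssc.start()
ssc.awaitTermination()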

Thanks
Prabhu

From: Hustjackie
Reply-To: "hustjac...@sina.cn"
Date: Friday, April 8, 2016 at 1:58 PM
To: user
Subject: May I ask a question about SparkSql


Hi all,



I have several jobs running with Spark Streaming, but I would prefer to run some
SQL to do the same things.

Does Spark SQL support real-time jobs? In other words, does Spark support
streaming SQL?


Thanks in advance; any help is appreciated.



Jacky


Re: Converting a string of format of 'dd/MM/yyyy' in Spark sql

2016-03-24 Thread Kasinathan, Prabhu
Can you try this one?

spark-sql> select paymentdate,
TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(paymentdate,'MM/dd/yyyy'),'yyyy-MM-dd'))
from tmp;
10/02/2014 2014-10-02
spark-sql>
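
For reference, here is the same conversion from the Scala shell using the
DataFrame functions rather than the spark-sql CLI. It is only a sketch against
the thread's tmp table; use 'dd/MM/yyyy' as the input pattern if the strings are
day-first, as in the subject line:

import org.apache.spark.sql.functions.{col, from_unixtime, to_date, unix_timestamp}

// Assumes a registered temp table "tmp" with a string column "paymentdate",
// e.g. '10/02/2014'; swap "MM/dd/yyyy" for "dd/MM/yyyy" for day-first data.
val converted = sqlContext.table("tmp").select(
  col("paymentdate"),
  to_date(from_unixtime(unix_timestamp(col("paymentdate"), "MM/dd/yyyy"))).as("payment_dt"))

converted.show()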


From: Tamas Szuromi
Date: Thursday, March 24, 2016 at 9:35 AM
To: Mich Talebzadeh
Cc: Ajay Chander, Tamas Szuromi, "user @spark"
Subject: Re: Converting a string of format of 'dd/MM/yyyy' in Spark sql

Actually, you should run sql("select paymentdate, unix_timestamp(paymentdate,
'dd/MM/yyyy') from tmp").first


But keep in mind you will get a unix timestamp!
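
If you want a readable date back rather than epoch seconds, you can wrap it in
from_unixtime, for example (a sketch against the same tmp table):

sql("select paymentdate, from_unixtime(unix_timestamp(paymentdate, 'dd/MM/yyyy'), 'yyyy-MM-dd') from tmp").first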


On 24 March 2016 at 17:29, Mich Talebzadeh wrote:
Thanks guys.

Unfortunately neither is working

 sql("select paymentdate, unix_timestamp(paymentdate) from tmp").first
res28: org.apache.spark.sql.Row = [10/02/2014,null]



Dr Mich Talebzadeh



LinkedIn  
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 24 March 2016 at 14:23, Ajay Chander wrote:
Mich,

Can you try changing the value of paymentdate to this format,
paymentdate='2015-01-01 23:59:59', then to_date(paymentdate), and see if it helps?
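
As a quick sanity check that to_date handles an ISO-style string (the literal
below is just an example value), something like this should return only the date
portion, 2015-01-01:

sql("select to_date('2015-01-01 23:59:59')").first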


On Thursday, March 24, 2016, Tamas Szuromi wrote:
Hi Mich,

Take a look 
https://spark.apache.org/docs/1.6.1/api/java/org/apache/spark/sql/functions.html#unix_timestamp(org.apache.spark.sql.Column,%20java.lang.String)

cheers,
Tamas


On 24 March 2016 at 14:29, Mich Talebzadeh wrote:

Hi,

I am trying to convert a date in a Spark temporary table.

I tried a few approaches.

scala> sql("select paymentdate, to_date(paymentdate) from tmp")
res21: org.apache.spark.sql.DataFrame = [paymentdate: string, _c1: date]


scala> sql("select paymentdate, to_date(paymentdate) from tmp").first
res22: org.apache.spark.sql.Row = [10/02/2014,null]


My date is stored as a String in dd/MM/yyyy format as shown above. However,
to_date() returns null!


Thanks


Dr Mich Talebzadeh



LinkedIn  
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com