Re: spark sql current time stamp function ?

2015-12-07 Thread Ted Yu
BTW, I forgot to mention that this was added through SPARK-11736, which went
into the upcoming 1.6.0 release.

FYI

On Mon, Dec 7, 2015 at 12:53 PM, Ted Yu  wrote:



Re: spark sql current time stamp function ?

2015-12-07 Thread Ted Yu
scala> val test=sqlContext.sql("select monotonically_increasing_id() from
t").show
+---+
|_c0|
+---+
|  0|
|  1|
|  2|
+---+
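As an aside, the auto-generated `_c0` header in the output above can be avoided by aliasing the column. A small sketch of just the query text (table `t` is hypothetical; in spark-shell it would be run with `sqlContext.sql(query).show()`):

```scala
// Aliasing gives the generated 64-bit id a readable column name
// instead of the auto-assigned _c0. Table `t` is hypothetical.
val query = "select monotonically_increasing_id() as id from t"
```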

Cheers

On Mon, Dec 7, 2015 at 12:48 PM, sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:



Re: spark sql current time stamp function ?

2015-12-07 Thread sri hari kali charan Tummala
Hi Ted,

It gave an exception; am I following the right approach?

val test=sqlContext.sql("select *,  monotonicallyIncreasingId()  from kali")


On Mon, Dec 7, 2015 at 4:52 PM, Ted Yu  wrote:



-- 
Thanks & Regards
Sri Tummala


Re: spark sql current time stamp function ?

2015-12-07 Thread Ted Yu
Have you tried using monotonicallyIncreasingId ?

Cheers

On Mon, Dec 7, 2015 at 7:56 AM, Sri  wrote:



RE: spark sql current time stamp function ?

2015-12-07 Thread Mich Talebzadeh
Or try this:

cast(from_unixtime(unix_timestamp()) AS timestamp)

HTH
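As a complete statement (table `t` is hypothetical), the expression can be sketched as a query string to hand to `sqlContext.sql(...)` in spark-shell:

```scala
// unix_timestamp() returns epoch seconds; from_unixtime() formats it as a
// string; the cast turns that string into a proper timestamp column.
// Table `t` is hypothetical; run with sqlContext.sql(query).show().
val query =
  "select cast(from_unixtime(unix_timestamp()) as timestamp) as ts from t"
```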

Mich Talebzadeh

http://talebzadehmich.wordpress.com

From: Ted Yu [mailto:yuzhih...@gmail.com] 
Sent: 07 December 2015 15:49
To: kali.tumm...@gmail.com
Cc: user 
Subject: Re: spark sql current time stamp function ?

Does unix_timestamp() satisfy your needs ?

See sql/core/src/test/scala/org/apache/spark/sql/DateFunctionsSuite.scala

On Mon, Dec 7, 2015 at 6:54 AM, kali.tumm...@gmail.com wrote:




Re: spark sql current time stamp function ?

2015-12-07 Thread Sri
Thanks, I found the right function: current_timestamp().

A different question: is there a row_number() function in Spark SQL? Not in
the DataFrame API, just in Spark SQL?
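For what it's worth, row_number() is available to Spark SQL as a Hive-style window function in the 1.x line (it needs a HiveContext). A minimal sketch; the table `kali` and its columns are hypothetical, and building the query string itself needs no Spark:

```scala
// row_number() needs an OVER clause with an ORDER BY; the PARTITION BY is
// optional. Table `kali` and columns `col1`, `col2` are hypothetical.
val query =
  """select col1, col2,
    |       row_number() over (partition by col1 order by col2) as rn
    |from kali""".stripMargin

// In spark-shell (1.x, HiveContext): sqlContext.sql(query).show()
```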


Thanks
Sri

Sent from my iPhone

> On 7 Dec 2015, at 15:49, Ted Yu  wrote:


Re: spark sql current time stamp function ?

2015-12-07 Thread Ted Yu
Does unix_timestamp() satisfy your needs ?
See sql/core/src/test/scala/org/apache/spark/sql/DateFunctionsSuite.scala

On Mon, Dec 7, 2015 at 6:54 AM, kali.tumm...@gmail.com <
kali.tumm...@gmail.com> wrote:



Re: spark sql current time stamp function ?

2015-12-07 Thread kali.tumm...@gmail.com
I found a way out.

import java.text.SimpleDateFormat
import java.util.Date

val format = new SimpleDateFormat("yyyy-M-dd hh:mm:ss")

val testsql = sqlContext.sql(
  "select column1,column2,column3,column4,column5, '%s' as TIME_STAMP from TestTable limit 10"
    .format(format.format(new Date())))
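Note that this workaround bakes a single driver-side timestamp string into the query text, so every row gets the same literal value. A minimal sketch of just the string-building step (no Spark needed for this part; `sqlContext.sql(testsql)` would then run it in spark-shell):

```scala
import java.text.SimpleDateFormat
import java.util.Date

// One timestamp, formatted once on the driver...
val format = new SimpleDateFormat("yyyy-M-dd hh:mm:ss")
val ts = format.format(new Date())

// ...spliced into the SQL text as a literal, so every row carries this value.
val testsql =
  "select column1, column2, '%s' as TIME_STAMP from TestTable limit 10".format(ts)
```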


Thanks
Sri 



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-sql-current-time-stamp-function-tp25620p25621.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org