Have you tried using monotonicallyIncreasingId?
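For example, a rough sketch against the TestTable from the thread below
(assuming Spark 1.4+; on 1.5.x the window-function form needs a HiveContext):

import org.apache.spark.sql.functions.monotonicallyIncreasingId

// Adds a unique, monotonically increasing (but not consecutive) Long per row,
// without a shuffle.
val withId = sqlContext.table("TestTable")
  .withColumn("row_id", monotonicallyIncreasingId())

// row_number() itself is also exposed to plain Spark SQL as a window function:
val numbered = sqlContext.sql(
  "select column1, row_number() over (order by column1) as rn from TestTable")

monotonicallyIncreasingId gives unique but gapped ids cheaply; row_number()
gives consecutive numbers at the cost of sorting within the window.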

Cheers

On Mon, Dec 7, 2015 at 7:56 AM, Sri <kali.tumm...@gmail.com> wrote:

> Thanks, I found the right function: current_timestamp().
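> For reference, a minimal sketch of using it in the query itself (assuming
> Spark 1.5+, where current_timestamp() is available as a SQL function):
>
> val testsql = sqlContext.sql(
>   "select column1, column2, column3, column4, column5, " +
>     "current_timestamp() as TIME_STAMP from TestTable limit 10")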
>
> A different question: is there a row_number() function in plain Spark SQL
> (i.e. usable in a SQL statement), not just in the DataFrame API?
>
>
> Thanks
> Sri
>
> Sent from my iPhone
>
> On 7 Dec 2015, at 15:49, Ted Yu <yuzhih...@gmail.com> wrote:
>
> Does unix_timestamp() satisfy your needs?
> See sql/core/src/test/scala/org/apache/spark/sql/DateFunctionsSuite.scala
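>
> For example (a small sketch, assuming Spark 1.5+, not taken from that suite):
>
> // unix_timestamp() with no arguments gives the current time as seconds since
> // the epoch; from_unixtime() renders it as a formatted string.
> sqlContext.sql(
>   "select column1, unix_timestamp() as ts_seconds, " +
>     "from_unixtime(unix_timestamp(), 'yyyy-MM-dd HH:mm:ss') as ts_string " +
>     "from TestTable limit 10")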
>
> On Mon, Dec 7, 2015 at 6:54 AM, kali.tumm...@gmail.com <
> kali.tumm...@gmail.com> wrote:
>
>> I found a way out.
>>
>> import java.text.SimpleDateFormat
>> import java.util.Date
>>
>> // Format the driver's current time; MM/HH give a zero-padded month and a
>> // 24-hour clock.
>> val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
>>
>> // Bake the formatted timestamp into the query string as a literal column.
>> val testsql = sqlContext.sql(
>>   "select column1,column2,column3,column4,column5, '%s' as TIME_STAMP from TestTable limit 10"
>>     .format(format.format(new Date())))
>>
>>
>> Thanks
>> Sri
>>
