Thanks everyone. Appreciated.
sql("select paymentdate,
TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(paymentdate,'MM/dd/yyyy'),'yyyy-MM-dd'))
from tmp").first
res45: org.apache.spark.sql.Row = [10/02/2014,2014-10-02]
Cracking a nut with a sledgehammer :)
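For reference, the same conversion through the DataFrame API (a minimal
sketch, assuming a DataFrame df sitting behind the tmp table with
paymentdate as a string column; df and the payment_dt name are illustrative,
the pattern is the one from the query above):

import org.apache.spark.sql.functions.{col, from_unixtime, to_date, unix_timestamp}

// parse the string with an explicit pattern, then turn the epoch seconds
// back into a DateType column
val converted = df.withColumn("payment_dt",
  to_date(from_unixtime(unix_timestamp(col("paymentdate"), "MM/dd/yyyy"))))
converted.select("paymentdate", "payment_dt").first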
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
On 24 March 2016 at 17:03, Kasinathan, Prabhu <[email protected]>
wrote:
> Can you try this one?
>
> spark-sql> select paymentdate,
> TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(paymentdate,'MM/dd/yyyy'),'yyyy-MM-dd'))
> from tmp;
> 10/02/2014 2014-10-02
> spark-sql>
>
>
> From: Tamas Szuromi <[email protected]>
> Date: Thursday, March 24, 2016 at 9:35 AM
> To: Mich Talebzadeh <[email protected]>
> Cc: Ajay Chander <[email protected]>, Tamas Szuromi <
> [email protected]>, "user @spark" <[email protected]>
> Subject: Re: Converting a string of format of 'dd/MM/yyyy' in Spark sql
>
> Actually, you should run sql("select paymentdate,
> unix_timestamp(paymentdate, 'dd/MM/yyyy') from tmp").first
>
>
> But keep in mind you will get a unix timestamp!
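> For example, to get a date back rather than epoch seconds, one option (a
> sketch only, reusing the same tmp table and column) is to wrap it further:
>
> sql("select paymentdate, to_date(from_unixtime(unix_timestamp(paymentdate, 'dd/MM/yyyy'))) from tmp").first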
>
>
> On 24 March 2016 at 17:29, Mich Talebzadeh <[email protected]>
> wrote:
>
>> Thanks guys.
>>
>> Unfortunately, neither is working:
>>
>> sql("select paymentdate, unix_timestamp(paymentdate) from tmp").first
>> res28: org.apache.spark.sql.Row = [10/02/2014,null]
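>> Without an explicit pattern, unix_timestamp() assumes the default
>> 'yyyy-MM-dd HH:mm:ss' layout, which '10/02/2014' does not match, hence the
>> null. A sketch with the pattern supplied instead:
>>
>> sql("select paymentdate, unix_timestamp(paymentdate, 'dd/MM/yyyy') from tmp").first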
>>
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>> On 24 March 2016 at 14:23, Ajay Chander <[email protected]> wrote:
>>
>>> Mich,
>>>
>>> Can you try changing the value of paymentdate to this
>>> format, paymentdate='2015-01-01 23:59:59', then use to_date(paymentdate) and
>>> see if it helps.
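>>> For instance, a quick check that to_date handles that layout (a sketch
>>> against the same tmp table, just to illustrate the format):
>>>
>>> sql("select to_date('2015-01-01 23:59:59') from tmp").first
>>> // expected: [2015-01-01]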
>>>
>>>
>>> On Thursday, March 24, 2016, Tamas Szuromi <
>>> [email protected]> wrote:
>>>
>>>> Hi Mich,
>>>>
>>>> Take a look at
>>>> https://spark.apache.org/docs/1.6.1/api/java/org/apache/spark/sql/functions.html#unix_timestamp(org.apache.spark.sql.Column,%20java.lang.String)
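>>>>
>>>> For instance, via that Column-based overload (a sketch, assuming a
>>>> DataFrame df behind the tmp table):
>>>>
>>>> import org.apache.spark.sql.functions.{col, unix_timestamp}
>>>> df.select(col("paymentdate"), unix_timestamp(col("paymentdate"), "dd/MM/yyyy")).first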
>>>>
>>>> cheers,
>>>> Tamas
>>>>
>>>>
>>>> On 24 March 2016 at 14:29, Mich Talebzadeh <[email protected]>
>>>> wrote:
>>>>
>>>>>
>>>>> Hi,
>>>>>
>>>>> I am trying to convert a date in a Spark temporary table.
>>>>>
>>>>> Tried a few approaches.
>>>>>
>>>>> scala> sql("select paymentdate, to_date(paymentdate) from tmp")
>>>>> res21: org.apache.spark.sql.DataFrame = [paymentdate: string, _c1:
>>>>> date]
>>>>>
>>>>>
>>>>> scala> sql("select paymentdate, to_date(paymentdate) from tmp").first
>>>>> res22: org.apache.spark.sql.Row = [10/02/2014,null]
>>>>>
>>>>> My date is stored as a String in dd/MM/yyyy format, as shown above.
>>>>> However, to_date() returns null!
>>>>>
>>>>>
>>>>> Thanks
>>>>>
>>>>>
>>>>> Dr Mich Talebzadeh
>>>>>
>>>>>
>>>>>
>>>>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>>>
>>>>>
>>>>>
>>>>> http://talebzadehmich.wordpress.com
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>
>