Thanks, guys.
Unfortunately, neither approach is working:
sql("select paymentdate, unix_timestamp(paymentdate) from tmp").first
res28: org.apache.spark.sql.Row = [10/02/2014,null]
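
Presumably the dd/MM/yyyy pattern has to be supplied to unix_timestamp explicitly and the result converted back, something along these lines (not yet tested on my side):

sql("select paymentdate, to_date(from_unixtime(unix_timestamp(paymentdate, 'dd/MM/yyyy'))) from tmp").first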
Dr Mich Talebzadeh
LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
On 24 March 2016 at 14:23, Ajay Chander <[email protected]> wrote:
> Mich,
>
> Can you try changing the value of paymentdate to the
> format '2015-01-01 23:59:59' and then using to_date(paymentdate), and see
> if it helps?
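>
> For example, something like this (just a sketch, assuming the string is
> already in that format):
>
>   sql("select to_date('2015-01-01 23:59:59') from tmp").first
>
> to_date expects the yyyy-MM-dd form, which is why it returns null for
> dd/MM/yyyy strings.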
>
>
> On Thursday, March 24, 2016, Tamas Szuromi
> <[email protected]> wrote:
>
>> Hi Mich,
>>
>> Take a look at:
>> https://spark.apache.org/docs/1.6.1/api/java/org/apache/spark/sql/functions.html#unix_timestamp(org.apache.spark.sql.Column,%20java.lang.String)
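>>
>> For instance (untested sketch, assuming a DataFrame df with a string
>> column paymentdate):
>>
>>   import org.apache.spark.sql.functions._
>>   df.select(unix_timestamp(col("paymentdate"), "dd/MM/yyyy"))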
>>
>> cheers,
>> Tamas
>>
>>
>> On 24 March 2016 at 14:29, Mich Talebzadeh <[email protected]>
>> wrote:
>>
>>>
>>> Hi,
>>>
>>> I am trying to convert a date in a Spark temporary table.
>>>
>>> I have tried a few approaches.
>>>
>>> scala> sql("select paymentdate, to_date(paymentdate) from tmp")
>>> res21: org.apache.spark.sql.DataFrame = [paymentdate: string, _c1: date]
>>>
>>>
>>> scala> sql("select paymentdate, to_date(paymentdate) from tmp").first
>>> res22: org.apache.spark.sql.Row = [10/02/2014,null]
>>>
>>> My date is stored as a String in dd/MM/yyyy format, as shown above.
>>> However, to_date() returns null!
>>>
>>>
>>> Thanks
>>>
>>>
>>> Dr Mich Talebzadeh
>>>
>>>
>>>
>>> LinkedIn:
>>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>
>>>
>>>
>>> http://talebzadehmich.wordpress.com
>>>
>>>
>>>
>>
>>