>>>> .map(x => (x(0), x(2)))
>>>> .map { case (key, value) =>
>>>>   (key, value.toArray.toSeq.sliding(2, 1).map(x => x.sum / x.size)) }
>>>> .foreach(println)
>>>
>>> On Sun, Jul 31, 2016 at 12:03 AM, sri hari kali charan Tummala
>>> wrote:
>>>
>>> Hi All,
>>>
>>> I managed to write using sliding function but can it get key as well in
>>> the output?
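[Editor's note: the per-key sliding average above can be sanity-checked with plain Scala collections, no Spark needed; the key/balance sample pairs below are invented for illustration:]

```scala
// Invented sample rows: (key, balance) pairs, as if parsed from a "~"-delimited file.
val rows = Seq(("987", 75.0), ("987", -25.0), ("987", 50.0),
               ("123", 10.0), ("123", 30.0))

// Group balances per key, then average each sliding window of two values,
// mirroring sliding(2, 1) from the snippet above.
val movingAvgByKey: Map[String, Seq[Double]] =
  rows.groupBy(_._1).map { case (key, kvs) =>
    key -> kvs.map(_._2).sliding(2, 1).map(w => w.sum / w.size).toSeq
  }

movingAvgByKey.foreach(println)
```

On an RDD, one common route to the same shape is `groupByKey` followed by the per-key sliding, with the caveat that `groupByKey` pulls all of a key's values onto one node.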
>> .map(x => x.split("\\~"))
>> .map(x => (x(2).toDouble)).toArray().sliding(2,1).map(x =>
>> (x,x.size)).foreach(println)
>>
>> at the moment my output:-
>>
>> 75.0
>> -25.0
>> 50.0
>> -50.0
>> -100.0
>>
>> I want with key how to get moving average output based on key ?
>>
>> 987,75.0
>> 987,
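[Editor's note: the snippet above pairs each window with its size rather than averaging it; averaging each sliding pair over the sample balances looks like this in plain Scala, independent of Spark:]

```scala
// The five balances from the output above, in order.
val balances = Seq(75.0, -25.0, 50.0, -50.0, -100.0)

// sliding(2, 1): windows of two consecutive values, advancing one step at a time,
// i.e. (75,-25), (-25,50), (50,-50), (-50,-100).
val movingAvg = balances.sliding(2, 1).map(w => w.sum / w.size).toSeq

println(movingAvg)  // List(25.0, 12.5, 0.0, -75.0)
```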
> On Sat, Jul 30, 2016 at 11:40 AM, sri hari kali charan Tummala
> wrote:
>
> for knowledge just wondering how to write it up in scala or spark RDD.
>
> Thanks
> Sri
>
> On Sat, Jul 30, 2016 at 11:24 AM, Jacek Laskowski
> wrote:
>
> Why?
>
> Pozdrawiam,
> Jacek Laskowski
>
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
> On Sat, Jul 30, 2016 at 4:42 AM, kali.tumm...@gmail.com
> wrote:
>
> Hi All,
>
> I managed to write the business requirement in spark-sql and hive; I am
> still learning scala. How would this below sql be written using a spark
> RDD, not spark data frames?
>
> SELECT DATE,balance,
> SUM(balance) OVER (ORDER BY DATE ROWS BETWEEN UNBOUNDED PRECEDING AND
> CURRENT ROW) daily_balance
> FROM table
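[Editor's note: the running total this window function computes is a prefix sum, which plain Scala can express with scanLeft; the dates and balances below are made-up sample data, assumed already sorted by date as the ORDER BY requires:]

```scala
// Hypothetical (date, balance) rows, pre-sorted by date.
val table = Seq(("2016-07-01", 75.0), ("2016-07-02", -25.0), ("2016-07-03", 50.0))

// scanLeft(0.0)(_ + _) emits every intermediate sum; drop the 0.0 seed so
// each row lines up with its own running total.
val running = table.map(_._2).scanLeft(0.0)(_ + _).tail
val dailyBalance = table.zip(running).map { case ((date, bal), total) => (date, bal, total) }

dailyBalance.foreach(println)
// (2016-07-01,75.0,75.0)
// (2016-07-02,-25.0,50.0)
// (2016-07-03,50.0,100.0)
```

On an RDD the equivalent is usually a `sortBy` followed by a pass accumulating within each partition, which is why the DataFrame window API the thread set aside is the more natural fit.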
>
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/sql-to-spark-scala-rdd-tp27433.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

To unsubscribe e-mail: user