Hi Abhishek,

I have tried that as well, but rdd2 is still empty.
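
For reference, a minimal spark-shell sketch of the intended computation (assuming the SparkContext `sc` provided by the shell; rdd1 and rdd2 are only illustrative names):

  // build a pair RDD with the sample data from the original mail
  val rdd1 = sc.parallelize(Array((0, 1), (0, 2), (1, 20), (1, 30), (2, 40)))

  // reduceByKey is a transformation: it returns a new RDD, it does not modify rdd1
  val rdd2 = rdd1.reduceByKey((x, y) => x + y)

  // collect is an action that triggers the computation
  rdd2.collect().sortBy(_._1)   // expected: Array((0,3), (1,50), (2,40))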

Regards,
Satish

On Fri, Aug 21, 2015 at 6:47 PM, Abhishek R. Singh <
abhis...@tetrationanalytics.com> wrote:

> You had:
>
> > RDD.reduceByKey((x,y) => x+y)
> > RDD.take(3)
>
> Maybe try:
>
> > rdd2 = RDD.reduceByKey((x,y) => x+y)
> > rdd2.take(3)
>
> -Abhishek-
>
> On Aug 20, 2015, at 3:05 AM, satish chandra j <jsatishchan...@gmail.com>
> wrote:
>
> > Hi All,
> > I have data in an RDD as shown below:
> >
> > RDD: Array[(Int, Int)] = Array((0,1), (0,2), (1,20), (1,30), (2,40))
> >
> >
> > I am expecting the output Array((0,3),(1,50),(2,40)), i.e. just a sum of the
> > values for each key.
> >
> > Code:
> > RDD.reduceByKey((x,y) => x+y)
> > RDD.take(3)
> >
> > Result in console:
> > RDD: org.apache.spark.rdd.RDD[(Int,Int)]= ShuffledRDD[1] at reduceByKey
> at <console>:73
> > res:Array[(Int,Int)] = Array()
> >
> > The command used to run the script:
> >
> > dse spark --master local --jars postgresql-9.4-1201.jar -i  <ScriptFile>
> >
> >
> > Please let me know what is missing in my code, as the resulting Array is
> > empty.
> >
> >
> >
> > Regards,
> > Satish
> >
>
>
