Re: Spark shell and StackOverFlowError

2015-09-01 Thread ponkin
Hi, I cannot reproduce your error on Spark 1.2.1. There is not enough information. What command-line arguments were you starting spark-shell with? What data are you reading? etc.
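Both of ponkin's questions can be partly answered from inside a running shell. A minimal sketch, assuming the Spark 1.x spark-shell where sc is the SparkContext the shell provides:

    // Print the Spark version and the configuration the shell was launched
    // with, including any --conf settings passed on the command line.
    println(sc.version)
    sc.getConf.getAll.foreach { case (k, v) => println(s"$k=$v") }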

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Ted Yu
… error happens when I try to pass a variable into the closure. The example you have …

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Ashish Shrowty
Using Spark shell:

scala> import scala.coll…

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Ted Yu
… MutableList()

scala> Range(0,1).foreach(i=>lst+=(("10","10",i:Double))) …

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Ashish Shrowty
… scala> rdd.count() …

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Ted Yu
… :30, took 0.478350 s
res1: Long = 1

Ashish:
Please refine your ex…

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Ashish Shrowty
… That can't cause any error, since there is no action in your …

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Sean Owen
… You must be executing something different. …
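Sean's point is that map is lazy: the quoted snippet alone ships nothing to executors, so on that reading it cannot throw by itself. A hedged illustration of that reading, reusing the rdd and a defined earlier in the thread (count is one action that forces the job to run):

    // map merely records a transformation; no task is executed yet.
    val mapped = rdd.map(r => if (a == 10) 1 else 0)
    // An action runs the job, which is when shipping the closure can fail.
    mapped.count()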

Re: Spark shell and StackOverFlowError

2015-08-31 Thread Sean Owen
… get a StackOverFlowError each time I try to run the following code (the code itself …

Re: Spark shell and StackOverFlowError

2015-08-30 Thread Sean Owen
… val a=10
rdd.map(r => if (a==10) 1 else 0)

This throws:

java.lang.StackOverflowError
at java.io.Obje…
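Pieced together from the quoted fragments, a minimal sketch of the reported reproduction (the Range upper bound shows as 1 in the archive and may have been larger in the original mail; exact line numbers in the trace will vary by JVM):

    import scala.collection.mutable.MutableList

    // Build a local collection, turn it into an RDD, then reference a
    // shell-defined variable inside the closure passed to map.
    val lst = MutableList[(String, String, Double)]()
    Range(0, 1).foreach(i => lst += (("10", "10", i: Double)))
    val rdd = sc.makeRDD(lst)

    val a = 10
    rdd.map(r => if (a == 10) 1 else 0)  // reportedly throws StackOverflowError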

Re: Spark shell and StackOverFlowError

2015-08-30 Thread Ashish Shrowty
…treamClass.lookup(ObjectStreamClass.java:318)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1133)
at java.io.ObjectOutputStream.de…

Re: Spark shell and StackOverFlowError

2015-08-30 Thread Ashish Shrowty
…ta(ObjectOutputStream.java:1508)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177) …
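The frames cycling through writeObject0 and writeOrdinaryObject are what Java serialization produces while descending a very deep object graph; in the shell, each REPL line becomes a wrapper object, and a captured variable can pull a chain of those wrappers into the closure. A hedged diagnostic, independent of Spark, is to serialize a function value by hand (probeSerializable is a helper name introduced here):

    import java.io.{ByteArrayOutputStream, ObjectOutputStream}

    // Serialize a closure with plain Java serialization, roughly what
    // Spark's JavaSerializer does when shipping a task.
    def probeSerializable(f: Int => Int): Int = {
      val bytes = new ByteArrayOutputStream()
      val out = new ObjectOutputStream(bytes)
      out.writeObject(f)  // may throw StackOverflowError or NotSerializableException
      out.close()
      bytes.size          // serialized size, if it succeeds
    }

    val a = 10
    probeSerializable(i => if (a == 10) 1 else 0)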

Re: Spark shell and StackOverFlowError

2015-08-30 Thread Ted Yu
…ObjectOutputStream.java:1508)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
...
...

More experiments: this works …

Re: Spark shell and StackOverFlowError

2015-08-30 Thread Ted Yu
… 1 else 0)

But below doesn't, and throws the StackOverflowError:

val lst = MutableList[(String,String,Double)]()
Range(0,1).foreach(i=>lst+=(("10","10",i:Double)))
sc.makeRDD(lst).map(i=> if(a==10) 1 else 0)

Re: Spark shell and StackOverFlowError

2015-08-30 Thread Sean Owen
… 0)

But below doesn't, and throws the StackOverflowError:

val lst = MutableList[(String,String,Double)]()
Range(0,1).foreach(i=>lst+=(("10","10",i:Double)))
sc.makeRDD(lst).map(i=> if(a==10) 1 else 0)

Any help appreciated! …
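A common workaround for closures that accidentally capture their enclosing REPL line object (an inference from the symptoms, not a fix confirmed in the surviving fragments) is to copy the variable into a fresh local val, so the closure captures only the value (localA is a name introduced for this sketch):

    val a = 10
    val result = {
      val localA = a  // capture just the Int, not the REPL wrapper that holds `a`
      sc.makeRDD(lst).map(i => if (localA == 10) 1 else 0)
    }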

Spark shell and StackOverFlowError

2015-08-29 Thread ashrowty
… val lst = MutableList[(String,String,Double)]()
Range(0,1).foreach(i=>lst+=(("10","10",i:Double)))
sc.makeRDD(lst).map(i=> if(a==10) 1 else 0)

Any help appreciated!

Thanks,
Ashish

-- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-shell-and-StackOverFlowError-tp24508.html
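Another hedged alternative, if the goal is simply to make a small constant visible to tasks, is a broadcast variable (standard SparkContext API): it is shipped once, explicitly, instead of being dragged in through the closure (aBc is a name introduced for this sketch):

    val aBc = sc.broadcast(10)
    val counts = sc.makeRDD(lst).map(i => if (aBc.value == 10) 1 else 0)
    counts.count()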