Hi,

I cannot reproduce your error on Spark 1.2.1, and there is not enough
information here to go on. What command line arguments are you using when
you start spark-shell? What data are you reading? etc.
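For what it's worth, the trace below shows Java serialization recursing while
the map closure is being serialized. A common culprit in spark-shell is the
closure capturing a top-level REPL val, which can pull the shell's generated
wrapper objects into the serialized graph. A minimal sketch of a workaround
you could try (the names are illustrative, not a confirmed fix):

  // Copy the shell-level value into a method-local val so the closure
  // captures only that local, not the REPL wrapper object holding `a`.
  def countOnes(rdd: org.apache.spark.rdd.RDD[(String, String, Double)]): Long = {
    val local = 10                                    // local copy of `a`
    rdd.map(r => if (local == 10) 1 else 0).count()   // count() forces serialization
  }

If that changes nothing, you could also try a larger driver stack, e.g.
starting the shell with --driver-java-options "-Xss4m", to see whether the
overflow is just a deep but finite object graph.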
>>>>> ... error happens when I try to pass a variable into the closure. The
>>>>> example you have ...
>>>> Using Spark shell :
>>>>
>>>> scala> import scala.collection.mutable.MutableList
>>>> scala> val lst = MutableList[(String,String,Double)]()
>>>> scala> Range(0,1).foreach(i=>lst+=(("10","10",i:Double)))
>>>> scala> val rdd = sc.makeRDD(lst)
>>>> scala> rdd.count()
>>>> ... :30, took 0.478350 s
>>>> res1: Long = 1
>>>>
>>>> Ashish:
>>>> Please refine your example.
>>> That can't cause any error, since there is no action in your code. You
>>> must be executing something different.
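>>> For instance (an illustrative sketch, using `rdd` and `a` from your mail):
>>>
>>>   val mapped = rdd.map(r => if (a == 10) 1 else 0) // lazy: no job runs, nothing is serialized
>>>   mapped.count()                                   // only an action like count() serializes and ships the closure
>>>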
>>>> I get a StackOverFlowError each time I try to run the following code
>>>> (the code itself ...):
>>>>
>>>> val a=10
>>>> rdd.map(r => if (a==10) 1 else 0)
>>>>
>>>> This throws -
>>>>
>>>> java.lang.StackOverflowError
>>>>     at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:318)
>>>>     at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1133)
>>>>     at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:...)
>>>>     at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
>>>>     at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
>>>>     at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
>>>>     at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
>>>>     at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
>>>>     ...
>>>>     ...

More experiments .. this works -

... if(a==10) 1 else 0)

But below doesn't and throws the StackOverflowError -

val lst = MutableList[(String,String,Double)]()
Range(0,1).foreach(i=>lst+=(("10","10",i:Double)))
sc.makeRDD(lst).map(i=> if(a==10) 1 else 0)

Any help appreciated!

Thanks,
Ashish
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-shell-and-StackOverFlowError-tp24508.html