One month later, the same problem. I think that someone (e.g. the Spark
developers) should show us a complete example of how to use accumulators. To
start, we need to see an example of the following form:

val accum = sc.accumulator(0)
// map is lazy, so an action such as count() is needed before foo ever runs
sc.parallelize(Array(1, 2, 3, 4)).map(x => foo(x, accum)).count()

Passing accum as a parameter to the function foo requires it to be
serializable, but, as far as I know, any accumulator encapsulates the Spark
context sc, which is not serializable and which leads to a
"java.io.NotSerializableException: SparkContext" exception.
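For what it is worth, the usual way a serializable wrapper avoids dragging a
non-serializable handle across the wire is to mark that field @transient, so
it is simply dropped from the serialized form. The following is a minimal
plain-Scala sketch of that mechanism (Context and Wrapper are made-up names
standing in for SparkContext and an accumulator-like object; this is not
Spark's actual implementation):

```scala
import java.io._

// Stand-in for a non-serializable handle such as SparkContext.
class Context  // deliberately NOT Serializable

// @transient keeps the context out of the serialized form, so the
// wrapper itself can be shipped even though its context cannot.
class Wrapper(@transient val ctx: Context, var value: Int) extends Serializable

def roundTrip[T](obj: T): T = {
  val buf = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buf)
  out.writeObject(obj)
  out.close()
  new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    .readObject().asInstanceOf[T]
}

val copy = roundTrip(new Wrapper(new Context, 42))
println(copy.value)        // the payload survives the round trip
println(copy.ctx == null)  // the transient context was dropped
```

If accumulators really did hold a plain reference to sc, a pattern like this
is what one would expect to see in their implementation.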

I am really curious to see a real application that uses accumulators.
Otherwise, the Spark developers would have to change the accumulator code so
that the above issue no longer appears.
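For reference, the pattern the Spark programming guide suggests looks roughly
like the following: update the accumulator inside the closure itself and read
its value only on the driver after an action. This is an untested sketch
assuming the Spark 1.x API; badLines and parse are hypothetical names:

```scala
val badLines = sc.accumulator(0)          // driver-side counter
val parsed = sc.textFile("data.txt").flatMap { line =>
  parse(line) match {                     // parse is a hypothetical parser
    case Some(record) => Seq(record)
    case None         => badLines += 1; Seq.empty
  }
}
parsed.count()                            // an action forces the updates
println(badLines.value)                   // read the total on the driver
```

Note that the accumulator is referenced directly from the closure rather than
threaded through a separately serialized object, which may be why this idiom
sidesteps the exception described above.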

Best,



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p19567.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
