Re: Is there a way to create a SparkContext object?
SparkContext is not serializable, so you can't send it across the cluster the way rdd.map(t => compute(sc, t._2)) would require. There is likely a way to express what you're trying to do with an algorithm that doesn't require serializing SparkContext. Can you tell us more about your goals?

Andrew

On Tue, May 13, 2014 at 2:14 AM, yh18190 wrote:
> Thanks Matei Zaharia. Can I pass it as a parameter as part of a closure, for
> example RDD.map(t => compute(sc, t._2))? Can I use sc inside the map function?
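To make the point above concrete, here is a sketch of the restructuring Andrew is suggesting (the compute function and the data are hypothetical, and running this requires a Spark runtime on the classpath): keep SparkContext out of the closure entirely, and have the per-record function take only serializable values.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ClosureExample {
  // Hypothetical per-record function: note it takes no SparkContext,
  // only plain serializable data.
  def compute(v: String): String = v.toUpperCase

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("closure-example").setMaster("local[*]"))

    val rdd = sc.makeRDD(Seq((1, "a"), (2, "b")))

    // Wrong: rdd.map(t => compute(sc, t._2)) -- the closure would capture sc,
    // and the job would fail when Spark tries to serialize it for the executors.

    // Right: the closure only references serializable values; sc stays on the driver.
    val result = rdd.map { case (_, v) => compute(v) }

    result.collect().foreach(println)
    sc.stop()
  }
}
```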
Re: Is there a way to create a SparkContext object?
Thanks Matei Zaharia. Can I pass it as a parameter as part of a closure? For example:

RDD.map(t => compute(sc, t._2))

Can I use sc inside the map function? Please let me know.

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Is-their-a-way-to-Create-SparkContext-object-tp5612p5647.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Is there a way to create a SparkContext object?
Hi,

Could anyone suggest how we can create a SparkContext object in other classes or functions where we need to convert a Scala collection to an RDD using the sc object, e.g. sc.makeRDD(list), instead of using the main class's SparkContext object? Is there a way to pass the sc object as a parameter to functions in other classes?

Please let me know.

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Is-their-a-way-to-Create-SparkContext-object-tp5612.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: Is there a way to create a SparkContext object?
You can just pass it around as a parameter.

On May 12, 2014, at 12:37 PM, yh18190 wrote:
> Hi,
>
> Could anyone suggest how we can create a SparkContext object in other
> classes or functions where we need to convert a Scala collection to an RDD
> using the sc object, e.g. sc.makeRDD(list), instead of using the main
> class's SparkContext object? Is there a way to pass the sc object as a
> parameter to functions in other classes?
> Please let me know
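Passing the context around as a plain constructor or method parameter might look like the sketch below (the class and method names are hypothetical, and running it requires a Spark runtime). The key constraint is that such a helper may only be used in driver-side code; its methods must never be called from inside an RDD closure, for the serialization reason discussed above.

```scala
import scala.reflect.ClassTag

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Hypothetical helper class that receives the driver's SparkContext
// through its constructor instead of creating its own.
class CollectionLoader(sc: SparkContext) {
  // Converts any Scala collection into an RDD, driver-side only.
  def toRDD[T: ClassTag](list: Seq[T]): RDD[T] =
    sc.makeRDD(list)
}

// Usage from the main class that owns the context:
//   val loader = new CollectionLoader(sc)
//   val rdd = loader.toRDD(List(1, 2, 3))
```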