OK.
I modified the code to remove sc: SparkContext is never serializable, so it
must never be passed to (or captured by) closures handed to map.
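For anyone hitting the same error, the pattern looks roughly like this (a
minimal sketch, not the actual code from the thread; I'm using RDD[String] in
place of DetailInputRecord, and the method names are illustrative):

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object SparkContextClosureExample {

  // Broken: the closure passed to map references sc, so Spark tries to
  // serialize the SparkContext along with the task and throws
  // java.io.NotSerializableException: org.apache.spark.SparkContext.
  def broken(records: RDD[String], sc: SparkContext): RDD[Int] =
    records.map(r => r.length + sc.defaultParallelism)

  // Fixed: read whatever you need from sc on the driver first; the closure
  // then captures only a plain Int, which is serializable.
  def fixed(records: RDD[String], sc: SparkContext): RDD[Int] = {
    val parallelism = sc.defaultParallelism // evaluated on the driver
    records.map(r => r.length + parallelism)
  }
}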

On Thu, Jun 25, 2015 at 11:11 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> Spark Version: 1.3.1
>
> How can SparkContext not be serializable?
> Any suggestions to resolve this issue?
>
> I included a trait + an implementation (the implementation has a method that
> takes sc as an argument), and I started seeing this exception:
>
>
> trait DetailDataProvider[T1 <: Data] extends java.io.Serializable {
>
>   def getData(sessions: RDD[DetailInputRecord], startDate: Date,
>               endDate: Date, sc: SparkContext): RDD[(DetailInputRecord, T1)]
>
> }
>
>
> class ViewItemCountMetrics extends DetailMetricProvider[ViewItemCountMetric] {
>
>   def getMetrics(details: List[(DetailInputRecord, DataRecord)],
>                  sc: SparkContext) = {
>
>     val totalViCount = details.size.toLong
>
>     val uniqueViCount =
>       details.map(_._1.get("itemId").asInstanceOf[Long]).distinct.size.toLong
>
>     new ViewItemCountMetric(totalViCount, uniqueViCount)
>
>   }
>
> }
>
> Any suggestions?
>
> Logs
> ====
> 15/06/25 11:06:58 ERROR yarn.ApplicationMaster: User class threw exception: Task not serializable
> org.apache.spark.SparkException: Task not serializable
> at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
> at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
> at org.apache.spark.SparkContext.clean(SparkContext.scala:1623)
> at org.apache.spark.rdd.RDD.map(RDD.scala:286)
> at com.ebay.ep.poc.spark.reporting.process.detail.AbstractInputHelper.processRecords(AbstractInputHelper.scala:126)
> at com.ebay.ep.poc.spark.reporting.process.detail.AbstractInputHelper.execute(AbstractInputHelper.scala:82)
> at com.ebay.ep.poc.spark.reporting.process.service.VIDetailService.execute(VIDetailService.scala:26)
> at com.ebay.ep.poc.spark.reporting.SparkApp$.main(SparkApp.scala:50)
> at com.ebay.ep.poc.spark.reporting.SparkApp.main(SparkApp.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
> Caused by: java.io.NotSerializableException: org.apache.spark.SparkContext
> Serialization stack:
> - object not serializable (class: org.apache.spark.SparkContext, value: org.apache.spark.SparkContext@4f4da28b)
> - field (class: com.ebay.ep.poc.spark.reporting.process.detail.AbstractInputHelper$$anonfun$4, name: sc$1, type: class org.apache.spark.SparkContext)
> - object (class com.ebay.ep.poc.spark.reporting.process.detail.AbstractInputHelper$$anonfun$4, <function1>)
> at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:38)
> at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
> at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:80)
> at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:164)
> ... 13 more
> 15/06/25 11:06:58 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable)
> 15/06/25 11:06:58 INFO yarn.ApplicationMaster: Invoking sc stop from shutdown hook
>
>
> --
> Deepak
>
>


-- 
Deepak
