Can you show the code snippet and the full stack trace for the 'Task is not
serializable' exception?
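
One general note in the meantime: Kryo applies to data serialization, but
closures are still serialized with Java serialization, so 'Task is not
serializable' usually means a lambda or anonymous class is capturing a
non-serializable enclosing object (your Spring bean, for example). A common
fix, sketched here with a hypothetical bean rather than your actual code, is
to copy whatever the closure needs into a local variable first:

    import org.apache.spark.api.java.JavaRDD;

    public class EventService {            // Spring bean; not Serializable
        private String prefix = "tag:";    // hypothetical field

        public JavaRDD<String> tag(JavaRDD<String> rdd) {
            // Copy the field into a local so the lambda captures only the
            // String, not 'this' (the non-serializable bean).
            final String p = prefix;
            return rdd.map(line -> p + line);
        }
    }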

Please see the related JIRA:
  SPARK-10251
whose pull request contains code for registering classes with Kryo.
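
For reference, registration generally looks like this (a minimal sketch;
MyEvent and MyResult are placeholders for whatever classes you actually ship
through your RDDs):

    import org.apache.spark.SparkConf;

    SparkConf conf = new SparkConf();
    conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
    // Registering classes up front lets Kryo write a compact ID instead of
    // the full class name with every serialized object.
    conf.registerKryoClasses(new Class<?>[]{ MyEvent.class, MyResult.class });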

Cheers

On Tue, Mar 22, 2016 at 7:00 AM, Hafsa Asif <hafsa.a...@matchinguu.com>
wrote:

> Hello,
> I am facing a serialization issue in Spark (1.4.1, Java client) with the
> Spring Framework. Spark needs to serialize the objects it ships to
> executors, and by default every such class must implement
> java.io.Serializable. But the documentation at
> http://spark.apache.org/docs/latest/tuning.html mentions that this is not
> the best approach and recommends using Kryo instead.
> I am using Kryo in my Spark configuration like this:
>     public @Bean DeepSparkContext sparkContext() {
>         DeepSparkConfig conf = new DeepSparkConfig();
>         conf.setAppName(this.environment.getProperty("APP_NAME"))
>             .setMaster(master)
>             .set("spark.executor.memory",
>                  this.environment.getProperty("SPARK_EXECUTOR_MEMORY"))
>             .set("spark.cores.max",
>                  this.environment.getProperty("SPARK_CORES_MAX"))
>             .set("spark.default.parallelism",
>                  this.environment.getProperty("SPARK_DEFAULT_PARALLELISM"));
>         conf.set("spark.serializer",
>                  "org.apache.spark.serializer.KryoSerializer");
>         return new DeepSparkContext(conf);
>     }
>
> but I am still getting the 'Task is not serializable' exception from
> Spark. I also do not want to make the Spark context 'static'.
