Which version of Java are you using?

And which release of Spark, please?

Thanks
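
In the meantime, one common cause of this recursive StateDStream.compute / DStream.getOrCompute pattern is that the DStream graph is being rebuilt outside the checkpoint-aware factory, so recovery replays a very deep lineage. A minimal sketch of the usual driver layout, assuming the app name, batch interval, and checkpoint path below are placeholders for your own values:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object CheckpointRecoverySketch {
  // Hypothetical checkpoint directory; replace with your own HDFS/S3 path.
  val checkpointDir = "hdfs:///tmp/streaming-checkpoint"

  // Factory invoked only when no checkpoint exists. All DStream setup
  // (sources, updateStateByKey, outputs) must happen inside it so that
  // recovery rebuilds exactly the same graph from the checkpoint.
  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("CheckpointRecoverySketch")
    val ssc = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint(checkpointDir)
    // ... define sources, stateful transformations, and outputs here ...
    ssc
  }

  def main(args: Array[String]): Unit = {
    // getOrCreate recovers from the checkpoint if one is present,
    // otherwise builds a fresh context via the factory above.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}
```

If the trace still recurs with this layout, increasing the driver's JVM stack size (e.g. `spark-submit --driver-java-options "-Xss4m" ...`) is a common stopgap while the deep-lineage cause is investigated.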

On Fri, Sep 18, 2015 at 9:15 AM, swetha <swethakasire...@gmail.com> wrote:

> Hi,
>
> When I try to recover my Spark Streaming job from a checkpoint directory, I
> get a StackOverflowError as shown below. Any idea as to why this is
> happening?
>
> 15/09/18 09:02:20 ERROR streaming.StreamingContext: Error starting the context, marking it as stopped
> java.lang.StackOverflowError
>         at java.util.Date.getTimeImpl(Date.java:887)
>         at java.util.Date.getTime(Date.java:883)
>         at java.util.Calendar.setTime(Calendar.java:1106)
>         at java.text.SimpleDateFormat.format(SimpleDateFormat.java:955)
>         at java.text.SimpleDateFormat.format(SimpleDateFormat.java:948)
>         at java.text.DateFormat.format(DateFormat.java:298)
>         at java.text.Format.format(Format.java:157)
>         at org.apache.spark.streaming.ui.UIUtils$.formatBatchTime(UIUtils.scala:113)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$makeScope$1.apply(DStream.scala:137)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$makeScope$1.apply(DStream.scala:136)
>         at scala.Option.map(Option.scala:145)
>         at org.apache.spark.streaming.dstream.DStream.makeScope(DStream.scala:136)
>         at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:394)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:344)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:342)
>         at scala.Option.orElse(Option.scala:257)
>         at org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:339)
>         at org.apache.spark.streaming.dstream.StateDStream.compute(StateDStream.scala:67)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:350)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:350)
>         at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:349)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:349)
>         at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:344)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:342)
>         at scala.Option.orElse(Option.scala:257)
>         at org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:339)
>         at org.apache.spark.streaming.dstream.StateDStream.compute(StateDStream.scala:67)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:350)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:350)
>         at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:349)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:349)
>         at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:344)
>         at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:342)
>         at scala.Option.orElse(Option.scala:257)
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-checkpoint-recovery-throws-Stack-Overflow-Error-tp24737.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
