Just to mark this question closed: we experienced an OOM exception on
the Master. It wasn't visible from the Driver, but it made the Master
crash (hence the "All masters are unresponsive" error below).
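
In case anyone hits the same problem: the standalone Master runs in its
own JVM, and its heap is controlled by SPARK_DAEMON_MEMORY in
conf/spark-env.sh (default 1g). A minimal sketch of raising it; the 2g
value here is only an example, not a recommendation:

    # conf/spark-env.sh on the Master host
    # SPARK_DAEMON_MEMORY sets the heap of the standalone Master and
    # Worker daemons (default: 1g)
    export SPARK_DAEMON_MEMORY=2g

The Master needs to be restarted afterwards (sbin/stop-master.sh, then
sbin/start-master.sh) for the new value to take effect.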

On 24.03.2016 at 09:54, Max Schmidt wrote:
> Hi there,
>
> we're using the Java API (1.6.0) with a ScheduledExecutor that
> repeatedly submits a Spark job to a standalone cluster.
>
> After each job we close the JavaSparkContext and create a new one.
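>
> In case it helps to see the pattern, roughly like this (a simplified
> sketch; the class and method names are made up for illustration, not
> our actual code):
>
> import java.util.concurrent.Executors;
> import java.util.concurrent.ScheduledExecutorService;
> import java.util.concurrent.TimeUnit;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
>
> public class SparkJobScheduler {
>     public static void main(String[] args) {
>         ScheduledExecutorService scheduler =
>                 Executors.newSingleThreadScheduledExecutor();
>         // Submit one Spark job per tick; a fresh context is created
>         // and stopped for every run.
>         scheduler.scheduleWithFixedDelay(() -> {
>             SparkConf conf = new SparkConf()
>                     .setMaster("spark://master-host:7077") // illustrative master URL
>                     .setAppName("measurement-job");
>             JavaSparkContext sc = new JavaSparkContext(conf);
>             try {
>                 runJob(sc); // placeholder for the actual job body
>             } finally {
>                 sc.stop(); // tear the context down even if the job fails
>             }
>         }, 0, 5, TimeUnit.MINUTES);
>     }
>
>     private static void runJob(JavaSparkContext sc) {
>         // actual Spark work would go here
>     }
> }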
>
> But sometimes the scheduling JVM crashes with:
>
> 24.03.2016-08:30:27:375# error - Application has been killed. Reason: All masters are unresponsive! Giving up.
> 24.03.2016-08:30:27:398# error - Error initializing SparkContext.
> java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
> This stopped SparkContext was created at:
>
> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
> io.datapath.spark.AbstractSparkJob.createJavaSparkContext(AbstractSparkJob.java:53)
> io.datapath.measurement.SparkJobMeasurements.work(SparkJobMeasurements.java:130)
> io.datapath.measurement.SparkMeasurementScheduler.lambda$submitSparkJobMeasurement$30(SparkMeasurementScheduler.java:117)
> io.datapath.measurement.SparkMeasurementScheduler$$Lambda$17/1568787282.run(Unknown Source)
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> java.util.concurrent.FutureTask.run(FutureTask.java:266)
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> java.lang.Thread.run(Thread.java:745)
>
> The currently active SparkContext was created at:
>
> (No active SparkContext.)
>
>         at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:106)
>         at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1578)
>         at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2179)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
>         at io.datapath.spark.AbstractSparkJob.createJavaSparkContext(AbstractSparkJob.java:53)
>         at io.datapath.measurement.SparkJobMeasurements.work(SparkJobMeasurements.java:130)
>         at io.datapath.measurement.SparkMeasurementScheduler.lambda$submitSparkJobMeasurement$30(SparkMeasurementScheduler.java:117)
>         at io.datapath.measurement.SparkMeasurementScheduler$$Lambda$17/1568787282.run(Unknown Source)
>         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)
> 24.03.2016-08:30:27:402# info - SparkMeasurement - finished.
>
> Any guess?

-- 
*Max Schmidt, Senior Java Developer* | m...@datapath.io
<mailto:m...@datapath.io> | LinkedIn
<https://www.linkedin.com/in/maximilian-schmidt-9893b7bb/>
Datapath.io
 
Decreasing AWS latency.
Your traffic optimized.

Datapath.io GmbH
Mainz | HRB Nr. 46222
Sebastian Spies, CEO
