michael procopio created SPARK-19030:
----------------------------------------
             Summary: Dropped event errors being reported after SparkContext has been stopped
                 Key: SPARK-19030
                 URL: https://issues.apache.org/jira/browse/SPARK-19030
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.0.2
         Environment: Debian 8, using spark-submit with MATLAB integration; the Spark code is written in Java.
            Reporter: michael procopio
            Priority: Minor

After stop() has been called on SparkContext, errors are still being reported:

16/12/29 15:54:04 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerExecutorMetricsUpdate(2,WrappedArray())

The stack in the heartbeat thread at the point where the error is thrown is:

Daemon Thread [heartbeat-receiver-event-loop-thread] (Suspended (breakpoint at line 124 in LiveListenerBus))
	LiveListenerBus.post(SparkListenerEvent) line: 124
	DAGScheduler.executorHeartbeatReceived(String, Tuple4<Object,Object,Object,Seq<AccumulableInfo>>[], BlockManagerId) line: 228
	YarnScheduler(TaskSchedulerImpl).executorHeartbeatReceived(String, Tuple2<Object,Seq<AccumulatorV2<?,?>>>[], BlockManagerId) line: 402
	HeartbeatReceiver$$anonfun$receiveAndReply$1$$anon$2$$anonfun$run$2.apply$mcV$sp() line: 128
	Utils$.tryLogNonFatalError(Function0<BoxedUnit>) line: 1290
	HeartbeatReceiver$$anonfun$receiveAndReply$1$$anon$2.run() line: 127
	Executors$RunnableAdapter<T>.call() line: 511
	ScheduledThreadPoolExecutor$ScheduledFutureTask<V>(FutureTask<V>).run() line: 266
	ScheduledThreadPoolExecutor$ScheduledFutureTask<V>.access$201(ScheduledThreadPoolExecutor$ScheduledFutureTask) line: 180
	ScheduledThreadPoolExecutor$ScheduledFutureTask<V>.run() line: 293
	ScheduledThreadPoolExecutor(ThreadPoolExecutor).runWorker(ThreadPoolExecutor$Worker) line: 1142
	ThreadPoolExecutor$Worker.run() line: 617
	Thread.run() line: 745

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
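For illustration, the shutdown race the stack trace shows can be modeled outside Spark: a bus that drops events once stopped, plus a producer (the heartbeat) that has no way of knowing the bus is gone. This is a hypothetical `ToyListenerBus`, not Spark's actual `LiveListenerBus`; it only sketches why the ERROR appears after stop() under that assumption.

```java
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical minimal model of the race in SPARK-19030.
class ToyListenerBus {
    private final AtomicBoolean stopped = new AtomicBoolean(false);
    final ConcurrentLinkedQueue<String> events = new ConcurrentLinkedQueue<>();
    final AtomicInteger dropped = new AtomicInteger(0);

    // Returns true if the event was queued; once stop() has run, events are
    // dropped and counted (Spark instead logs the
    // "SparkListenerBus has already stopped! Dropping event ..." ERROR here).
    boolean post(String event) {
        if (stopped.get()) {
            dropped.incrementAndGet();
            return false;
        }
        events.add(event);
        return true;
    }

    void stop() {
        stopped.set(true);
    }

    public static void main(String[] args) {
        ToyListenerBus bus = new ToyListenerBus();
        // Heartbeat fires while the bus is live: event is delivered.
        bus.post("SparkListenerExecutorMetricsUpdate(2,WrappedArray())");
        // SparkContext.stop() stops the bus, but nothing cancels the heartbeat.
        bus.stop();
        // Next heartbeat tick posts into a stopped bus: event is dropped.
        boolean delivered = bus.post("SparkListenerExecutorMetricsUpdate(2,WrappedArray())");
        System.out.println("delivered=" + delivered + " dropped=" + bus.dropped.get());
    }
}
```

The sketch suggests why the message is arguably cosmetic: the heartbeat scheduler keeps running briefly after stop(), so posting into the stopped bus is expected unless the producer is cancelled before (or together with) the bus.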