You can try calling the stop() method on your SparkContext at the end of
your main method (instead of System.exit(0)), which should allow for a
cleaner shutdown.
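For example, a minimal driver might look like the following sketch. The
master URL, app name, and RDD are illustrative placeholders, not taken
from your setup:

```scala
import org.apache.spark.SparkContext

object CountJob {
  def main(args: Array[String]): Unit = {
    // Placeholder master URL and app name for a standalone cluster.
    val sc = new SparkContext("spark://master:7077", "CountJob")
    try {
      val count = sc.parallelize(1 to 100).count()
      println(s"count = $count")
    } finally {
      // Stopping the context lets the master and executors tear down
      // cleanly, so the driver exiting afterwards should not produce
      // the "Driver terminated or disconnected" errors.
      sc.stop()
    }
  }
}
```

With sc.stop() in place, a plain return from main (or even a subsequent
System.exit(0)) should leave the application marked FINISHED rather than
KILLED in the UI.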

On Thu, Oct 17, 2013 at 2:38 PM, Ameet Kini <[email protected]> wrote:
>
> I'm using the scala 2.10 branch of Spark in standalone mode, and I see
> the job report itself as KILLED in the UI with the below message in each
> of the executors' logs, even though the job processes correctly and
> returns the correct result. The job is triggered by a .count on an RDD
> and the count seems right. The only thing I can think of is that I'm
> calling System.exit(0) at the end of the main method. If I remove that
> call, I don't see the below message, but the job hangs and the UI
> reports it as still running.
>
> 13/10/17 15:31:52 INFO actor.LocalActorRef: Message
> [akka.remote.transport.AssociationHandle$Disassociated] from
> Actor[akka://spark/deadLetters] to
> Actor[akka://spark/system/transports/akkaprotocolmanager.tcp1/akkaProtocol-tcp%3A%2F%2Fspark%40ec2-cdh4u2-dev-master.geoeyeanalytics.ec2%3A47366-1#136073268]
> was not delivered. [1] dead letters encountered. This logging can be turned
> off or adjusted with configuration settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
> 13/10/17 15:31:52 ERROR executor.StandaloneExecutorBackend: Driver
> terminated or disconnected! Shutting down.
> 13/10/17 15:31:52 INFO actor.LocalActorRef: Message
> [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from
> Actor[akka://spark/deadLetters] to
> Actor[akka://spark/system/transports/akkaprotocolmanager.tcp1/akkaProtocol-tcp%3A%2F%2Fspark%40ec2-cdh4u2-dev-master.geoeyeanalytics.ec2%3A47366-1#136073268]
> was not delivered. [2] dead letters encountered. This logging can be turned
> off or adjusted with configuration settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
> 13/10/17 15:31:52 INFO actor.LocalActorRef: Message
> [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from
> Actor[akka://sparkExecutor/deadLetters] to
> Actor[akka://sparkExecutor/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2Fspark%40ec2-cdh4u2-dev-master.geoeyeanalytics.ec2%3A47366-1#593252773]
> was not delivered. [1] dead letters encountered. This logging can be turned
> off or adjusted with configuration settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
> 13/10/17 15:31:52 ERROR remote.EndpointWriter: AssociationError
> [akka.tcp://sparkExecutor@ec2-cdh4u2-dev-slave1:46566] ->
> [akka.tcp://spark@ec2-cdh4u2-dev-master:47366]: Error [Association failed
> with [akka.tcp://[email protected]:47366]] [
> akka.remote.EndpointAssociationException: Association failed with
> [akka.tcp://spark@ec2-cdh4u2-dev-master:47366]
