[ 
https://issues.apache.org/jira/browse/SPARK-11065?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-11065:
------------------------------
    Priority: Minor  (was: Major)

I am not sure this is actually a problem; during shutdown, some components may 
stop being able to talk to others that are also shutting down. Was the result 
otherwise a success?

Certainly, if the noise can be reliably avoided, that's best.
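
One way the noise could be avoided is to treat a send failure as expected once shutdown has begun, and downgrade it from ERROR to a quieter log level. The sketch below is purely illustrative; none of these names come from the Spark codebase, and a real fix would live inside NettyRpcEnv's send path:

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * Hypothetical sketch of a best-effort "fire and forget" send that
 * tolerates an IOException once shutdown has started (the peer may
 * already have closed the connection), instead of logging an ERROR.
 */
public class BestEffortShutdownSend {
    private final AtomicBoolean shuttingDown = new AtomicBoolean(false);

    /** Marks this component as shutting down. */
    public void beginShutdown() {
        shuttingDown.set(true);
    }

    /**
     * Attempts a send. During shutdown, an IOException from the
     * transport is expected and reported quietly; outside shutdown
     * it is still surfaced. Returns true if the send succeeded.
     */
    public boolean trySend(Runnable send) {
        try {
            send.run();
            return true;
        } catch (RuntimeException e) {
            if (shuttingDown.get() && e.getCause() instanceof IOException) {
                // Expected during teardown: the remote end is gone.
                System.out.println("DEBUG: ignoring send failure during shutdown: "
                        + e.getCause().getMessage());
                return false;
            }
            throw e; // not shutting down: this is a real error
        }
    }

    public static void main(String[] args) {
        BestEffortShutdownSend sender = new BestEffortShutdownSend();
        sender.beginShutdown();
        boolean ok = sender.trySend(() -> {
            throw new RuntimeException(
                new IOException("Connection from localhost/127.0.0.1:7077 closed"));
        });
        System.out.println("sent=" + ok);
    }
}
```

This keeps a genuine connection failure loud when the application is still running, while silencing the expected churn once teardown is underway.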

> IOException thrown at job submit shutdown
> -----------------------------------------
>
>                 Key: SPARK-11065
>                 URL: https://issues.apache.org/jira/browse/SPARK-11065
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>            Reporter: Jean-Baptiste Onofré
>            Priority: Minor
>
> When submitted a job (for instance JavaWordCount example), even if the job 
> works fine, at the end of execution, we can see:
> {code}
> 15/10/12 16:31:12 INFO SparkUI: Stopped Spark web UI at 
> http://192.168.134.10:4040
> 15/10/12 16:31:12 INFO DAGScheduler: Stopping DAGScheduler
> 15/10/12 16:31:12 INFO SparkDeploySchedulerBackend: Shutting down all 
> executors
> 15/10/12 16:31:12 INFO SparkDeploySchedulerBackend: Asking each executor to 
> shut down
> 15/10/12 16:31:12 INFO MapOutputTrackerMasterEndpoint: 
> MapOutputTrackerMasterEndpoint stopped!
> 15/10/12 16:31:12 INFO MemoryStore: MemoryStore cleared
> 15/10/12 16:31:12 INFO BlockManager: BlockManager stopped
> 15/10/12 16:31:12 INFO BlockManagerMaster: BlockManagerMaster stopped
> 15/10/12 16:31:12 INFO 
> OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
> OutputCommitCoordinator stopped!
> 15/10/12 16:31:12 ERROR TransportResponseHandler: Still have 1 requests 
> outstanding when connection from localhost/127.0.0.1:7077 is closed
> 15/10/12 16:31:12 ERROR NettyRpcEnv: Exception when sending 
> RequestMessage(192.168.134.10:40548,NettyRpcEndpointRef(spark://Master@localhost:7077),UnregisterApplication(app-20151012163109-0000),false)
> java.io.IOException: Connection from localhost/127.0.0.1:7077 closed
>         at 
> org.apache.spark.network.client.TransportResponseHandler.channelUnregistered(TransportResponseHandler.java:104)
>         at 
> org.apache.spark.network.server.TransportChannelHandler.channelUnregistered(TransportChannelHandler.java:91)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at 
> io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at 
> io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at 
> io.netty.channel.ChannelInboundHandlerAdapter.channelUnregistered(ChannelInboundHandlerAdapter.java:53)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelUnregistered(AbstractChannelHandlerContext.java:158)
>         at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelUnregistered(AbstractChannelHandlerContext.java:144)
>         at 
> io.netty.channel.DefaultChannelPipeline.fireChannelUnregistered(DefaultChannelPipeline.java:739)
>         at 
> io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:659)
>         at 
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>         at 
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>         at java.lang.Thread.run(Thread.java:745)
> 15/10/12 16:31:12 INFO RemoteActorRefProvider$RemotingTerminator: Shutting 
> down remote daemon.
> 15/10/12 16:31:12 INFO RemoteActorRefProvider$RemotingTerminator: Remote 
> daemon shut down; proceeding with flushing remote transports.
> 15/10/12 16:31:12 INFO SparkContext: Successfully stopped SparkContext
> 15/10/12 16:31:12 INFO ShutdownHookManager: Shutdown hook called
> 15/10/12 16:31:12 INFO ShutdownHookManager: Deleting directory 
> /tmp/spark-81bc4324-1268-4e54-bdd2-f7a2a36dafd4
> {code}
> I am going to investigate this and will submit a PR.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
