causevic commented on pull request #28437:
URL: https://github.com/apache/spark/pull/28437#issuecomment-1046097531


I see the same issue in Spark 3.2.1; the warning appears while stopping the SparkContext from PySpark:

```
22/02/19 20:24:47 WARN AbstractConnector:
java.io.IOException: No such file or directory
    at sun.nio.ch.NativeThread.signal(Native Method)
    at sun.nio.ch.ServerSocketChannelImpl.implCloseSelectableChannel(ServerSocketChannelImpl.java:291)
    at java.nio.channels.spi.AbstractSelectableChannel.implCloseChannel(AbstractSelectableChannel.java:241)
    at java.nio.channels.spi.AbstractInterruptibleChannel.close(AbstractInterruptibleChannel.java:115)
    at org.sparkproject.jetty.server.ServerConnector.close(ServerConnector.java:371)
    at org.sparkproject.jetty.server.AbstractNetworkConnector.shutdown(AbstractNetworkConnector.java:104)
    at org.sparkproject.jetty.server.Server.doStop(Server.java:444)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:94)
    at org.apache.spark.ui.ServerInfo.stop(JettyUtils.scala:525)
    at org.apache.spark.ui.WebUI.$anonfun$stop$2(WebUI.scala:174)
    at org.apache.spark.ui.WebUI.$anonfun$stop$2$adapted(WebUI.scala:174)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.spark.ui.WebUI.stop(WebUI.scala:174)
    at org.apache.spark.ui.SparkUI.stop(SparkUI.scala:101)
    at org.apache.spark.SparkContext.$anonfun$stop$6(SparkContext.scala:2071)
    at org.apache.spark.SparkContext.$anonfun$stop$6$adapted(SparkContext.scala:2071)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.spark.SparkContext.$anonfun$stop$5(SparkContext.scala:2071)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1442)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:2071)
    at org.apache.spark.api.java.JavaSparkContext.stop(JavaSparkContext.scala:550)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
    at java.lang.Thread.run(Thread.java:750)
22/02/19 20:25:47 WARN QueuedThreadPool: QueuedThreadPool[SparkUI]@dae963d{STOPPING,8<=0<=200,i=8,r=-1,q=0}[NO_TRY] Couldn't stop Thread[SparkUI-35-acceptor-0@29ff27df-ServerConnector@2738dc90{HTTP/1.1, (http/1.1)}{0.0.0.0:4040},3,main]
22/02/19 20:25:47 INFO SparkUI: Stopped Spark web UI at http://95ba7aa02538:4040
22/02/19 20:25:47 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/02/19 20:25:47 INFO MemoryStore: MemoryStore cleared
22/02/19 20:25:47 INFO BlockManager: BlockManager stopped
22/02/19 20:25:47 INFO BlockManagerMaster: BlockManagerMaster stopped
22/02/19 20:25:47 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/02/19 20:25:47 INFO SparkContext: Successfully stopped SparkContext
```
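Note that the failing `ServerConnector.close()` call is wrapped in `Utils.tryLogNonFatalError` (visible in the trace), which is why shutdown still completes and the log ends with `Successfully stopped SparkContext` — the `IOException` is logged as a WARN rather than propagated. A rough Python sketch of that log-and-continue pattern (hypothetical names, not Spark's actual code):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("shutdown")

def try_log_non_fatal_error(action, description):
    """Run one cleanup step; log any failure instead of aborting the whole shutdown.

    Mirrors the role of Utils.tryLogNonFatalError in the trace above.
    """
    try:
        action()
    except Exception:
        log.warning("Ignoring failure while %s", description, exc_info=True)

def stop_ui():
    # Stands in for SparkUI.stop() -> Jetty ServerConnector.close(),
    # which is what raised the IOException in the reported log.
    raise IOError("No such file or directory")

def stop_context():
    try_log_non_fatal_error(stop_ui, "stopping the web UI")
    log.info("Successfully stopped SparkContext")

stop_context()  # logs the WARN, then still reaches the final INFO line
```

So the WARN itself does not prevent the context from stopping; the question in this thread is where the `IOException` comes from in the first place.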


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
