Github user JoshRosen commented on a diff in the pull request:
https://github.com/apache/spark/pull/20179#discussion_r160313840
--- Diff: core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala
---
@@ -376,18 +374,13 @@ private[netty] class NettyRpcEnv(
def setError(e: Throwable): Unit = {
error = e
- source.close()
}
override def read(dst: ByteBuffer): Int = {
--- End diff ---
Yes. This currently happens in two places:
- In Utils.doFetchFile():
https://github.com/apache/spark/blob/28315714ddef3ddcc192375e98dd5207cf4ecc98/core/src/main/scala/org/apache/spark/util/Utils.scala#L661:
the stream gets passed a couple of layers down to a
`Utils.copyStream(closeStreams = true)` call which is guaranteed to clean up
the stream.
- In ExecutorClassLoader, where we construct the stream in `fetchFn` from
`getClassFileInputStreamFromRpc` and then close it in a `finally` block in
`findClassLocally`:
https://github.com/apache/spark/blob/e08d06b37bc96cc48fec1c5e40f73e0bca09c616/repl/src/main/scala/org/apache/spark/repl/ExecutorClassLoader.scala#L167
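The second pattern above, the caller that obtains the stream closing it in a `finally` block, can be sketched roughly as follows. This is a simplified illustration, not the actual Spark code: `readClassBytes` and its `open` parameter are hypothetical names, and the read loop stands in for what `Utils.copyStream` does.

```scala
import java.io.{ByteArrayOutputStream, InputStream}

object StreamCleanupSketch {
  // Open a stream, read it fully, and close it in `finally` so cleanup
  // happens even if reading throws -- mirroring findClassLocally's shape.
  def readClassBytes(open: () => InputStream): Array[Byte] = {
    var in: InputStream = null
    try {
      in = open()
      val out = new ByteArrayOutputStream()
      val chunk = new Array[Byte](8192)
      var n = in.read(chunk)
      while (n != -1) {
        out.write(chunk, 0, n)
        n = in.read(chunk)
      }
      out.toByteArray
    } finally {
      // Guaranteed cleanup on both the success and failure paths.
      if (in != null) in.close()
    }
  }
}
```

With either pattern the stream's lifecycle is owned by exactly one caller, which is why the extra `source.close()` removed in the diff was redundant.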
---