HeartSaVioR edited a comment on pull request #33682:
URL: https://github.com/apache/spark/pull/33682#issuecomment-917839915


   Hello,
   
   `JavaUtils.closeQuietly` behaves slightly differently from `IOUtils.closeQuietly` by default. Both swallow the IOException, but the former logs it at ERROR while the latter doesn't log at all. It is arguable which behavior is the right one, but there are cases where we know an IOException can happen and it is no big deal, such as cancelling the delta/metadata file in CancellableFSDataOutputStream, where we don't want to produce an error log message.
   
   ```
   [info] 12:24:05.252 ERROR org.apache.spark.network.util.JavaUtils: IOException should not have been thrown.
   [info] java.nio.channels.ClosedChannelException
   [info]       at org.apache.hadoop.fs.ChecksumFs$ChecksumFSOutputSummer.checkClosed(ChecksumFs.java:390)
   [info]       at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:106)
   [info]       at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:62)
   [info]       at java.io.DataOutputStream.write(DataOutputStream.java:107)
   [info]       at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:62)
   [info]       at java.io.DataOutputStream.write(DataOutputStream.java:107)
   [info]       at com.github.luben.zstd.ZstdOutputStreamNoFinalizer.flush(ZstdOutputStreamNoFinalizer.java:249)
   [info]       at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:141)
   [info]       at java.io.DataOutputStream.flush(DataOutputStream.java:123)
   [info]       at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
   [info]       at org.apache.spark.network.util.JavaUtils.closeQuietly(JavaUtils.java:53)
   [info]       at org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.org$apache$spark$sql$execution$streaming$state$HDFSBackedStateStoreProvider$$cancelDeltaFile(HDFSBackedStateStoreProvider.scala:545)
   [info]       at org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$HDFSBackedStateStore.abort(HDFSBackedStateStoreProvider.scala:158)
   [info]       at org.apache.spark.sql.execution.benchmark.StateStoreBenchmark$.$anonfun$runBenchmarkSuite$3(StateStoreBenchmark.scala:95)
   [info]       at org.apache.spark.sql.execution.benchmark.StateStoreBenchmark$.$anonfun$runBenchmarkSuite$3$adapted(StateStoreBenchmark.scala:84)
   [info]       at org.apache.spark.benchmark.Benchmark.measure(Benchmark.scala:149)
   [info]       at org.apache.spark.benchmark.Benchmark.$anonfun$run$1(Benchmark.scala:106)
   [info]       at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
   [info]       at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
   [info]       at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
   [info]       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
   [info]       at scala.collection.TraversableLike.map(TraversableLike.scala:286)
   [info]       at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
   [info]       at scala.collection.AbstractTraversable.map(Traversable.scala:108)
   [info]       at org.apache.spark.benchmark.Benchmark.run(Benchmark.scala:104)
   [info]       at org.apache.spark.sql.execution.benchmark.StateStoreBenchmark$.$anonfun$runBenchmarkSuite$1(StateStoreBenchmark.scala:112)
   [info]       at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
   [info]       at org.apache.spark.benchmark.BenchmarkBase.runBenchmark(BenchmarkBase.scala:42)
   [info]       at org.apache.spark.sql.execution.benchmark.StateStoreBenchmark$.runBenchmarkSuite(StateStoreBenchmark.scala:58)
   [info]       at org.apache.spark.benchmark.BenchmarkBase.main(BenchmarkBase.scala:72)
   [info]       at org.apache.spark.sql.execution.benchmark.StateStoreBenchmark.main(StateStoreBenchmark.scala)
   [info]       Suppressed: java.nio.channels.ClosedChannelException
   [info]               at org.apache.hadoop.fs.ChecksumFs$ChecksumFSOutputSummer.checkClosed(ChecksumFs.java:390)
   [info]               at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:106)
   [info]               at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:62)
   [info]               at java.io.DataOutputStream.write(DataOutputStream.java:107)
   [info]               at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:62)
   [info]               at java.io.DataOutputStream.write(DataOutputStream.java:107)
   [info]               at com.github.luben.zstd.ZstdOutputStreamNoFinalizer.flush(ZstdOutputStreamNoFinalizer.java:249)
   [info]               at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:141)
   [info]               at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
   [info]               at java.io.FilterOutputStream.close(FilterOutputStream.java:159)
   [info]               ... 21 more
   [info]               Suppressed: java.nio.channels.ClosedChannelException
   [info]                       at org.apache.hadoop.fs.ChecksumFs$ChecksumFSOutputSummer.checkClosed(ChecksumFs.java:390)
   [info]                       at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:106)
   [info]                       at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:62)
   [info]                       at java.io.DataOutputStream.write(DataOutputStream.java:107)
   [info]                       at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:62)
   [info]                       at java.io.DataOutputStream.write(DataOutputStream.java:107)
   [info]                       at com.github.luben.zstd.ZstdOutputStreamNoFinalizer.close(ZstdOutputStreamNoFinalizer.java:279)
   [info]                       at com.github.luben.zstd.ZstdOutputStreamNoFinalizer.close(ZstdOutputStreamNoFinalizer.java:258)
   [info]                       at java.io.FilterOutputStream.close(FilterOutputStream.java:159)
   [info]                       ... 22 more
   ```
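
   To illustrate the difference, here is a rough sketch of the two default behaviors (not the actual implementations; the logger is simplified to `System.err`):

   ```java
   import java.io.Closeable;
   import java.io.IOException;

   final class CloseQuietlySketch {
     // IOUtils.closeQuietly-style default: swallow the IOException silently.
     static void closeQuietlyNoLog(Closeable closeable) {
       if (closeable == null) return;
       try {
         closeable.close();
       } catch (IOException ignored) {
         // intentionally ignored: the caller knows close() may fail here
       }
     }

     // JavaUtils.closeQuietly-style default: swallow the IOException but log it,
     // which is what produces the ERROR and stack trace shown above.
     static void closeQuietlyWithErrorLog(Closeable closeable) {
       if (closeable == null) return;
       try {
         closeable.close();
       } catch (IOException e) {
         System.err.println("IOException should not have been thrown."); // stand-in for logger.error(...)
         e.printStackTrace();
       }
     }
   }
   ```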
   
   Furthermore, it looks like the commons-io community decided to retain the method and removed the deprecated annotation:

   https://github.com/apache/commons-io/blob/rel/commons-io-2.8.0/src/main/java/org/apache/commons/io/IOUtils.java#L399-L402

   https://github.com/apache/commons-io/blob/75f20dca72656225d0dc8e7c982e40caa9277d42/src/main/java/org/apache/commons/io/IOUtils.java#L465-L467
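
   So a caller like `cancelDeltaFile` could rely on the silent default and only opt into logging where it is actually useful. A minimal usage sketch, assuming commons-io 2.7+ on the classpath (where, as far as I know, the `Consumer` overload was added):

   ```java
   import java.io.IOException;
   import java.io.OutputStream;
   import org.apache.commons.io.IOUtils;

   class QuietCloseUsage {
     static void cancel(OutputStream out) {
       // Default: a failure in close() is swallowed with no log output,
       // which is what we want when cancelling a delta/metadata file.
       IOUtils.closeQuietly(out);
     }

     static void cancelWithLogging(OutputStream out) {
       // Opt-in: the caller decides how (and at what level) to report the IOException.
       IOUtils.closeQuietly(out, (IOException e) -> System.err.println("close failed: " + e));
     }
   }
   ```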

