Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16451#discussion_r94932681
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1485,17 +1485,18 @@ private[spark] object Utils extends Logging {
       /** Return uncompressed file length of a compressed file. */
       private def getCompressedFileLength(file: File): Long = {
         try {
    -      // Uncompress .gz file to determine file size.
    -      var fileSize = 0L
    -      val gzInputStream = new GZIPInputStream(new FileInputStream(file))
    -      val bufSize = 1024
    -      val buf = new Array[Byte](bufSize)
    -      var numBytes = ByteStreams.read(gzInputStream, buf, 0, bufSize)
    -      while (numBytes > 0) {
    -        fileSize += numBytes
    -        numBytes = ByteStreams.read(gzInputStream, buf, 0, bufSize)
     +      tryWithResource(new GZIPInputStream(new FileInputStream(file))) { gzInputStream =>
    --- End diff ---
    
    I'm OK with this. I suppose we now have nested try statements here, where it could be written as one. Really, the catch clause below isn't that helpful: it logs an error when the error isn't necessarily fatal, but then rethrows the exception anyway. You could even dispense with the existing try-catch here, IMHO.
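    
    For reference, a minimal sketch of the single-try shape being suggested, assuming tryWithResource and Guava's ByteStreams are in scope as they are in Utils.scala (imports shown only for completeness); the outer try-catch is dropped so any exception simply propagates to the caller:
    
        import java.io.{File, FileInputStream}
        import java.util.zip.GZIPInputStream
    
        import com.google.common.io.ByteStreams
    
        /** Return uncompressed file length of a compressed file. */
        private def getCompressedFileLength(file: File): Long = {
          // Uncompress the .gz file to determine its size; tryWithResource closes the stream.
          tryWithResource(new GZIPInputStream(new FileInputStream(file))) { gzInputStream =>
            var fileSize = 0L
            val bufSize = 1024
            val buf = new Array[Byte](bufSize)
            var numBytes = ByteStreams.read(gzInputStream, buf, 0, bufSize)
            while (numBytes > 0) {
              fileSize += numBytes
              numBytes = ByteStreams.read(gzInputStream, buf, 0, bufSize)
            }
            fileSize
          }
        }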

