[ 
https://issues.apache.org/jira/browse/SPARK-27772?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

William Wong updated SPARK-27772:
---------------------------------
    Description: 
The current `SQLTestUtils` defines many `withXXX` utility functions to clean up 
tables/views/caches created for testing purposes. Some of those `withXXX` 
functions ignore certain exceptions, such as `NoSuchTableException`, in the 
cleanup block (i.e., the finally block). 

 
{code:java}
/**
 * Drops temporary view `viewNames` after calling `f`.
 */
protected def withTempView(viewNames: String*)(f: => Unit): Unit = {
  try f finally {
    // If the test failed part way, we don't want to mask the failure by
    // failing to remove temp views that never got created.
    try viewNames.foreach(spark.catalog.dropTempView) catch {
      case _: NoSuchTableException =>
    }
  }
}
{code}
This may not be the best approach, because it is hard to anticipate which 
exceptions should or should not be ignored. Java's `try-with-resources` 
statement does not mask an exception thrown in the try block with an exception 
caught in the `close()` method; the exception from `close()` is added as a 
suppressed exception instead. IMHO, that is a better approach.  
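For illustration only (plain Scala, independent of Spark, with hypothetical exception messages), the suppressed-exception mechanism works like this:

```scala
// A cleanup failure attached via addSuppressed does not replace the
// primary exception; it travels along with it.
val primary = new RuntimeException("test body failed")
val cleanupFailure = new IllegalStateException("cleanup failed")
primary.addSuppressed(cleanupFailure)

// The primary exception keeps its identity and message...
assert(primary.getMessage == "test body failed")
// ...and the cleanup failure is still visible for diagnosis.
assert(primary.getSuppressed.length == 1)
assert(primary.getSuppressed()(0).getMessage == "cleanup failed")
```

A stack trace printed for `primary` will show the cleanup failure under a "Suppressed:" heading, which is exactly the diagnosability this proposal is after.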

 

Therefore, I propose to standardise those `withXXX` functions on the following 
`withFinallyBlock` function, which behaves similarly to Java's 
try-with-resources statement. 
{code:java}
/**
 * Drops temporary view `viewNames` after calling `f`.
 */
protected def withTempView(viewNames: String*)(f: => Unit): Unit = {
  withFinallyBlock(f)(viewNames.foreach(spark.catalog.dropTempView))
}


/**
 * Executes the given tryBlock and then the given finallyBlock no matter
 * whether tryBlock throws an exception. If both tryBlock and finallyBlock
 * throw exceptions, the exception thrown from finallyBlock will be added to
 * the exception thrown from tryBlock as a suppressed exception. This helps
 * avoid masking the exception from tryBlock with the exception from
 * finallyBlock.
 */
private def withFinallyBlock(tryBlock: => Unit)(finallyBlock: => Unit): Unit = {
  var fromTryBlock: Throwable = null
  try tryBlock catch {
    case cause: Throwable =>
      fromTryBlock = cause
      throw cause
  } finally {
    if (fromTryBlock != null) {
      try finallyBlock catch {
        case fromFinallyBlock: Throwable =>
          fromTryBlock.addSuppressed(fromFinallyBlock)
          throw fromTryBlock
      }
    } else {
      finallyBlock
    }
  }
}
{code}
 

If a feature is well written, we should not hit any exceptions in those cleanup 
methods during test cases. The purpose of this proposal is to help developers 
identify what may break in a test case. Masking the original exception with any 
exception (not just the missing-table exception) thrown from those cleanup 
methods is, I believe, not the best approach.
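As a standalone sanity check (the proposed helper copied here outside `SQLTestUtils`, exercised with hypothetical exception messages), both failures remain visible after a double fault:

```scala
// Standalone copy of the proposed helper, for demonstration only.
def withFinallyBlock(tryBlock: => Unit)(finallyBlock: => Unit): Unit = {
  var fromTryBlock: Throwable = null
  try tryBlock catch {
    case cause: Throwable =>
      fromTryBlock = cause
      throw cause
  } finally {
    if (fromTryBlock != null) {
      try finallyBlock catch {
        case fromFinallyBlock: Throwable =>
          fromTryBlock.addSuppressed(fromFinallyBlock)
          throw fromTryBlock
      }
    } else {
      finallyBlock
    }
  }
}

// Both the test body and the cleanup block fail.
val caught: Throwable =
  try {
    withFinallyBlock {
      throw new RuntimeException("body failed")
    } {
      throw new IllegalStateException("cleanup failed")
    }
    null
  } catch {
    case t: Throwable => t
  }

// The body's exception propagates; the cleanup's is attached, not swallowed.
assert(caught.getMessage == "body failed")
assert(caught.getSuppressed.length == 1)
assert(caught.getSuppressed()(0).getMessage == "cleanup failed")
```

Compare with the current `withTempView`: a cleanup failure other than `NoSuchTableException` would replace the body's exception entirely, hiding the real test failure.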



> SQLTestUtils Refactoring
> ------------------------
>
>                 Key: SPARK-27772
>                 URL: https://issues.apache.org/jira/browse/SPARK-27772
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: William Wong
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
