szehon-ho commented on code in PR #49840:
URL: https://github.com/apache/spark/pull/49840#discussion_r1945607596


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##########
@@ -320,6 +331,67 @@ object ResolveDefaultColumns extends QueryErrorsBase
     coerceDefaultValue(analyzed, dataType, statementType, colName, defaultSQL)
   }
 
+  /**
+   * Analyzes an EXISTS_DEFAULT value. This skips some analysis steps, since
+   * most of the analysis has already been done.
+   *
+   * VisibleForTesting
+   */
+  def analyzeExistingDefault(field: StructField,

Review Comment:
   This is a simpler version of analyze. We may have an opportunity to simplify 
it further, but for now it seems some part of the analysis is still needed to 
resolve functions like array(). The problematic FinishAnalysis code was 
removed, though.
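
   A self-contained toy sketch (Literal/UnresolvedFunction here are stand-ins, 
not Spark's real Catalyst nodes) of why some analysis is still required: a 
stored default such as `42` is already a resolved literal, but one such as 
`array()` still contains an unresolved function call that only an analyzer 
pass can resolve.

```scala
// Toy model of expression resolution. The class names mirror Catalyst's
// naming only loosely; this is an illustration, not Spark code.
sealed trait Expr { def resolved: Boolean }
case class Literal(value: Any) extends Expr { val resolved = true }
case class UnresolvedFunction(name: String, args: Seq[Expr]) extends Expr {
  val resolved = false
}
case class ResolvedFunction(name: String, args: Seq[Expr]) extends Expr {
  val resolved: Boolean = args.forall(_.resolved)
}

// A minimal "analyzer" pass that resolves function calls. A real analyzer
// would consult a function registry; here every name is accepted.
def resolveFunctions(e: Expr): Expr = e match {
  case UnresolvedFunction(n, args) =>
    ResolvedFunction(n, args.map(resolveFunctions))
  case other => other
}
```

   A literal default needs no pass at all, while `array()` does:

```scala
Literal(42).resolved                                 // already resolved
UnresolvedFunction("array", Nil).resolved            // false: needs analysis
resolveFunctions(UnresolvedFunction("array", Nil))   // resolved afterwards
```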



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##########
@@ -52,6 +52,17 @@ object ResolveDefaultColumns extends QueryErrorsBase
   // CURRENT_DEFAULT_COLUMN_METADATA.
   val CURRENT_DEFAULT_COLUMN_NAME = "DEFAULT"
 
+  var defaultColumnAnalyzer: Analyzer = DefaultColumnAnalyzer
+  var defaultColumnOptimizer: Optimizer = DefaultColumnOptimizer
+
+  /**
+   * Visible for testing
+   */
+  def setAnalyzerAndOptimizer(analyzer: Analyzer, optimizer: Optimizer): Unit = {

Review Comment:
   It's hard to reproduce the issue in a unit test, so I ended up mocking these 
members to verify that the catalogs are not called.
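
   A hedged, self-contained sketch of the mocking pattern described here. The 
Analyzer/Optimizer traits and ResolveDefaultColumns object below are toy 
stand-ins mirroring the PR's names, not Spark's real classes; the spy records 
whether it was invoked, standing in for a check that no catalog lookup occurs.

```scala
// Toy stand-ins for the analyzer/optimizer interfaces (not Spark's classes).
trait Analyzer { def execute(plan: String): String }
trait Optimizer { def execute(plan: String): String }

object DefaultColumnAnalyzer extends Analyzer {
  def execute(plan: String): String = s"analyzed($plan)"
}
object DefaultColumnOptimizer extends Optimizer {
  def execute(plan: String): String = s"optimized($plan)"
}

object ResolveDefaultColumns {
  // Mutable members so a test can swap in instrumented stand-ins.
  var defaultColumnAnalyzer: Analyzer = DefaultColumnAnalyzer
  var defaultColumnOptimizer: Optimizer = DefaultColumnOptimizer

  // Visible for testing.
  def setAnalyzerAndOptimizer(analyzer: Analyzer, optimizer: Optimizer): Unit = {
    defaultColumnAnalyzer = analyzer
    defaultColumnOptimizer = optimizer
  }

  def analyzeExistingDefault(expr: String): String =
    defaultColumnOptimizer.execute(defaultColumnAnalyzer.execute(expr))
}

// A spy analyzer that records whether it was called.
class SpyAnalyzer extends Analyzer {
  var called = false
  def execute(plan: String): String = { called = true; s"spy-analyzed($plan)" }
}
```

   A test then installs the spy and asserts on what was (or was not) invoked:

```scala
val spy = new SpyAnalyzer
ResolveDefaultColumns.setAnalyzerAndOptimizer(spy, DefaultColumnOptimizer)
val out = ResolveDefaultColumns.analyzeExistingDefault("array()")
// spy.called is now true; out == "optimized(spy-analyzed(array()))"
```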



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

