ryan-johnson-databricks commented on code in PR #40732:
URL: https://github.com/apache/spark/pull/40732#discussion_r1165899535


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveDefaultColumns.scala:
##########
@@ -47,9 +47,11 @@ import org.apache.spark.sql.types._
  * (1, 5)
  * (4, 6)
  *
- * @param catalog  the catalog to use for looking up the schema of INSERT INTO table objects.
+ * @param resolveRelation function to resolve relations from the catalog. This should generally map
+ *                        to the 'resolveRelationOrTempView' method of the ResolveRelations rule.
  */
-case class ResolveDefaultColumns(catalog: SessionCatalog) extends Rule[LogicalPlan] {
+case class ResolveDefaultColumns(
+    resolveRelation: UnresolvedRelation => LogicalPlan) extends Rule[LogicalPlan] {

Review Comment:
   Why can't `ResolveDefaultColumns` just call `ResolveRelations.resolveRelationOrTempView` directly and be done with it? It's a public method, after all... and then the unit test wouldn't need to pass `null` anymore (which is already quite weird).
   
   Do we ever anticipate a need to use a different resolver instead?
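   To make the trade-off concrete, here is a minimal, self-contained sketch of the two designs. All types here (`LogicalPlan`, `UnresolvedRelation`, `Resolver`, the two rule classes) are hypothetical stand-ins for illustration, not Spark's actual Catalyst classes:

   ```scala
   // Hypothetical stand-ins for Catalyst's plan types (illustration only).
   trait LogicalPlan
   case class UnresolvedRelation(name: String) extends LogicalPlan
   case class ResolvedRelation(name: String) extends LogicalPlan

   // Style 1 (what this PR does): the resolver is injected as a constructor
   // parameter, so a unit test can substitute any function -- or pass null.
   case class InjectedRule(resolveRelation: UnresolvedRelation => LogicalPlan) {
     def apply(u: UnresolvedRelation): LogicalPlan = resolveRelation(u)
   }

   // Style 2 (what this comment suggests): the rule calls a fixed public
   // method directly, so there is nothing to inject and no null in tests.
   object Resolver {
     def resolveRelationOrTempView(u: UnresolvedRelation): LogicalPlan =
       ResolvedRelation(u.name)
   }
   case class DirectRule() {
     def apply(u: UnresolvedRelation): LogicalPlan =
       Resolver.resolveRelationOrTempView(u)
   }
   ```

   Both styles produce the same resolved plan; the injected form only pays off if a different resolver is ever actually needed.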



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

