imback82 commented on a change in pull request #32447:
URL: https://github.com/apache/spark/pull/32447#discussion_r627928708
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/Command.scala
##########
@@ -46,5 +47,6 @@ trait AnalysisOnlyCommand extends Command {
val isAnalyzed: Boolean
def childrenToAnalyze: Seq[LogicalPlan]
override final def children: Seq[LogicalPlan] = if (isAnalyzed) Nil else childrenToAnalyze
+ override def innerChildren: Seq[QueryPlan[_]] = if (isAnalyzed) childrenToAnalyze else Nil
Review comment:
There is a change, but I think it's for the better:
Before:
```
== Parsed Logical Plan ==
'CacheTableAsSelect tempTable, SELECT key FROM testData, false, false
+- 'Project ['key]
   +- 'UnresolvedRelation [testData], [], false
== Analyzed Logical Plan ==
CacheTableAsSelect tempTable, Project [key#13], SELECT key FROM testData, false, true
== Optimized Logical Plan ==
CacheTableAsSelect tempTable, Project [key#13], SELECT key FROM testData, false, true
== Physical Plan ==
CacheTableAsSelect tempTable, Project [key#13], SELECT key FROM testData, false
```
New:
```
== Parsed Logical Plan ==
'CacheTableAsSelect tempTable, SELECT key FROM testData, false, false
+- 'Project ['key]
   +- 'UnresolvedRelation [testData], [], false
== Analyzed Logical Plan ==
CacheTableAsSelect tempTable, SELECT key FROM testData, false, true
+- Project [key#13]
   +- SubqueryAlias testdata
      +- View (`testData`, [key#13,value#14])
         +- SerializeFromObject [knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).key AS key#13, staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).value, true, false) AS value#14]
            +- ExternalRDD [obj#12]
== Optimized Logical Plan ==
CacheTableAsSelect tempTable, SELECT key FROM testData, false, true
+- Project [key#13]
   +- SubqueryAlias testdata
      +- View (`testData`, [key#13,value#14])
         +- SerializeFromObject [knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).key AS key#13, staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).value, true, false) AS value#14]
            +- ExternalRDD [obj#12]
== Physical Plan ==
CacheTableAsSelect tempTable, Project [key#13], SELECT key FROM testData, false
```
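For context, here is a minimal, Spark-free sketch of why the subtree stays visible once it moves from `children` to `innerChildren`. The `Node`, `Leaf`, `ToyCacheCommand`, and `treeString` names are made up for illustration and only roughly mimic `TreeNode`/`AnalysisOnlyCommand`; they are not the real APIs:
```scala
// Minimal sketch (hypothetical Node/ToyCacheCommand types, not Spark's TreeNode API):
// `children` are what the analyzer/optimizer recurse into, while `innerChildren`
// are only used when rendering the tree string.
sealed trait Node {
  def name: String
  def children: Seq[Node] = Nil
  def innerChildren: Seq[Node] = Nil

  // Rough stand-in for TreeNode.treeString: print this node, then every
  // child and inner child one level deeper with a "+- " marker.
  def treeString(depth: Int = 0): String = {
    val prefix = if (depth == 0) "" else ("   " * (depth - 1)) + "+- "
    ((prefix + name) +: (children ++ innerChildren).map(_.treeString(depth + 1)))
      .mkString("\n")
  }
}

final case class Leaf(name: String) extends Node

// Toy analysis-only command: before analysis the query is a real child (so the
// analyzer resolves it); afterwards it is only an inner child (so optimizer and
// planner skip it), yet both versions still render the subtree.
final case class ToyCacheCommand(query: Node, isAnalyzed: Boolean) extends Node {
  override def name: String = s"ToyCacheCommand isAnalyzed=$isAnalyzed"
  override def children: Seq[Node] = if (isAnalyzed) Nil else Seq(query)
  override def innerChildren: Seq[Node] = if (isAnalyzed) Seq(query) else Nil
}

object InnerChildrenDemo extends App {
  val query = Leaf("Project [key]")
  // Both print the subtree; without the innerChildren override the analyzed
  // version would show only the bare command line.
  println(ToyCacheCommand(query, isAnalyzed = false).treeString())
  println(ToyCacheCommand(query, isAnalyzed = true).treeString())
}
```
The point is just that `children` drive rule application while `innerChildren` only affect rendering, so after analysis the query is no longer rewritten but still shows up in the explain output above.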
WDYT?