Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/22566#discussion_r220978815
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/AnalyzeColumnCommand.scala ---
@@ -50,7 +52,26 @@ case class AnalyzeColumnCommand(
val sizeInBytes = CommandUtils.calculateTotalSize(sparkSession, tableMeta)
// Compute stats for each column
- val (rowCount, newColStats) = computeColumnStats(sparkSession, tableIdentWithDB, columnNames)
+ val conf = sparkSession.sessionState.conf
+ val relation = sparkSession.table(tableIdent).logicalPlan
+ val attributesToAnalyze = if (allColumns) {
+ relation.output
--- End diff ---
Are we still able to create a table with zero columns, for example, using DataFrameWriter?
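
Something like the following minimal sketch is what I have in mind (assuming a SparkSession named `spark`; the table name is made up):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.StructType

// Build a DataFrame with an empty schema, i.e. zero columns.
val zeroColDF = spark.createDataFrame(
  spark.sparkContext.emptyRDD[Row],
  StructType(Nil))

// If this write still succeeds, ANALYZE ... FOR ALL COLUMNS would then
// see a relation whose output is empty.
zeroColDF.write.saveAsTable("zero_col_table")
```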
---