rdblue commented on a change in pull request #24570: [SPARK-24923][SQL] Implement v2 CreateTableAsSelect
URL: https://github.com/apache/spark/pull/24570#discussion_r283891802
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceResolution.scala
##########
@@ -46,14 +55,22 @@ case class DataSourceResolution(conf: SQLConf) extends Rule[LogicalPlan] with Ca
       CreateTable(tableDesc, mode, None)

     case CreateTableAsSelectStatement(
-        table, query, partitionCols, bucketSpec, properties, V1WriteProvider(provider), options,
-        location, comment, ifNotExists) =>
+        AsTableIdentifier(table), query, partitionCols, bucketSpec, properties,
+        V1WriteProvider(provider), options, location, comment, ifNotExists) =>
       val tableDesc = buildCatalogTable(table, new StructType, partitionCols, bucketSpec,
         properties, provider, options, location, comment, ifNotExists)
       val mode = if (ifNotExists) SaveMode.Ignore else SaveMode.ErrorIfExists
       CreateTable(tableDesc, mode, Some(query))
+
+    case create: CreateTableAsSelectStatement =>
+      // the provider was not a v1 source, convert to a v2 plan
+      val CatalogObjectIdentifier(maybeCatalog, identifier) = create.tableName
+      val catalog = maybeCatalog
+        .getOrElse(throw new AnalysisException("Default catalog is not set"))
Review comment:
No, this is the correct error message. When no explicit catalog is used, Spark should fall back to the default catalog. If no catalog is available at that point, the problem is that no default was set.
This is why the default catalog was originally part of this PR: the error messages depend on it.
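The fallback being discussed can be sketched as follows. This is a minimal illustration, not Spark's actual code: the object and method names (`CatalogFallbackSketch`, `resolveCatalog`) are hypothetical, plain `String`s stand in for Spark's catalog objects, and `IllegalStateException` stands in for Spark's `AnalysisException`.

```scala
// Sketch of the catalog-fallback behavior under discussion: use the
// catalog named explicitly in the identifier if there is one, otherwise
// fall back to the configured default, and fail when neither exists.
object CatalogFallbackSketch {
  // `explicitCatalog` stands in for the catalog extracted by
  // CatalogObjectIdentifier; `defaultCatalog` for the session default.
  def resolveCatalog(
      explicitCatalog: Option[String],
      defaultCatalog: Option[String]): String = {
    explicitCatalog
      .orElse(defaultCatalog)
      // Spark raises AnalysisException("Default catalog is not set") here.
      .getOrElse(throw new IllegalStateException("Default catalog is not set"))
  }
}
```

An explicit catalog always wins; the default is consulted only when the identifier names none, which is why the error can only mean the default was never configured.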
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services