rdblue commented on a change in pull request #25330: [SPARK-28565][SQL] DataFrameWriter saveAsTable support for V2 catalogs
URL: https://github.com/apache/spark/pull/25330#discussion_r312296215
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala
 ##########
 @@ -485,7 +490,71 @@ final class DataFrameWriter[T] private[sql](ds: Dataset[T]) {
    * @since 1.4.0
    */
   def saveAsTable(tableName: String): Unit = {
-    saveAsTable(df.sparkSession.sessionState.sqlParser.parseTableIdentifier(tableName))
+    import df.sparkSession.sessionState.analyzer.{AsTableIdentifier, CatalogObjectIdentifier}
+    import org.apache.spark.sql.catalog.v2.CatalogV2Implicits._
+
+    val session = df.sparkSession
+
+    session.sessionState.sqlParser.parseMultipartIdentifier(tableName) match {
+      case CatalogObjectIdentifier(Some(catalog), ident) =>
+        saveAsTable(catalog.asTableCatalog, ident, modeForDSV2)
+
+      case AsTableIdentifier(tableIdentifier) =>
+        saveAsTable(tableIdentifier)
+
+      case other =>
+        // TODO(SPARK-28666): This should go through V2SessionCatalog
+        throw new UnsupportedOperationException(
+          s"Couldn't find a catalog to handle the identifier ${other.quoted}.")
 
 Review comment:
   I think this is an analysis error. The catalog was None, so the identifier belongs to the session catalog. But the session catalog doesn't support namespaces with more than one part, so it is an invalid identifier for that catalog.
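
   For concreteness, a minimal sketch of what this suggestion could look like in the `case other` branch (an assumption about the eventual fix, not the PR's final code; the message wording is invented, and `other.quoted` relies on the `CatalogV2Implicits` import already in scope):

   ```scala
       case other =>
         // Sketch: the catalog resolved to None, so the identifier falls to the
         // session catalog. That catalog only accepts single-part namespaces,
         // which makes this an analysis error, not an unsupported operation.
         throw new AnalysisException(
           s"${other.quoted} is not a valid identifier for the session catalog " +
             "because its namespace has more than one part.")
   ```

   `AnalysisException` is what DataFrameWriter already throws for other analysis-time failures, so raising it here would keep the error category consistent.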
