rdblue commented on a change in pull request #1890:
URL: https://github.com/apache/iceberg/pull/1890#discussion_r538766764



##########
File path: spark3/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -605,31 +605,41 @@ private static String sqlString(org.apache.iceberg.expressions.Literal<?> lit) {
   }
 
   public static CatalogAndIdentifier catalogAndIdentifier(SparkSession spark, String name) throws ParseException {
+    return catalogAndIdentifier(spark, name, spark.sessionState().catalogManager().currentCatalog());
+  }
+
+  public static CatalogAndIdentifier catalogAndIdentifier(SparkSession spark, String name,
+                                                          CatalogPlugin defaultCatalog) throws ParseException {
     ParserInterface parser = spark.sessionState().sqlParser();
     Seq<String> multiPartIdentifier = parser.parseMultipartIdentifier(name);
     List<String> javaMultiPartIdentifier = JavaConverters.seqAsJavaList(multiPartIdentifier);
-    return catalogAndIdentifier(spark, javaMultiPartIdentifier);
+    return catalogAndIdentifier(spark, javaMultiPartIdentifier, defaultCatalog);
+  }
+
+  public static CatalogAndIdentifier catalogAndIdentifier(SparkSession spark, List<String> nameParts) {
+    return catalogAndIdentifier(spark, nameParts, spark.sessionState().catalogManager().currentCatalog());
   }
 
   /**
    * A modified version of Spark's LookupCatalog.CatalogAndIdentifier.unapply
    * Attempts to find the catalog and identifier a multipart identifier represents
    * @param spark Spark session to use for resolution
    * @param nameParts Multipart identifier representing a table
+   * @param fallBackCatalog Catalog to use if none is specified
    * @return The CatalogPlugin and Identifier for the table
    */
-  public static CatalogAndIdentifier catalogAndIdentifier(SparkSession spark, List<String> nameParts) {
+  public static CatalogAndIdentifier catalogAndIdentifier(SparkSession spark, List<String> nameParts,
+                                                          CatalogPlugin fallBackCatalog) {
     Preconditions.checkArgument(!nameParts.isEmpty(),
         "Cannot determine catalog and Identifier from empty name parts");
     CatalogManager catalogManager = spark.sessionState().catalogManager();
-    CatalogPlugin currentCatalog = catalogManager.currentCatalog();
     String[] currentNamespace = catalogManager.currentNamespace();

Review comment:
   This namespace is associated with a catalog. I don't think we should pair it with the catalog passed in unless that catalog is the current catalog. I would do the following:
   
   1. If the fallback catalog is the current catalog, use the current namespace
   2. If the fallback catalog is not the current catalog, use its default namespace (`catalog.defaultNamespace()`); see the sketch below
   
   The catalog's default namespace is what gets used when you switch to that catalog: it becomes the current namespace. Using the default fits with the idea that the fallback catalog acts as the current catalog in the context of the stored procedure.
   
   If this is difficult to implement, we can always go back to using the current catalog rather than the procedure catalog.
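   
   A minimal sketch of that selection, assuming the `catalogManager` and `fallBackCatalog` names from the diff above and that catalog identity can be compared with `equals`:
   
   ```java
   // Sketch: choose the namespace to resolve nameParts against.
   String[] currentNamespace;
   if (fallBackCatalog.equals(catalogManager.currentCatalog())) {
     // The fallback catalog is the session's current catalog: keep the session's current namespace.
     currentNamespace = catalogManager.currentNamespace();
   } else {
     // A different catalog: use that catalog's own default namespace.
     currentNamespace = fallBackCatalog.defaultNamespace();
   }
   ```
   
   The rest of the method could then keep resolving `nameParts` against `currentNamespace` as it does today.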



