[GitHub] [spark] jzhuge commented on a change in pull request #25372: [SPARK-28640][SQL] Only give warning when session catalog is not defined
jzhuge commented on a change in pull request #25372: [SPARK-28640][SQL] Only give warning when session catalog is not defined

URL: https://github.com/apache/spark/pull/25372#discussion_r312784188

## File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalog/v2/LookupCatalog.scala

```diff
@@ -62,6 +62,9 @@ trait LookupCatalog extends Logging {
     try {
       Some(lookupCatalog(SESSION_CATALOG_NAME))
     } catch {
+      case _: CatalogNotFoundException =>
+        logWarning("Session catalog is not defined")
+        None
```

Review comment:

@dongjoon-hyun Thanks for the review. Your command line is not the case I tried to fix in this PR; in your case, the stack trace is helpful. It seems that the current master has the session catalog defined by default, so here is the command line to reproduce my case:

```
$ bin/spark-shell --master 'local[*]' --conf spark.sql.catalog.session=
...
Spark context available as 'sc' (master = local[*], app id = local-1565588237201).
Spark session available as 'spark'.
...
scala> spark.sessionState.analyzer.sessionCatalog
...
2019-08-11 22:37:24,216 ERROR [main] hive.HiveSessionStateBuilder$$anon$1 (Logging.scala:logError(94)) - Cannot load v2 session catalog
org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'session':
    at org.apache.spark.sql.catalog.v2.Catalogs.load(Catalogs.java:81)
...
res0: Option[org.apache.spark.sql.catalog.v2.CatalogPlugin] = None
```

Here the stack trace does not add more information, and I am concerned that if any rule uses the session catalog, we will see this long stack trace again and again.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
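The pattern the diff applies can be sketched in isolation: a catalog lookup that would otherwise throw is wrapped so the caller gets `None` plus a one-line warning instead of a stack trace. This is a minimal standalone sketch, not Spark's actual code; `CatalogNotFoundException`, the `catalogs` registry, and the `Console.err` warning are stand-ins for the real Spark classes and `logWarning`.

```scala
// Hedged sketch of the try/catch-to-Option pattern from the diff above.
// All names here are illustrative stand-ins, not Spark's real API.

class CatalogNotFoundException(msg: String) extends Exception(msg)

object SessionCatalogLookup {
  val SESSION_CATALOG_NAME = "session"

  // Simulated registry: no "session" catalog plugin is configured,
  // mimicking `--conf spark.sql.catalog.session=` with an empty value.
  private val catalogs = Map.empty[String, AnyRef]

  def lookupCatalog(name: String): AnyRef =
    catalogs.getOrElse(
      name,
      throw new CatalogNotFoundException(s"Cannot find catalog plugin class for catalog '$name'"))

  // The change under review: swallow the exception, warn once, return None,
  // so analyzer rules probing for the session catalog do not dump a stack trace.
  def sessionCatalog: Option[AnyRef] =
    try {
      Some(lookupCatalog(SESSION_CATALOG_NAME))
    } catch {
      case _: CatalogNotFoundException =>
        Console.err.println("Session catalog is not defined")
        None
    }
}
```

With this shape, repeated calls from analyzer rules cost one warning line each rather than a full `SparkException` stack trace per lookup, which is the concern raised in the comment.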