cloud-fan commented on code in PR #36641:
URL: https://github.com/apache/spark/pull/36641#discussion_r897689496
##########
sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala:
##########
@@ -250,8 +251,18 @@ class CatalogImpl(sparkSession: SparkSession) extends Catalog {
    * table/view. This throws an `AnalysisException` when no `Table` can be found.
    */
   override def getTable(tableName: String): Table = {
-    val tableIdent = sparkSession.sessionState.sqlParser.parseTableIdentifier(tableName)
-    getTable(tableIdent.database.orNull, tableIdent.table)
+    // calling `sqlParser.parseTableIdentifier` to parse tableName. If it contains only table name
+    // and optionally contains a database name (thus a TableIdentifier), then that is used to get
+    // the table. Otherwise we try `sqlParser.parseMultipartIdentifier` to have a sequence of string
Review Comment:
TBH it's a bit weird that this Scala API does not respect the current catalog, while the SQL API does.
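The fallback the new comment describes (try the 1- or 2-part `TableIdentifier` form first, then fall back to a multi-part identifier) can be sketched without Spark. Note this is only an illustration under simplified assumptions: `parseTableIdentifier`/`parseMultipartIdentifier` below are hypothetical stand-ins that split on `.` and handle no quoting or escaping, not Spark's actual parser:

```scala
// Minimal sketch of the fallback discussed above; NOT Spark's parser.
// Assumes dot-separated identifiers with no backtick quoting.
object IdentifierFallback {
  // Stand-in for `parseTableIdentifier`: accepts only `table` or `db.table`.
  def parseTableIdentifier(name: String): Option[(Option[String], String)] =
    name.split('.') match {
      case Array(t)     => Some((None, t))
      case Array(db, t) => Some((Some(db), t))
      case _            => None // 3+ parts: not a TableIdentifier
    }

  // Stand-in for `parseMultipartIdentifier`: any number of name parts.
  def parseMultipartIdentifier(name: String): Seq[String] =
    name.split('.').toSeq

  // Prefer the 1/2-part form; otherwise fall back to the multi-part one.
  def resolve(name: String): Either[(Option[String], String), Seq[String]] =
    parseTableIdentifier(name) match {
      case Some(ident) => Left(ident)
      case None        => Right(parseMultipartIdentifier(name))
    }
}
```

This also makes the review comment concrete: the 2-part branch resolves against the session catalog's database, so a name like `cat.db.t` takes a different code path than it would in SQL, where the current catalog is consulted.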
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]