dongjoon-hyun commented on code in PR #37105:
URL: https://github.com/apache/spark/pull/37105#discussion_r925165334


##########
sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala:
##########
@@ -101,6 +101,9 @@ abstract class Catalog {
   /**
    * Returns a list of columns for the given table/view in the specified database.
    *
+   * This API does not support 3 layer namespace since 3.4.0. To use 3 layer namespace,

Review Comment:
   Just a question to @cloud-fan, because I agree with you that the naming is confusing.
   > 3 layer namespace is a bit confusing
   
   I've monitored many commits in the community.
   
   ```
   a2c1038031 [SPARK-39579][SQL][PYTHON][R] Make ListFunctions/getFunction/functionExists compatible with 3 layer namespace
   6e7a571532 [SPARK-39649][PYTHON] Make listDatabases / getDatabase / listColumns / refreshTable in PySpark support 3-layer-namespace
   cbb4e7da69 [SPARK-39646][SQL] Make setCurrentDatabase compatible with 3 layer namespace
   b0d297c6d1 [SPARK-39645][SQL] Make getDatabase and listDatabases compatible with 3 layer namespace
   8c02823b49 [SPARK-39583][SQL] Make RefreshTable be compatible with 3 layer namespace
   ed1a3402d2 [SPARK-39598][PYTHON] Make *cache*, *catalog* in the python side support 3-layer-namespace
   c1106fbe22 [SPARK-39597][PYTHON] Make GetTable, TableExists and DatabaseExists in the python side support 3-layer-namespace
   1f15f2c6ad [SPARK-39615][SQL] Make listColumns be compatible with 3 layer namespace
   b2d249b1aa [SPARK-39555][PYTHON] Make createTable and listTables in the python side support 3-layer-namespace
   ca5f7e6c35 [SPARK-39263][SQL] Make GetTable, TableExists and DatabaseExists be compatible with 3 layer namespace
   cb55efadea [SPARK-39236][SQL] Make CreateTable and ListTables be compatible with 3 layer namespace
   ```
   
   The naming also appears in comments, like this one:
   
   
https://github.com/apache/spark/blob/0cc96f76d8a4858aee09e1fa32658da3ae76d384/python/pyspark/sql/catalog.py#L464
   
   And even in function names, like this one:
   
   
https://github.com/apache/spark/blob/0cc96f76d8a4858aee09e1fa32658da3ae76d384/sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala#L424
   
   I believe we need a naming rule here: either promote the new naming, or deprecate it by preventing further usage. Which way do you prefer, @cloud-fan ?
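
   For readers outside these threads: "3 layer namespace" refers to fully qualified identifiers of the form `catalog.database.table`, as opposed to the older 2-part `database.table` form. A minimal illustrative sketch of the distinction (`parseIdentifier` is a hypothetical helper for this comment, not Spark's real parser, which also handles backtick-quoted parts):

```scala
// Hypothetical helper: split a possibly-qualified table name into
// its namespace parts. Spark's actual parser is more involved
// (e.g. it supports backtick-quoted identifiers containing dots).
def parseIdentifier(name: String): Seq[String] = name.split('.').toSeq

// 2-part form: database.table
assert(parseIdentifier("default.my_table") == Seq("default", "my_table"))

// 3-layer-namespace form: catalog.database.table
assert(parseIdentifier("spark_catalog.default.my_table") ==
  Seq("spark_catalog", "default", "my_table"))
```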



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

