[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-10 Thread jinxing64
Github user jinxing64 closed the pull request at:

https://github.com/apache/spark/pull/19086





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread jinxing64
Github user jinxing64 commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136747432
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -569,46 +569,51 @@ class SessionCatalog(
   /**
    * Rename a table.
    *
-   * If a database is specified in `oldName`, this will rename the table in that database.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
+   *
    * If no database is specified, this will first attempt to rename a temporary table with
-   * the same name, then, if that does not exist, rename the table in the current database.
+   * the same name, then, if that does not exist, current database will be used for locating
+   * `oldName` or `newName`.
--- End diff --

So should I leave the comment here unchanged?





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136747159
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala ---
@@ -418,6 +439,42 @@ abstract class SessionCatalogSuite extends AnalysisTest {
     }
   }
 
+  test("rename global temp view") {
+    withBasicCatalog { catalog =>
+      val globalTempTable = Range(1, 10, 2, 10)
+      catalog.createGlobalTempView("tbl1", globalTempTable, overrideIfExists = false)
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tbl1", "tbl2"))
+      // If database is not specified, global temp view will not be renamed
+      catalog.setCurrentDatabase("db1")
+      val thrown1 = intercept[AnalysisException] {
+        catalog.renameTable(TableIdentifier("tbl1"), TableIdentifier("tblone"))
+      }
+      assert(thrown1.getMessage.contains("Table or view 'tbl1' not found in database 'db1'"))
+      catalog.setCurrentDatabase("db2")
+      catalog.renameTable(TableIdentifier("tbl1"), TableIdentifier("tblone"))
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "tbl2"))
+      // Moving global temp view to another database is forbidden
+      val thrown2 = intercept[AnalysisException] {
+        catalog.renameTable(
+          TableIdentifier("tbl1", Some("global_temp")), TableIdentifier("tbl3", Some("db2")))
+      }
+      assert(thrown2.getMessage.contains("Cannot change database of table 'tbl1'"))
+      // Moving table from regular database to be a global temp view is forbidden
+      val thrown3 = intercept[AnalysisException] {
+        catalog.renameTable(
+          TableIdentifier("tbl2", Some("db2")), TableIdentifier("tbltwo", Some("global_temp")))
+      }
--- End diff --

 Normally, we put `.getMessage` here. Thus, we can make the next line 
shorter. 
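
For reference, a minimal sketch of that idiom, reusing the `catalog`, `TableIdentifier`, and `AnalysisException` names from the quoted test (not a verbatim excerpt of the patch):

```scala
// Chaining .getMessage onto intercept keeps the assertion on the next line short.
val message = intercept[AnalysisException] {
  catalog.renameTable(
    TableIdentifier("tbl1", Some("global_temp")), TableIdentifier("tbl3", Some("db2")))
}.getMessage
assert(message.contains("Cannot change database of table 'tbl1'"))
```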





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136745765
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -569,46 +569,51 @@ class SessionCatalog(
   /**
    * Rename a table.
    *
-   * If a database is specified in `oldName`, this will rename the table in that database.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
+   *
    * If no database is specified, this will first attempt to rename a temporary table with
-   * the same name, then, if that does not exist, rename the table in the current database.
+   * the same name, then, if that does not exist, current database will be used for locating
+   * `oldName` or `newName`.
--- End diff --

This is wrong, right? This is for `temp view`





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136745597
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -569,46 +569,51 @@ class SessionCatalog(
   /**
    * Rename a table.
    *
-   * If a database is specified in `oldName`, this will rename the table in that database.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
--- End diff --

`It will result` -> `it results`





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136725094
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -573,28 +573,29 @@ class SessionCatalog(
    * If no database is specified, this will first attempt to rename a temporary table with
    * the same name, then, if that does not exist, rename the table in the current database.
    *
-   * This assumes the database specified in `newName` matches the one in `oldName`.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(currentDb))
--- End diff --

Also add a comment to explain the behavior in the above function 
description. 





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136725012
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -573,28 +573,29 @@ class SessionCatalog(
    * If no database is specified, this will first attempt to rename a temporary table with
    * the same name, then, if that does not exist, rename the table in the current database.
    *
-   * This assumes the database specified in `newName` matches the one in `oldName`.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(currentDb))
 
     val oldTableName = formatTableName(oldName.table)
     val newTableName = formatTableName(newName.table)
-    if (db == globalTempViewManager.database) {
+    if (oldDb == globalTempViewManager.database || newDb == globalTempViewManager.database) {
+      if (oldDb != newDb) {
+        throw new AnalysisException(
+          s"Cannot change database of table '$oldTableName'")
+      }
       globalTempViewManager.rename(oldTableName, newTableName)
     } else {
-      requireDbExists(db)
       if (oldName.database.isDefined || !tempTables.contains(oldTableName)) {
--- End diff --

change `else { if` to `else if(`.
Also add a comment here 
> ` // when the old table is a regular table/view`
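
For illustration, a structural sketch of the suggested shape, using the identifiers from the diff above and the comments proposed in this thread (bodies elided where they stay unchanged; not a compilable excerpt):

```scala
// when either the old or the new table is a global temporary view
if (oldDb == globalTempViewManager.database || newDb == globalTempViewManager.database) {
  if (oldDb != newDb) {
    throw new AnalysisException(s"Cannot change database of table '$oldTableName'")
  }
  globalTempViewManager.rename(oldTableName, newTableName)
} else if (oldName.database.isDefined || !tempTables.contains(oldTableName)) {
  // when the old table is a regular table/view
  requireDbExists(oldDb)
  requireDbExists(newDb)
  requireTableExists(TableIdentifier(oldTableName, Some(oldDb)))
  requireTableNotExists(TableIdentifier(newTableName, Some(newDb)))
  validateName(newTableName)
  externalCatalog.renameTable(oldDb, oldTableName, newDb, newTableName)
} else {
  // when the old table is a non-global temporary view: rename it within tempTables,
  // as in the existing implementation
}
```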





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136724918
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala ---
@@ -418,6 +436,39 @@ abstract class SessionCatalogSuite extends AnalysisTest {
     }
   }
 
+  test("rename global temp table") {
+    withBasicCatalog { catalog =>
+      val globalTempTable = Range(1, 10, 2, 10)
+      catalog.createGlobalTempView("tbl1", globalTempTable, overrideIfExists = false)
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tbl1", "tbl2"))
+      // If database is not specified, global temp table will not be renamed
+      catalog.setCurrentDatabase("db1")
+      intercept[AnalysisException] {
+        catalog.renameTable(TableIdentifier("tbl1"), TableIdentifier("tblone"))
+      }
+      catalog.setCurrentDatabase("db2")
+      catalog.renameTable(TableIdentifier("tbl1"), TableIdentifier("tblone"))
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "tbl2"))
+      // Moving global temp table to another database is forbidden
+      intercept[AnalysisException] {
+        catalog.renameTable(
+          TableIdentifier("tbl1", Some("global_temp")), TableIdentifier("tbl3", Some("db2")))
+      }
+      // Moving table from database to be a global temp table is forbidden
+      intercept[AnalysisException] {
--- End diff --

Capture the error message and add an assert.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136724986
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala ---
@@ -357,25 +357,43 @@ abstract class SessionCatalogSuite extends AnalysisTest {
 
   test("rename table") {
     withBasicCatalog { catalog =>
+      catalog.setCurrentDatabase("db2")
       assert(catalog.externalCatalog.listTables("db2").toSet == Set("tbl1", "tbl2"))
       catalog.renameTable(TableIdentifier("tbl1", Some("db2")), TableIdentifier("tblone"))
       assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "tbl2"))
       catalog.renameTable(TableIdentifier("tbl2", Some("db2")), TableIdentifier("tbltwo"))
       assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "tbltwo"))
-      // Rename table without explicitly specifying database
+
+      // Current database will be used when rename table without explicitly specifying database
       catalog.setCurrentDatabase("db2")
       catalog.renameTable(TableIdentifier("tbltwo"), TableIdentifier("table_two"))
       assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "table_two"))
-      // Renaming "db2.tblone" to "db1.tblones" should fail because databases don't match
+
+      // Table will be moved across databases when its database is different from the destination.
+      catalog.renameTable(
+        TableIdentifier("table_two", None), TableIdentifier("tabletwo", Some("db1")))
+      assert(catalog.externalCatalog.listTables("db1").toSet == Set("tabletwo"))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone"))
+
+      catalog.renameTable(
+        TableIdentifier("tabletwo", Some("db1")), TableIdentifier("table_two", None))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "table_two"))
+
+      catalog.renameTable(
+        TableIdentifier("tblone", Some("db2")), TableIdentifier("tblone", Some("db1")))
+      assert(catalog.externalCatalog.listTables("db1").toSet == Set("tblone"))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("table_two"))
+
+      // Renaming "db2.tblone" to "db1.tblones" should fail because table cannot be found.
       intercept[AnalysisException] {
--- End diff --

add an assert to check the message.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136724969
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -502,17 +502,16 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     val storageWithNewPath = if (rawTable.tableType == MANAGED && hasPathOption) {
       // If it's a managed table with path option and we are renaming it, then the path option
       // becomes inaccurate and we need to update it according to the new table name.
-      val newTablePath = defaultTablePath(TableIdentifier(newName, Some(db)))
+      val newTablePath = defaultTablePath(TableIdentifier(newName, Some(newDb)))
--- End diff --

This is ok





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136724919
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala ---
@@ -418,6 +436,39 @@ abstract class SessionCatalogSuite extends AnalysisTest {
     }
   }
 
+  test("rename global temp table") {
+    withBasicCatalog { catalog =>
+      val globalTempTable = Range(1, 10, 2, 10)
+      catalog.createGlobalTempView("tbl1", globalTempTable, overrideIfExists = false)
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tbl1", "tbl2"))
+      // If database is not specified, global temp table will not be renamed
+      catalog.setCurrentDatabase("db1")
+      intercept[AnalysisException] {
+        catalog.renameTable(TableIdentifier("tbl1"), TableIdentifier("tblone"))
+      }
+      catalog.setCurrentDatabase("db2")
+      catalog.renameTable(TableIdentifier("tbl1"), TableIdentifier("tblone"))
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tblone", "tbl2"))
+      // Moving global temp table to another database is forbidden
+      intercept[AnalysisException] {
--- End diff --

The same here.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136723572
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -573,28 +573,29 @@ class SessionCatalog(
    * If no database is specified, this will first attempt to rename a temporary table with
    * the same name, then, if that does not exist, rename the table in the current database.
    *
-   * This assumes the database specified in `newName` matches the one in `oldName`.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(currentDb))
 
     val oldTableName = formatTableName(oldName.table)
     val newTableName = formatTableName(newName.table)
-    if (db == globalTempViewManager.database) {
+    if (oldDb == globalTempViewManager.database || newDb == globalTempViewManager.database) {
+      if (oldDb != newDb) {
+        throw new AnalysisException(
+          s"Cannot change database of table '$oldTableName'")
+      }
       globalTempViewManager.rename(oldTableName, newTableName)
     } else {
-      requireDbExists(db)
       if (oldName.database.isDefined || !tempTables.contains(oldTableName)) {
-        requireTableExists(TableIdentifier(oldTableName, Some(db)))
-        requireTableNotExists(TableIdentifier(newTableName, Some(db)))
+        requireDbExists(oldDb)
+        requireDbExists(newDb)
+        requireTableExists(TableIdentifier(oldTableName, Some(oldDb)))
+        requireTableNotExists(TableIdentifier(newTableName, Some(newDb)))
         validateName(newTableName)
-        externalCatalog.renameTable(db, oldTableName, newTableName)
+        externalCatalog.renameTable(oldDb, oldTableName, newDb, newTableName)
       } else {
         if (newName.database.isDefined) {
--- End diff --

Add a comment 
> // when the old table is a non-global temporary view





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136724927
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala ---
@@ -418,6 +436,39 @@ abstract class SessionCatalogSuite extends AnalysisTest {
     }
   }
 
+  test("rename global temp table") {
+    withBasicCatalog { catalog =>
+      val globalTempTable = Range(1, 10, 2, 10)
+      catalog.createGlobalTempView("tbl1", globalTempTable, overrideIfExists = false)
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tbl1", "tbl2"))
+      // If database is not specified, global temp table will not be renamed
+      catalog.setCurrentDatabase("db1")
+      intercept[AnalysisException] {
--- End diff --

The same here.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136723544
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -573,28 +573,29 @@ class SessionCatalog(
    * If no database is specified, this will first attempt to rename a temporary table with
    * the same name, then, if that does not exist, rename the table in the current database.
    *
-   * This assumes the database specified in `newName` matches the one in `oldName`.
+   * If the database specified in `newName` is different from the one specified in `oldName`,
+   * It will result in moving table across databases.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(currentDb))
 
     val oldTableName = formatTableName(oldName.table)
     val newTableName = formatTableName(newName.table)
-    if (db == globalTempViewManager.database) {
+    if (oldDb == globalTempViewManager.database || newDb == globalTempViewManager.database) {
+      if (oldDb != newDb) {
--- End diff --

Add comment above
> // when either old table or new table is a global temporary view





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136724965
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala ---
@@ -418,6 +436,39 @@ abstract class SessionCatalogSuite extends AnalysisTest {
     }
   }
 
+  test("rename global temp table") {
+    withBasicCatalog { catalog =>
+      val globalTempTable = Range(1, 10, 2, 10)
+      catalog.createGlobalTempView("tbl1", globalTempTable, overrideIfExists = false)
+      assert(catalog.getGlobalTempView("tbl1") == Option(globalTempTable))
+      assert(catalog.externalCatalog.listTables("db2").toSet == Set("tbl1", "tbl2"))
+      // If database is not specified, global temp table will not be renamed
--- End diff --

`global temp table` -> `global temp views`. This is a general comment. We 
do not have `global temp table`





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-03 Thread jinxing64
Github user jinxing64 commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136719337
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -502,17 +502,16 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     val storageWithNewPath = if (rawTable.tableType == MANAGED && hasPathOption) {
       // If it's a managed table with path option and we are renaming it, then the path option
      // becomes inaccurate and we need to update it according to the new table name.
-      val newTablePath = defaultTablePath(TableIdentifier(newName, Some(db)))
+      val newTablePath = defaultTablePath(TableIdentifier(newName, Some(newDb)))
--- End diff --


https://github.com/apache/spark/pull/19086/files#diff-8c4108666a6639034f0ddbfa075bcb37R827
Is this ok?





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136644147
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -502,17 +502,16 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     val storageWithNewPath = if (rawTable.tableType == MANAGED && hasPathOption) {
       // If it's a managed table with path option and we are renaming it, then the path option
      // becomes inaccurate and we need to update it according to the new table name.
-      val newTablePath = defaultTablePath(TableIdentifier(newName, Some(db)))
+      val newTablePath = defaultTablePath(TableIdentifier(newName, Some(newDb)))
--- End diff --

Please add a test case to check whether Hive moves the table directory across databases.
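
One possible shape for such a test (a sketch only: the suite mixins, the `withDatabase` helper, and the warehouse-path check are assumptions layered on top of this patch, not code from it):

```scala
test("ALTER TABLE RENAME across databases moves the managed table directory") {
  withDatabase("db_src", "db_dst") {
    sql("CREATE DATABASE db_src")
    sql("CREATE DATABASE db_dst")
    sql("CREATE TABLE db_src.t(i INT) USING parquet")
    sql("ALTER TABLE db_src.t RENAME TO db_dst.t2")
    val location = spark.sessionState.catalog
      .getTableMetadata(TableIdentifier("t2", Some("db_dst"))).location
    // A managed table's directory should now sit under the destination database's warehouse dir.
    assert(location.toString.contains("db_dst.db"))
  }
}
```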





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136643478
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -576,25 +576,25 @@ class SessionCatalog(
    * This assumes the database specified in `newName` matches the one in `oldName`.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(oldDb))
 
     val oldTableName = formatTableName(oldName.table)
     val newTableName = formatTableName(newName.table)
-    if (db == globalTempViewManager.database) {
+    if (oldDb == globalTempViewManager.database || newDb == globalTempViewManager.database) {
+      if (oldDb != newDb) {
+        throw new AnalysisException(
+          s"Cannot change database of table '$oldTableName'")
--- End diff --

Add test cases for both scenarios?





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136643356
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -576,25 +576,25 @@ class SessionCatalog(
    * This assumes the database specified in `newName` matches the one in `oldName`.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(oldDb))
--- End diff --

Please double check whether Hive behaves like this. Also add the test cases 
for this. 





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136643219
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -576,25 +576,25 @@ class SessionCatalog(
    * This assumes the database specified in `newName` matches the one in `oldName`.
--- End diff --

Update the comments to reflect the new logic.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136642064
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -495,15 +495,16 @@ private[hive] class HiveClientImpl(
     shim.dropTable(client, dbName, tableName, true, ignoreIfNotExists, purge)
   }
 
-  override def alterTable(tableName: String, table: CatalogTable): Unit = withHiveState {
+  override def alterTable(dbName: String, tableName: String, table: CatalogTable): Unit =
+  withHiveState {
--- End diff --

https://github.com/apache/spark/pull/19104  

Need to write a few test cases for it. It might take multiple passes.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread rxin
Github user rxin commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136640563
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -495,15 +495,16 @@ private[hive] class HiveClientImpl(
     shim.dropTable(client, dbName, tableName, true, ignoreIfNotExists, purge)
   }
 
-  override def alterTable(tableName: String, table: CatalogTable): Unit = withHiveState {
+  override def alterTable(dbName: String, tableName: String, table: CatalogTable): Unit =
+  withHiveState {
--- End diff --

What needs to change? Why can't the author work on it?





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136623529
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -495,15 +495,16 @@ private[hive] class HiveClientImpl(
     shim.dropTable(client, dbName, tableName, true, ignoreIfNotExists, purge)
   }
 
-  override def alterTable(tableName: String, table: CatalogTable): Unit = withHiveState {
+  override def alterTable(dbName: String, tableName: String, table: CatalogTable): Unit =
+  withHiveState {
--- End diff --

Let me change the impl of HiveClientImpl. Then, you can work on it. 





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136620936
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -495,15 +495,16 @@ private[hive] class HiveClientImpl(
     shim.dropTable(client, dbName, tableName, true, ignoreIfNotExists, purge)
   }
 
-  override def alterTable(tableName: String, table: CatalogTable): Unit = withHiveState {
+  override def alterTable(dbName: String, tableName: String, table: CatalogTable): Unit =
+  withHiveState {
--- End diff --

This is not right. We should remove `tableName`.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136620584
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -495,15 +495,16 @@ private[hive] class HiveClientImpl(
     shim.dropTable(client, dbName, tableName, true, ignoreIfNotExists, purge)
   }
 
-  override def alterTable(tableName: String, table: CatalogTable): Unit = withHiveState {
+  override def alterTable(dbName: String, tableName: String, table: CatalogTable): Unit =
+  withHiveState {
--- End diff --

```Scala
override def alterTable(
    dbName: String,
    tableName: String,
    table: CatalogTable): Unit = withHiveState {
```





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136619408
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -576,25 +576,25 @@ class SessionCatalog(
    * This assumes the database specified in `newName` matches the one in `oldName`.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(oldDb))
--- End diff --

If the database is not given, Hive moves the table into the current database.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136610605
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClient.scala ---
@@ -90,10 +90,11 @@ private[hive] trait HiveClient {
   def dropTable(dbName: String, tableName: String, ignoreIfNotExists: Boolean, purge: Boolean): Unit
 
   /** Alter a table whose name matches the one specified in `table`, assuming it exists. */
-  final def alterTable(table: CatalogTable): Unit = alterTable(table.identifier.table, table)
+  final def alterTable(table: CatalogTable): Unit =
+    alterTable(table.database, table.identifier.table, table)
 
-  /** Updates the given table with new metadata, optionally renaming the table. */
-  def alterTable(tableName: String, table: CatalogTable): Unit
+  /** Updates the give table in database with new metadata, optionally renaming the table. */
--- End diff --

The original `given` is correct.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-09-01 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136609981
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -576,25 +576,25 @@ class SessionCatalog(
    * This assumes the database specified in `newName` matches the one in `oldName`.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(oldDb))
--- End diff --

Hi, @jinxing64 . Is this correct? I guess that this should be `currentDb` 
instead of `oldDb`.
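
To make the two readings concrete (a sketch; only the fallback differs, identifiers as in the diff above):

```scala
// As written in this diff: a missing database on `newName` falls back to the source database,
// so "ALTER TABLE dbA.t RENAME TO t2" keeps the renamed table in dbA.
val newDbFromOld = formatDatabaseName(newName.database.getOrElse(oldDb))

// The alternative raised here: fall back to the current database (which another comment in this
// thread reports as Hive's behavior), so the same statement would move the table into the
// session's current database.
val newDbFromCurrent = formatDatabaseName(newName.database.getOrElse(currentDb))
```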





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-08-30 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136252314
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClient.scala ---
@@ -95,6 +95,9 @@ private[hive] trait HiveClient {
   /** Updates the given table with new metadata, optionally renaming the table. */
   def alterTable(tableName: String, table: CatalogTable): Unit
 
+  /** Updates the give table in database with new metadata, optionally renaming the table. */
+  def alterDbTable(dbName: String, tableName: String, table: CatalogTable): Unit
--- End diff --

Keep using `alterTable` instead of introducing a new one?
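
That is, something along these lines (a sketch of the suggestion, not the final signature):

```scala
/** Updates the given table in `dbName` with new metadata, optionally renaming the table. */
def alterTable(dbName: String, tableName: String, table: CatalogTable): Unit
```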





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-08-30 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136252223
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
@@ -576,25 +576,25 @@ class SessionCatalog(
    * This assumes the database specified in `newName` matches the one in `oldName`.
    */
   def renameTable(oldName: TableIdentifier, newName: TableIdentifier): Unit = synchronized {
-    val db = formatDatabaseName(oldName.database.getOrElse(currentDb))
-    newName.database.map(formatDatabaseName).foreach { newDb =>
-      if (db != newDb) {
-        throw new AnalysisException(
-          s"RENAME TABLE source and destination databases do not match: '$db' != '$newDb'")
-      }
-    }
+    val oldDb = formatDatabaseName(oldName.database.getOrElse(currentDb))
+    val newDb = formatDatabaseName(newName.database.getOrElse(oldDb))
 
     val oldTableName = formatTableName(oldName.table)
     val newTableName = formatTableName(newName.table)
-    if (db == globalTempViewManager.database) {
+    if (oldDb == globalTempViewManager.database) {
--- End diff --

`newDB` should not be `globalTempViewManager.database` either: temporary views and global temporary views cannot be moved from one DB to another.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-08-30 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136252051
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/InMemoryCatalog.scala ---
@@ -264,21 +264,22 @@ class InMemoryCatalog(
     }
   }
 
-  override protected def doRenameTable(
-      db: String,
+  override protected def doRenameDbTable(
+      oldDb: String,
       oldName: String,
-      newName: String): Unit = synchronized {
-    requireTableExists(db, oldName)
-    requireTableNotExists(db, newName)
-    val oldDesc = catalog(db).tables(oldName)
-    oldDesc.table = oldDesc.table.copy(identifier = TableIdentifier(newName, Some(db)))
-
+      newDbOpt: Option[String],
--- End diff --

It would be cleaner if you change `Option[String]` to `String`.
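
For illustration, the cleaner shape the comment points at (parameter names and order follow the diff above; a sketch, not the committed code):

```scala
override protected def doRenameDbTable(
    oldDb: String,
    oldName: String,
    newDb: String,   // plain String: the caller resolves the destination database up front
    newName: String): Unit = synchronized {
  // body as in the diff, but without an Option fallback for the destination database
}
```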





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-08-30 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/19086#discussion_r136251887
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalog.scala ---
@@ -131,13 +131,21 @@ abstract class ExternalCatalog
       ignoreIfNotExists: Boolean,
       purge: Boolean): Unit
 
-  final def renameTable(db: String, oldName: String, newName: String): Unit = {
-    postToAll(RenameTablePreEvent(db, oldName, newName))
-    doRenameTable(db, oldName, newName)
-    postToAll(RenameTableEvent(db, oldName, newName))
+  final def renameDbTable(
--- End diff --

No need to change the name.





[GitHub] spark pull request #19086: [SPARK-21874][SQL] Support changing database when...

2017-08-30 Thread jinxing64
GitHub user jinxing64 opened a pull request:

https://github.com/apache/spark/pull/19086

[SPARK-21874][SQL] Support changing database when rename table.

## What changes were proposed in this pull request?

Support changing the database of a table via `alter table dbA.XXX rename to dbB.YYY`.
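
For example (a usage sketch; the database and table names are illustrative and both databases must already exist):

```scala
spark.sql("CREATE TABLE dbA.src(id INT) USING parquet")
// With this patch, a rename may also move the table into another database.
spark.sql("ALTER TABLE dbA.src RENAME TO dbB.dst")
assert(spark.catalog.tableExists("dbB", "dst"))
```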

## How was this patch tested?

Updated existing unit test.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jinxing64/spark SPARK-21874

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19086.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19086


commit 7865b3dbe5bc4cda263bc75f8c3524c7bb0fe81c
Author: jinxing 
Date:   2017-08-30T03:09:57Z

Support changing database when rename table.



