[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-14 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/16373


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105831078
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala 
---
@@ -925,6 +925,26 @@ class DDLSuite extends QueryTest with SharedSQLContext 
with BeforeAndAfterEach {
 }
   }
 
+  test("show table extended ... partition") {
--- End diff --

Okay, I'll update that later.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105830656
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala 
---
@@ -925,6 +925,26 @@ class DDLSuite extends QueryTest with SharedSQLContext 
with BeforeAndAfterEach {
 }
   }
 
+  test("show table extended ... partition") {
--- End diff --

Then, you just need to improve the function `getNormalizedResult` in 
SQLQueryTestSuite to mask it. 
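As a rough illustration, such masking could look like the sketch below. The object name, regex, and `<location>` placeholder are assumptions made for this example, not Spark's actual `getNormalizedResult` code:

```scala
// Hypothetical sketch of normalizing test output; not Spark's real code.
object LocationMasker {
  // Replace any absolute file: location with a stable placeholder so that
  // golden-file tests do not depend on the local filesystem layout.
  private val locationPattern = "Location: file:[^)\\s]*".r

  def mask(answer: String): String =
    locationPattern.replaceAllIn(answer, "Location: <location>")
}
```

With this, a line such as `Storage(Location: file:/Users/xiao/.../show_t1/c=Ch/d=1)` would normalize to `Storage(Location: <location>)` regardless of the machine running the suite.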





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105827253
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala 
---
@@ -925,6 +925,26 @@ class DDLSuite extends QueryTest with SharedSQLContext 
with BeforeAndAfterEach {
 }
   }
 
+  test("show table extended ... partition") {
--- End diff --

Yes, it works, but it outputs the absolute path for `Location`, so the test 
suite will fail in other environments.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105826784
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala 
---
@@ -925,6 +925,26 @@ class DDLSuite extends QueryTest with SharedSQLContext 
with BeforeAndAfterEach {
 }
   }
 
+  test("show table extended ... partition") {
--- End diff --

If we change `hiveResultString` to
```
case command @ ExecutedCommandExec(s: ShowTablesCommand) if !s.isExtended =>
  command.executeCollect().map(_.getString(1))
```

I gave it a try, and it works. Below is the output.


```

-- !query 22
SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Ch', d=1)
-- !query 22 schema

struct
-- !query 22 output
showdb  show_t1 false   CatalogPartition(
Partition Values: [c=Ch, d=1]
Storage(Location: file:/Users/xiao/IdeaProjects/sparkDelivery/sql/core/spark-warehouse/showdb.db/show_t1/c=Ch/d=1)
Partition Parameters:{})
```
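The shape of the match suggested above - an extractor pattern combined with a guard - can be sketched with stand-in case classes. These are simplified placeholders for illustration only; Spark's real `ExecutedCommandExec` and `ShowTablesCommand` have different constructors and many more fields:

```scala
// Simplified stand-ins; not Spark's actual classes.
case class ShowTablesCommand(isExtended: Boolean)
case class ExecutedCommandExec(cmd: ShowTablesCommand)

// Returns true only for plain SHOW TABLES; the guard lets
// SHOW TABLE EXTENDED fall through to the default branch.
def tableNamesOnly(plan: Any): Boolean = plan match {
  case ExecutedCommandExec(s: ShowTablesCommand) if !s.isExtended => true
  case _ => false
}
```

The guard keeps the single-column (table-name-only) output path for plain `SHOW TABLES`, while extended output keeps every column.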






[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105820580
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
--- End diff --

Temporary views have an empty database.
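The fallback can be sketched with a simplified stand-in for `TableIdentifier` (the real Spark class differs):

```scala
// Simplified stand-in; Spark's TableIdentifier is a different class.
case class TableIdentifier(table: String, database: Option[String])

// Temporary views carry no database, so the displayed value falls back
// to the empty string rather than the current database.
def displayDatabase(ident: TableIdentifier): String =
  ident.database.getOrElse("")
```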





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105819853
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala 
---
@@ -925,6 +925,26 @@ class DDLSuite extends QueryTest with SharedSQLContext 
with BeforeAndAfterEach {
 }
   }
 
+  test("show table extended ... partition") {
--- End diff --

In `show-tables.sql`, we only output the value of the column `tableName`, 
so we should verify the schema here.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105746571
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala 
---
@@ -925,6 +925,26 @@ class DDLSuite extends QueryTest with SharedSQLContext 
with BeforeAndAfterEach {
 }
   }
 
+  test("show table extended ... partition") {
--- End diff --

Just a question: why are we unable to do this in `show-tables.sql`?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105745374
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse(db)
--- End diff --

For temporary views, `database` is empty. `val database = 
tableIdent.database.getOrElse("")`





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105745236
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala ---
@@ -127,8 +127,8 @@ class QueryExecution(val sparkSession: SparkSession, 
val logical: LogicalPlan) {
 .map(s => String.format(s"%-20s", s))
 .mkString("\t")
   }
-// SHOW TABLES in Hive only output table names, while ours outputs 
database, table name, isTemp.
-case command: ExecutedCommandExec if 
command.cmd.isInstanceOf[ShowTablesCommand] =>
+// SHOW TABLES in Hive only output table names, while ours output 
database, table name, isTemp.
+case command @ ExecutedCommandExec(showTables: ShowTablesCommand) =>
--- End diff --

`case command @ ExecutedCommandExec(showTables: ShowTablesCommand)` ->
`case command @ ExecutedCommandExec(_: ShowTablesCommand)`
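The rename avoids binding a name that is never used; `_: T` matches on the type without introducing a binding. A sketch with stand-in case classes (placeholders, not Spark's actual classes):

```scala
// Simplified stand-ins for illustration only.
case class ShowTablesCommand(isExtended: Boolean)
case class ExecutedCommandExec(cmd: Any)

def isShowTables(plan: Any): Boolean = plan match {
  // `_: ShowTablesCommand` tests the type without binding an unused name.
  case ExecutedCommandExec(_: ShowTablesCommand) => true
  case _ => false
}
```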





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-13 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105687000
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
+} else {
+  Row(database, tableName, isTemp)
+}
   }
+} else {
+  // Show the information of partitions.
--- End diff --

Yes - this follows Hive's behavior.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105587160
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
+} else {
+  Row(database, tableName, isTemp)
+}
   }
+} else {
+  // Show the information of partitions.
--- End diff --

When users specify a partition spec, can we only list one table?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105586710
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
--- End diff --

shall we get the current database?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r10556
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
+} else {
+  Row(database, tableName, isTemp)
+}
   }
+} else {
+  // Show the information of partitions.
+  //
+  // Note: tableIdentifierPattern should be non-empty, otherwise a 
[[ParseException]]
+  // should have been thrown by the sql parser.
+  val tableIdentifier = TableIdentifier(tableIdentifierPattern.get, 
Some(db))
+  val table = catalog.getTableMetadata(tableIdentifier).identifier
+  val partition = catalog.getPartition(tableIdentifier, 
partitionSpec.get)
+  val database = table.database.getOrElse("")
+  val tableName = table.table
+  val isTemp = catalog.isTemporaryTable(table)
+  val information = partition.toString
+  Seq(Row(database, tableName, isTemp, information))
--- End diff --

Why is this different from the case above? That is, why not add `\n`?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105561082
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
+} else {
+  Row(database, tableName, isTemp)
+}
   }
+} else {
+  // Show the information of partitions.
+  //
+  // Note: tableIdentifierPattern should be non-empty, otherwise a 
[[ParseException]]
+  // should have been thrown by the sql parser.
+  val tableIdentifier = TableIdentifier(tableIdentifierPattern.get, 
Some(db))
--- End diff --

To be consistent, we can rename it to `tableIdent`





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105561051
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -642,18 +644,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

Nit: -> `s"$information\n"`
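Both forms produce the same string for a bare identifier; the braces only matter when the interpolated name must be delimited. A minimal illustration:

```scala
val information = "CatalogPartition(...)"

// Braces are optional around a simple identifier...
val withBraces = s"${information}\n"
val withoutBraces = s"$information\n"

// ...but required when the identifier abuts other identifier characters.
val delimited = s"${information}_suffix"
```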





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105560904
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala ---
@@ -123,8 +123,12 @@ class QueryExecution(val sparkSession: SparkSession, 
val logical: LogicalPlan) {
 .mkString("\t")
   }
 // SHOW TABLES in Hive only output table names, while ours outputs 
database, table name, isTemp.
-case command: ExecutedCommandExec if 
command.cmd.isInstanceOf[ShowTablesCommand] =>
-  command.executeCollect().map(_.getString(1))
+case command@ ExecutedCommandExec(showTables: ShowTablesCommand) =>
+  if (showTables.isExtended) {
+command.executeCollect().map(_.getString(3))
--- End diff --

If we do not have it, does any test case fail?

This function is just for testing. If we do not have the related Hive 
output comparison, we can simplify it to
```Scala
case command @ ExecutedCommandExec(showTables: ShowTablesCommand) if !showTables.isExtended =>
  command.executeCollect().map(_.getString(1))
```





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-03-12 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r105560834
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/QueryExecution.scala ---
@@ -123,8 +123,12 @@ class QueryExecution(val sparkSession: SparkSession, 
val logical: LogicalPlan) {
 .mkString("\t")
   }
 // SHOW TABLES in Hive only output table names, while ours outputs 
database, table name, isTemp.
-case command: ExecutedCommandExec if 
command.cmd.isInstanceOf[ShowTablesCommand] =>
-  command.executeCollect().map(_.getString(1))
+case command@ ExecutedCommandExec(showTables: ShowTablesCommand) =>
--- End diff --

Nit: `command@` -> `command @`





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-02-08 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r100031009
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

```
hive> create table t1(a int, b string) partitioned by (dt string, hour string);
hive> alter table t1 add partition (dt='2017-02-08', hour='17');
hive> alter table t1 add partition (dt='2017-02-08', hour='18');
hive> show table extended like 't1' partition(dt='2017-02-08');
FAILED: SemanticException [Error 10006]: Partition not found {dt=2017-02-08}
hive> show table extended like 't1' partition(dt='2017-02-08', hour='17');
OK
tableName:t1
owner:meituan
location:hdfs://localhost:9000/user/hive/warehouse/t1/dt=2017-02-08/hour=17
inputformat:org.apache.hadoop.mapred.TextInputFormat
outputformat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
columns:struct columns { i32 a, string b}
partitioned:true
partitionColumns:struct partition_columns { string dt, string hour}
totalNumberFiles:0
totalFileSize:0
maxFileSize:0
minFileSize:0
lastAccessTime:0
lastUpdateTime:1486548513945
```





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-01-15 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r96150183
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

Could you post the output of Hive? I am wondering whether our output looks 
strange. Do we need to improve the `toString` function?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2017-01-15 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r96149748
  
--- Diff: sql/core/src/test/resources/sql-tests/results/show-tables.sql.out 
---
@@ -128,62 +128,108 @@ SHOW TABLE EXTENDED
 
 
 -- !query 13
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
+SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us', d=1)
 -- !query 13 schema
-struct<>

+struct
 -- !query 13 output
-org.apache.spark.sql.catalyst.parser.ParseException
-
-Operation not allowed: SHOW TABLE EXTENDED ... PARTITION(line 1, pos 0)
-
-== SQL ==
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
-^^^
+show_t1
--- End diff --

`hiveResultString` is only used for testing. I think we should fix it as 
long as it does not break any test case.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94018052
  
--- Diff: sql/core/src/test/resources/sql-tests/results/show-tables.sql.out 
---
@@ -128,62 +128,108 @@ SHOW TABLE EXTENDED
 
 
 -- !query 13
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
+SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us', d=1)
 -- !query 13 schema
-struct<>

+struct
 -- !query 13 output
-org.apache.spark.sql.catalyst.parser.ParseException
-
-Operation not allowed: SHOW TABLE EXTENDED ... PARTITION(line 1, pos 0)
-
-== SQL ==
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
-^^^
+show_t1
--- End diff --

The `SQLQueryTestSuite` runs the SQL and converts the result to a Hive-compatible 
sequence of strings. It handles the `ShowTablesCommand` this way:
```
// SHOW TABLES in Hive only output table names, while ours outputs 
database, table name, isTemp.
case command: ExecutedCommandExec if 
command.cmd.isInstanceOf[ShowTablesCommand] =>
  command.executeCollect().map(_.getString(1))
```
We may either change this case, or create a new command `ShowTableCommand` 
to handle the statement `SHOW TABLE EXTENDED LIKE ... [PARTITION]`.
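To illustrate the projection the quoted case clause performs, here is a minimal plain-Scala sketch (tuples stand in for Spark's `Row`; this is not Spark's actual code): Spark's `SHOW TABLES` rows carry `(database, tableName, isTemp)`, while Hive prints only the table name, so the Hive-compatibility path keeps just column 1.

```scala
object ShowTablesHiveCompat {
  // (database, tableName, isTemp) -- the three columns of SHOW TABLES output
  type ShowTablesRow = (String, String, Boolean)

  // Mirror of `command.executeCollect().map(_.getString(1))`:
  // project each row down to its table name only.
  def hiveResultString(rows: Seq[ShowTablesRow]): Seq[String] =
    rows.map(_._2)

  def main(args: Array[String]): Unit = {
    val rows = Seq(("db2", "show_t1", false), ("db2", "show_t2", false))
    println(hiveResultString(rows).mkString("\n"))
  }
}
```

This keeps the Hive-compatible test output to bare table names, which is why the extended metadata column never shows up in `SQLQueryTestSuite` results for plain `SHOW TABLES`.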





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94014930
  
--- Diff: sql/core/src/test/resources/sql-tests/results/show-tables.sql.out 
---
@@ -128,62 +128,108 @@ SHOW TABLE EXTENDED
 
 
 -- !query 13
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
+SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us', d=1)
 -- !query 13 schema
-struct<>

+struct
 -- !query 13 output
-org.apache.spark.sql.catalyst.parser.ParseException
-
-Operation not allowed: SHOW TABLE EXTENDED ... PARTITION(line 1, pos 0)
-
-== SQL ==
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
-^^^
+show_t1
--- End diff --

That is weird; I think I should spend some time on this problem.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94014794
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

The statement `SHOW TABLE EXTENDED ... PARTITION` requires the partition spec 
to be fully matched, so in this case we output at most one row per command.
For example:
```
spark-sql> SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us', d=1);
db2 show_t1 false   CatalogPartition(
Partition Values: [c=Us, d=1]
Storage(Location: 
file:/Users/meituan/workspace/spark/spark-warehouse/db2.db/show_t1/c=Us/d=1, 
InputFormat: org.apache.hadoop.mapred.TextInputFormat, OutputFormat: 
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, Serde: 
org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Properties: 
[serialization.format=1])
Partition Parameters:{transient_lastDdlTime=1482918646})
Time taken: 0.957 seconds, Fetched 1 row(s)
spark-sql> SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us');
Error in query: Partition spec is invalid. The spec (c) must match the 
partition spec (c, d) defined in table '`db2`.`show_t1`';
``` 

BTW, this behavior strictly follows Hive's.
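The exact-match rule shown in the example above can be sketched in plain Scala (an illustrative reimplementation under stated assumptions, not Spark's catalog code; the error text is modeled on the message quoted above): a spec is valid only when its keys are exactly the table's partition columns.

```scala
object PartitionSpecCheck {
  // Reject any spec whose key set differs from the table's partition columns,
  // mimicking Spark's "Partition spec is invalid" error for SHOW TABLE EXTENDED.
  def requireExactMatch(
      spec: Map[String, String],
      partCols: Seq[String],
      table: String): Unit = {
    if (spec.keySet != partCols.toSet) {
      throw new IllegalArgumentException(
        s"Partition spec is invalid. The spec (${spec.keys.mkString(", ")}) must match " +
          s"the partition spec (${partCols.mkString(", ")}) defined in table '$table'")
    }
  }

  def main(args: Array[String]): Unit = {
    val partCols = Seq("c", "d")
    // Fully matched spec: accepted.
    requireExactMatch(Map("c" -> "Us", "d" -> "1"), partCols, "db2.show_t1")
    // Partial spec (c only): rejected, as in the spark-sql session above.
    val rejected =
      try { requireExactMatch(Map("c" -> "Us"), partCols, "db2.show_t1"); false }
      catch { case _: IllegalArgumentException => true }
    println(rejected)
  }
}
```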





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94010013
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

This PR also faces the same issue, right? Can you post the outputs of the 
command?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94004875
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

I think only in this case can the output contain multiple rows, and each row 
can contain multiple lines.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94004693
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

Then why don't we have the `\n` in the other similar cases?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94004561
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

We have to output a '\n' at the end of the whole row.





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-28 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94004429
  
--- Diff: sql/core/src/test/resources/sql-tests/results/show-tables.sql.out 
---
@@ -128,62 +128,108 @@ SHOW TABLE EXTENDED
 
 
 -- !query 13
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
+SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us', d=1)
 -- !query 13 schema
-struct<>

+struct
 -- !query 13 output
-org.apache.spark.sql.catalyst.parser.ParseException
-
-Operation not allowed: SHOW TABLE EXTENDED ... PARTITION(line 1, pos 0)
-
-== SQL ==
-SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us')
-^^^
+show_t1
--- End diff --

Is this output right?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-27 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/16373#discussion_r94004059
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -619,18 +621,34 @@ case class ShowTablesCommand(
 // instead of calling tables in sparkSession.
 val catalog = sparkSession.sessionState.catalog
 val db = databaseName.getOrElse(catalog.getCurrentDatabase)
-val tables =
-  tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
-tables.map { tableIdent =>
-  val database = tableIdent.database.getOrElse("")
-  val tableName = tableIdent.table
-  val isTemp = catalog.isTemporaryTable(tableIdent)
-  if (isExtended) {
-val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
-Row(database, tableName, isTemp, s"${information}\n")
-  } else {
-Row(database, tableName, isTemp)
+if (partitionSpec.isEmpty) {
+  // Show the information of tables.
+  val tables =
+tableIdentifierPattern.map(catalog.listTables(db, 
_)).getOrElse(catalog.listTables(db))
+  tables.map { tableIdent =>
+val database = tableIdent.database.getOrElse("")
+val tableName = tableIdent.table
+val isTemp = catalog.isTemporaryTable(tableIdent)
+if (isExtended) {
+  val information = 
catalog.getTempViewOrPermanentTableMetadata(tableIdent).toString
+  Row(database, tableName, isTemp, s"${information}\n")
--- End diff --

Do you know why we need `s"${information}\n"` instead of `information`?





[GitHub] spark pull request #16373: [SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ....

2016-12-21 Thread jiangxb1987
GitHub user jiangxb1987 opened a pull request:

https://github.com/apache/spark/pull/16373

[SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ... PARTITION` statement

## What changes were proposed in this pull request?

We should support the statement `SHOW TABLE EXTENDED LIKE 
'table_identifier' PARTITION(partition_spec)`, just like Hive does.
When a partition is specified, the `SHOW TABLE EXTENDED` command should 
output the information of the partition instead of the table.
Note that this statement requires an exactly matched partition spec. For 
example:
```
CREATE TABLE show_t1(a String, b Int) PARTITIONED BY (c String, d String);
ALTER TABLE show_t1 ADD PARTITION (c='Us', d=1) PARTITION (c='Us', d=22);

-- Output the extended information of Partition(c='Us', d=1)
SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us', d=1);
-- Throw an AnalysisException
SHOW TABLE EXTENDED LIKE 'show_t1' PARTITION(c='Us');
```
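The `LIKE 'table_identifier'` part accepts Spark's SHOW TABLES patterns, where `*` is a wildcard and `|` separates alternatives. A hedged sketch of that filtering (an illustrative reimplementation, not Spark's own `StringUtils.filterPattern`):

```scala
object TablePatternFilter {
  // Translate a SHOW TABLES pattern into regexes: split on '|' for
  // alternatives, expand '*' to '.*', then keep names fully matched
  // by any alternative.
  def filterPattern(names: Seq[String], pattern: String): Seq[String] = {
    val regexes = pattern.trim.split("\\|").map(p => p.trim.replace("*", ".*").r)
    names.filter(n => regexes.exists(_.pattern.matcher(n).matches()))
  }

  def main(args: Array[String]): Unit = {
    val names = Seq("show_t1", "show_t2", "other")
    println(filterPattern(names, "show_t*").mkString(","))
  }
}
```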

## How was this patch tested?
Added new test SQL statements in the file `show-tables.sql`.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jiangxb1987/spark show-partition-extended

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/16373.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #16373


commit d5ce86a2ef7f9833eb0a2b3c6859a17a998cfd59
Author: jiangxingbo 
Date:   2016-12-21T08:16:17Z

support SHOW TABLE EXTENDED ... PARTITION statement.



