[GitHub] spark pull request #16638: [SPARK-19115] [SQL] Supporting Create External Ta...
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/16638

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99988384

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
@@ -812,150 +812,234 @@ class HiveDDLSuite
     }
   }

-  test("CREATE TABLE LIKE a temporary view") {
-    val sourceViewName = "tab1"
-    val targetTabName = "tab2"
-    withTempView(sourceViewName) {
-      withTable(targetTabName) {
-        spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
-          .createTempView(sourceViewName)
-        sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName")
-
-        val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
-          TableIdentifier(sourceViewName))
-        val targetTable = spark.sessionState.catalog.getTableMetadata(
-          TableIdentifier(targetTabName, Some("default")))
-
-        checkCreateTableLike(sourceTable, targetTable)
+  test("CREATE TABLE LIKE a temporary view [LOCATION]...") {
+    var createdTableType = "MANAGED"
+    for ( i <- 0 to 1 ) {
--- End diff --

Creating a method and wrapping this piece of code in it would also let us reuse the code.
Github user ouyangxiaochen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99984076

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
@@ -812,150 +812,234 @@ class HiveDDLSuite
     }
   }

-  test("CREATE TABLE LIKE a temporary view") {
-    val sourceViewName = "tab1"
-    val targetTabName = "tab2"
-    withTempView(sourceViewName) {
-      withTable(targetTabName) {
-        spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
-          .createTempView(sourceViewName)
-        sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName")
-
-        val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
-          TableIdentifier(sourceViewName))
-        val targetTable = spark.sessionState.catalog.getTableMetadata(
-          TableIdentifier(targetTabName, Some("default")))
-
-        checkCreateTableLike(sourceTable, targetTable)
+  test("CREATE TABLE LIKE a temporary view [LOCATION]...") {
+    var createdTableType = "MANAGED"
+    for ( i <- 0 to 1 ) {
--- End diff --

I wrote it this way to reuse this common piece of code, because the basic logic of the two scenarios is almost the same.
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99979122

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
@@ -812,150 +812,234 @@ class HiveDDLSuite
     }
   }

-  test("CREATE TABLE LIKE a temporary view") {
-    val sourceViewName = "tab1"
-    val targetTabName = "tab2"
-    withTempView(sourceViewName) {
-      withTable(targetTabName) {
-        spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
-          .createTempView(sourceViewName)
-        sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName")
-
-        val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
-          TableIdentifier(sourceViewName))
-        val targetTable = spark.sessionState.catalog.getTableMetadata(
-          TableIdentifier(targetTabName, Some("default")))
-
-        checkCreateTableLike(sourceTable, targetTable)
+  test("CREATE TABLE LIKE a temporary view [LOCATION]...") {
+    var createdTableType = "MANAGED"
+    for ( i <- 0 to 1 ) {
+      withTempDir { tmpDir =>
+        val sourceViewName = "tab1"
+        val targetTabName = "tab2"
+        val basePath = tmpDir.toURI
+        withTempView(sourceViewName) {
+          withTable(targetTabName) {
+            spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
+              .createTempView(sourceViewName)
+            if (i == 0) {
+              sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName ")
+            } else {
+              createdTableType = "EXTERNAL"
+              sql(s"CREATE TABLE $targetTabName " +
+                s"LIKE $sourceViewName LOCATION '$basePath'")
+            }
+
+            val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
+              TableIdentifier(sourceViewName))
+            val targetTable = spark.sessionState.catalog.getTableMetadata(
+              TableIdentifier(targetTabName, Some("default")))
+
+            checkCreateTableLike(sourceTable, targetTable, createdTableType)
+          }
+        }
       }
     }
   }

-  test("CREATE TABLE LIKE a data source table") {
-    val sourceTabName = "tab1"
-    val targetTabName = "tab2"
-    withTable(sourceTabName, targetTabName) {
-      spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
-        .write.format("json").saveAsTable(sourceTabName)
-      sql(s"CREATE TABLE $targetTabName LIKE $sourceTabName")
-
-      val sourceTable =
-        spark.sessionState.catalog.getTableMetadata(TableIdentifier(sourceTabName, Some("default")))
-      val targetTable =
-        spark.sessionState.catalog.getTableMetadata(TableIdentifier(targetTabName, Some("default")))
-      // The table type of the source table should be a Hive-managed data source table
-      assert(DDLUtils.isDatasourceTable(sourceTable))
-      assert(sourceTable.tableType == CatalogTableType.MANAGED)
-
-      checkCreateTableLike(sourceTable, targetTable)
+  test("CREATE TABLE LIKE a data source table [LOCATION]...") {
+    var createdTableType = "MANAGED"
+    for ( i <- 0 to 1 ) {
+      withTempDir { tmpDir =>
+        val sourceTabName = "tab1"
+        val targetTabName = "tab2"
+        val basePath = tmpDir.toURI
+        withTable(sourceTabName, targetTabName) {
+          spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
+            .write.format("json").saveAsTable(sourceTabName)
+          if ( i == 0 ) {
+            sql(s"CREATE TABLE $targetTabName LIKE $sourceTabName")
+          } else {
+            createdTableType = "EXTERNAL"
+            sql(s"CREATE TABLE $targetTabName LIKE $sourceTabName LOCATION '$basePath'")
+          }
+
+          val sourceTable =
+            spark.sessionState.catalog.getTableMetadata(
+              TableIdentifier(sourceTabName, Some("default")))
+          val targetTable =
+            spark.sessionState.catalog.getTableMetadata(
+              TableIdentifier(targetTabName, Some("default")))
+          // The table type of the source table should be a Hive-managed data source table
+          assert(DDLUtils.isDatasourceTable(sourceTable))
+          assert(sourceTable.tableType == CatalogTableType.MANAGED)
+
+          checkCreateTableLike(sourceTable, targetTable, createdTableType)
+        }
+      }
     }
   }

-  test("CREATE TABLE LIKE an external data source table") {
-    val sourceTabName = "tab1"
-    val targetTabName = "tab2"
-    withTable(sourceTabName,
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99979027

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
@@ -812,150 +812,234 @@ class HiveDDLSuite
     }
   }

-  test("CREATE TABLE LIKE a temporary view") {
-    val sourceViewName = "tab1"
-    val targetTabName = "tab2"
-    withTempView(sourceViewName) {
-      withTable(targetTabName) {
-        spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
-          .createTempView(sourceViewName)
-        sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName")
-
-        val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
-          TableIdentifier(sourceViewName))
-        val targetTable = spark.sessionState.catalog.getTableMetadata(
-          TableIdentifier(targetTabName, Some("default")))
-
-        checkCreateTableLike(sourceTable, targetTable)
+  test("CREATE TABLE LIKE a temporary view [LOCATION]...") {
+    var createdTableType = "MANAGED"
+    for ( i <- 0 to 1 ) {
--- End diff --

You can create a method with a parameter `location: Option[String]` instead of writing a for loop with 2 iterations...
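The refactoring suggested here could look roughly like the sketch below. It assumes the `HiveDDLSuite` test helpers visible in the quoted diff (`withTempView`, `withTable`, `withTempDir`, `checkCreateTableLike`, `spark`, `TableIdentifier`); the method name and the `locationClause` variable are illustrative, not the merged code.

```scala
// Sketch only: one helper, parameterized by an optional LOCATION,
// replaces the two-iteration for loop in the test.
private def checkCreateTableLikeTempView(location: Option[String]): Unit = {
  val sourceViewName = "tab1"
  val targetTabName = "tab2"
  withTempView(sourceViewName) {
    withTable(targetTabName) {
      spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
        .createTempView(sourceViewName)
      // Append the LOCATION clause only when a location is given.
      val locationClause = location.map(loc => s"LOCATION '$loc'").getOrElse("")
      sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName $locationClause")

      val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
        TableIdentifier(sourceViewName))
      val targetTable = spark.sessionState.catalog.getTableMetadata(
        TableIdentifier(targetTabName, Some("default")))
      // With a LOCATION the created table is external; otherwise it is managed.
      val expectedType = if (location.isDefined) "EXTERNAL" else "MANAGED"
      checkCreateTableLike(sourceTable, targetTable, expectedType)
    }
  }
}

// Each scenario then becomes a single call instead of a loop iteration:
// checkCreateTableLikeTempView(None)
// withTempDir { dir => checkCreateTableLikeTempView(Some(dir.toURI.toString)) }
```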
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99978898

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
@@ -812,150 +812,234 @@ class HiveDDLSuite
     }
   }

-  test("CREATE TABLE LIKE a temporary view") {
-    val sourceViewName = "tab1"
-    val targetTabName = "tab2"
-    withTempView(sourceViewName) {
-      withTable(targetTabName) {
-        spark.range(10).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd)
-          .createTempView(sourceViewName)
-        sql(s"CREATE TABLE $targetTabName LIKE $sourceViewName")
-
-        val sourceTable = spark.sessionState.catalog.getTempViewOrPermanentTableMetadata(
-          TableIdentifier(sourceViewName))
-        val targetTable = spark.sessionState.catalog.getTableMetadata(
-          TableIdentifier(targetTabName, Some("default")))
-
-        checkCreateTableLike(sourceTable, targetTable)
+  test("CREATE TABLE LIKE a temporary view [LOCATION]...") {
--- End diff --

Actually, we don't need to change the test name.
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99978824

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveDDLCommandSuite.scala ---
@@ -528,8 +528,8 @@ class HiveDDLCommandSuite extends PlanTest with SQLTestUtils with TestHiveSingleton
     assert(source.table == "table2")

     val v2 = "CREATE TABLE IF NOT EXISTS table1 LIKE table2"
--- End diff --

Add one more test case to check CREATE TABLE LIKE with a location.
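The requested test case might look like the following sketch, modeled on the existing `create table like` test in `HiveDDLCommandSuite`. It assumes the four-field `CreateTableLikeCommand(target, source, location, allowExisting)` introduced by this PR and that the parser surfaces the location as an `Option[String]`; the literal path and the `v3` naming are made up for illustration.

```scala
// Hypothetical addition to HiveDDLCommandSuite (sketch, not the merged test).
val v3 = "CREATE TABLE table1 LIKE table2 LOCATION '/spark/warehouse'"
val (target3, source3, location3, exists3) = parser.parsePlan(v3).collect {
  case CreateTableLikeCommand(t, s, l, allowExisting) => (t, s, l, allowExisting)
}.head
assert(target3.table == "table1")
assert(source3.table == "table2")
// With a LOCATION clause, the parsed location should be populated.
assert(location3 == Some("/spark/warehouse"))
assert(exists3 == false)
```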
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99978757

--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveDDLCommandSuite.scala ---
@@ -518,8 +518,8 @@ class HiveDDLCommandSuite extends PlanTest with SQLTestUtils with TestHiveSingleton
   test("create table like") {
     val v1 = "CREATE TABLE table1 LIKE table2"
-    val (target, source, exists) = parser.parsePlan(v1).collect {
-      case CreateTableLikeCommand(t, s, allowExisting) => (t, s, allowExisting)
+    val (target, source, location, exists) = parser.parsePlan(v1).collect {
--- End diff --

Add an assert to check that `location` is empty.
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r99978630

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -51,13 +51,14 @@ import org.apache.spark.util.Utils
  *
  * The syntax of using this command in SQL is:
  * {{{
- *   CREATE TABLE [IF NOT EXISTS] [db_name.]table_name
- *   LIKE [other_db_name.]existing_table_name
+ *   CREATE [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name
--- End diff --

No `EXTERNAL`.
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r97471913

--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -81,8 +81,8 @@ statement
         rowFormat? createFileFormat? locationSpec?
         (TBLPROPERTIES tablePropertyList)?
         (AS? query)?                                                     #createHiveTable
-    | CREATE TABLE (IF NOT EXISTS)? target=tableIdentifier
-        LIKE source=tableIdentifier                                      #createTableLike
+    | CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target=tableIdentifier
--- End diff --

OK, then let's simplify the logic: if `location` is specified, we create an external table internally; otherwise, we create a managed table.
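The simplified rule proposed here (LOCATION present implies external, LOCATION absent implies managed, with no user-facing `EXTERNAL` keyword) boils down to a one-line decision. A minimal, self-contained Scala sketch of that rule, where `TableType` merely stands in for Spark's `CatalogTableType` and is not the real API:

```scala
// Sketch of the proposed CREATE TABLE LIKE semantics; the types here are
// illustrative placeholders, not Spark internals.
object CreateTableLikeRule {
  sealed trait TableType
  case object Managed extends TableType
  case object External extends TableType

  // If the user specifies LOCATION, the table is created as external;
  // otherwise it is managed. No EXTERNAL keyword is involved.
  def tableTypeFor(location: Option[String]): TableType =
    if (location.isDefined) External else Managed

  def main(args: Array[String]): Unit = {
    assert(tableTypeFor(None) == Managed)
    assert(tableTypeFor(Some("/warehouse/gen_tbl")) == External)
  }
}
```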
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r97471178

--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -81,8 +81,8 @@ statement
         rowFormat? createFileFormat? locationSpec?
         (TBLPROPERTIES tablePropertyList)?
         (AS? query)?                                                     #createHiveTable
-    | CREATE TABLE (IF NOT EXISTS)? target=tableIdentifier
-        LIKE source=tableIdentifier                                      #createTableLike
+    | CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target=tableIdentifier
--- End diff --

I am fine with that.
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r97469240

--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
@@ -81,8 +81,8 @@ statement
         rowFormat? createFileFormat? locationSpec?
         (TBLPROPERTIES tablePropertyList)?
         (AS? query)?                                                     #createHiveTable
-    | CREATE TABLE (IF NOT EXISTS)? target=tableIdentifier
-        LIKE source=tableIdentifier                                      #createTableLike
+    | CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target=tableIdentifier
--- End diff --

Since Spark 2.2, we want to hide the managed/external concept from users. It looks reasonable to add a `LOCATION` clause to `CREATE TABLE LIKE`, but do we really need the `EXTERNAL` keyword? We don't need to match Hive exactly.
Github user ouyangxiaochen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r97454893

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -1140,14 +1140,18 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
  *
  * For example:
  * {{{
- *   CREATE TABLE [IF NOT EXISTS] [db_name.]table_name
- *   LIKE [other_db_name.]existing_table_name
+ *   CREATE [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name
+ *   LIKE [other_db_name.]existing_table_name [locationSpec]
  * }}}
  */
 override def visitCreateTableLike(ctx: CreateTableLikeContext): LogicalPlan = withOrigin(ctx) {
   val targetTable = visitTableIdentifier(ctx.target)
   val sourceTable = visitTableIdentifier(ctx.source)
-  CreateTableLikeCommand(targetTable, sourceTable, ctx.EXISTS != null)
+  val location = Option(ctx.locationSpec).map(visitLocationSpec)
+  if (ctx.EXTERNAL != null && location.isEmpty) {
--- End diff --

OK, I'll do it later. Thanks!
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r97446544

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -1140,14 +1140,18 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
  *
  * For example:
  * {{{
- *   CREATE TABLE [IF NOT EXISTS] [db_name.]table_name
- *   LIKE [other_db_name.]existing_table_name
+ *   CREATE [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name
+ *   LIKE [other_db_name.]existing_table_name [locationSpec]
  * }}}
  */
 override def visitCreateTableLike(ctx: CreateTableLikeContext): LogicalPlan = withOrigin(ctx) {
   val targetTable = visitTableIdentifier(ctx.target)
   val sourceTable = visitTableIdentifier(ctx.source)
-  CreateTableLikeCommand(targetTable, sourceTable, ctx.EXISTS != null)
+  val location = Option(ctx.locationSpec).map(visitLocationSpec)
+  if (ctx.EXTERNAL != null && location.isEmpty) {
+    operationNotAllowed("CREATE EXTERNAL TABLE LIKE must be accompanied by LOCATION", ctx)
+  }
--- End diff --

To the other reviewers: we are following what we did in `visitCreateHiveTable`.
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r97446466

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -1140,14 +1140,18 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
  *
  * For example:
  * {{{
- *   CREATE TABLE [IF NOT EXISTS] [db_name.]table_name
- *   LIKE [other_db_name.]existing_table_name
+ *   CREATE [EXTERNAL] TABLE [IF NOT EXISTS] [db_name.]table_name
+ *   LIKE [other_db_name.]existing_table_name [locationSpec]
  * }}}
  */
 override def visitCreateTableLike(ctx: CreateTableLikeContext): LogicalPlan = withOrigin(ctx) {
   val targetTable = visitTableIdentifier(ctx.target)
   val sourceTable = visitTableIdentifier(ctx.source)
-  CreateTableLikeCommand(targetTable, sourceTable, ctx.EXISTS != null)
+  val location = Option(ctx.locationSpec).map(visitLocationSpec)
+  if (ctx.EXTERNAL != null && location.isEmpty) {
--- End diff --

Add a comment here:

    // If we are creating an EXTERNAL table, then the LOCATION field is required
[GitHub] spark pull request #16638: spark-19115
Github user ouyangxiaochen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r96993195

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -58,6 +58,7 @@ import org.apache.spark.util.Utils

 case class CreateTableLikeCommand(
--- End diff --

OK, I will update it later. Thanks!
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/16638#discussion_r96809883

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -58,6 +58,7 @@ import org.apache.spark.util.Utils

 case class CreateTableLikeCommand(
--- End diff --

Please update the comment of this class.
GitHub user ouyangxiaochen opened a pull request:

https://github.com/apache/spark/pull/16638

spark-19115

## What changes were proposed in this pull request?

Spark SQL supports the command `CREATE EXTERNAL TABLE IF NOT EXISTS gen_tbl LIKE src_tbl LOCATION '/warehouse/gen_tbl'` in Spark 2.x.

## How was this patch tested?

Manual tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ouyangxiaochen/spark spark19115

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16638.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #16638

commit adde008588cf8e05cf261c086201c27a8dd5584f
Author: ouyangxiaochen
Date: 2017-01-19T03:15:17Z

    spark-19115