[GitHub] spark issue #15195: [SPARK-17632][SQL] make console sink and other sinks work...

2016-09-26 Thread chuanlei
Github user chuanlei commented on the issue:

https://github.com/apache/spark/pull/15195
  
@marmbrus 
Could you have a look at this PR?





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80419919
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -420,17 +424,40 @@ case class DescribeTableCommand(table: TableIdentifier, isExtended: Boolean, isF
     val catalog = sparkSession.sessionState.catalog
 
     if (catalog.isTemporaryTable(table)) {
+      if (partitionSpec.nonEmpty) {
+        throw new AnalysisException(
+          s"DESC PARTITION is not allowed on a temporary view: ${table.identifier}")
+      }
       describeSchema(catalog.lookupRelation(table).schema, result)
     } else {
       val metadata = catalog.getTableMetadata(table)
       describeSchema(metadata.schema, result)
 
-      if (isExtended) {
-        describeExtended(metadata, result)
-      } else if (isFormatted) {
-        describeFormatted(metadata, result)
+      describePartitionInfo(metadata, result)
+
+      if (partitionSpec.isEmpty) {
+        if (isExtended) {
+          describeExtendedTableInfo(metadata, result)
+        } else if (isFormatted) {
+          describeFormattedTableInfo(metadata, result)
+          describeStorageInfo(metadata, result)
+        }
       } else {
-        describePartitionInfo(metadata, result)
+        if (metadata.tableType == CatalogTableType.VIEW) {
+          throw new AnalysisException(
+            s"DESC PARTITION is not allowed on a view: ${table.identifier}")
+        }
+        if (DDLUtils.isDatasourceTable(metadata)) {
+          throw new AnalysisException(
+            s"DESC PARTITION is not allowed on a datasource table: ${table.identifier}")
+        }
+        val partition = catalog.getPartition(table, partitionSpec)
+        if (isExtended) {
+          describeExtendedDetailPartitionInfo(table, metadata, partition, result)
+        } else if (isFormatted) {
+          describeFormattedDetailPartitionInfo(table, metadata, partition, result)
+          describeStorageInfo(metadata, result)
+        }
--- End diff --

Yep. I'll refactor them into another function.





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on the issue:

https://github.com/apache/spark/pull/15168
  
LGTM except two minor comments and pending tests. cc @hvanhovell 





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80419521
  
--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/SQLQuerySuite.scala ---
@@ -341,6 +341,74 @@ class SQLQuerySuite extends QueryTest with SQLTestUtils with TestHiveSingleton {
     }
   }
 
+  test("describe partition") {
+    withTable("partitioned_table", "datasource_table") {
+      sql("CREATE TABLE partitioned_table (a STRING, b INT) PARTITIONED BY (c STRING, d STRING)")
+      sql("ALTER TABLE partitioned_table ADD PARTITION (c='Us', d=1)")
+
+      checkKeywordsExist(sql("DESC partitioned_table PARTITION (c='Us', d=1)"),
+        "# Partition Information",
+        "# col_name")
+
+      checkKeywordsExist(sql("DESC EXTENDED partitioned_table PARTITION (c='Us', d=1)"),
+        "# Partition Information",
+        "# col_name",
+        "Detailed Partition Information CatalogPartition(",
+        "Partition Values: [Us, 1]",
+        "Storage(Location:",
+        "Partition Parameters")
+
+      checkKeywordsExist(sql("DESC FORMATTED partitioned_table PARTITION (c='Us', d=1)"),
+        "# Partition Information",
+        "# col_name",
+        "# Detailed Partition Information",
+        "Partition Value:",
+        "Database:",
+        "Table:",
+        "Location:",
+        "Partition Parameters:",
+        "# Storage Information")
+
+      val m = intercept[NoSuchPartitionException] {
+        sql("DESC partitioned_table PARTITION (c='Us', d=2)")
+      }.getMessage()
+      assert(m.contains("Partition not found in table"))
+
+      val m2 = intercept[AnalysisException] {
+        sql("DESC partitioned_table PARTITION (c='Us')")
+      }.getMessage()
+      assert(m2.contains("Partition spec is invalid"))
+
+      val m3 = intercept[ParseException] {
+        sql("DESC partitioned_table PARTITION (c='Us', d)")
+      }.getMessage()
+      assert(m3.contains("Unsupported SQL statement"))
+
+      spark
+        .range(1).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd).write
+        .partitionBy("d")
+        .saveAsTable("datasource_table")
+      val m4 = intercept[AnalysisException] {
+        sql("DESC datasource_table PARTITION (d=2)")
+      }.getMessage()
+      assert(m4.contains("DESC PARTITION is not allowed on a datasource table"))
+
+      val m5 = intercept[AnalysisException] {
+        spark.range(10).select('id as 'a, 'id as 'b).createTempView("view1")
+        sql("DESC view1 PARTITION (c='Us', d=1)")
+      }.getMessage()
+      assert(m5.contains("DESC PARTITION is not allowed on a temporary view"))
+
+      withView("permanent_view") {
+        val m = intercept[AnalysisException] {
+          sql("CREATE VIEW permanent_view AS SELECT * FROM partitioned_table")
+          sql("DESC permanent_view PARTITION (c='Us', d=1)")
+        }.getMessage()
+        assert(m.contains("DESC PARTITION is not allowed on a view"))
+      }
+    }
+  }
--- End diff --

No problem!





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80419524
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -420,17 +424,40 @@ case class DescribeTableCommand(table: TableIdentifier, isExtended: Boolean, isF
     val catalog = sparkSession.sessionState.catalog
 
    if (catalog.isTemporaryTable(table)) {
+      if (partitionSpec.nonEmpty) {
+        throw new AnalysisException(
+          s"DESC PARTITION is not allowed on a temporary view: ${table.identifier}")
+      }
       describeSchema(catalog.lookupRelation(table).schema, result)
     } else {
       val metadata = catalog.getTableMetadata(table)
       describeSchema(metadata.schema, result)
 
-      if (isExtended) {
-        describeExtended(metadata, result)
-      } else if (isFormatted) {
-        describeFormatted(metadata, result)
+      describePartitionInfo(metadata, result)
+
+      if (partitionSpec.isEmpty) {
+        if (isExtended) {
+          describeExtendedTableInfo(metadata, result)
+        } else if (isFormatted) {
+          describeFormattedTableInfo(metadata, result)
+          describeStorageInfo(metadata, result)
+        }
       } else {
-        describePartitionInfo(metadata, result)
+        if (metadata.tableType == CatalogTableType.VIEW) {
+          throw new AnalysisException(
+            s"DESC PARTITION is not allowed on a view: ${table.identifier}")
+        }
+        if (DDLUtils.isDatasourceTable(metadata)) {
+          throw new AnalysisException(
+            s"DESC PARTITION is not allowed on a datasource table: ${table.identifier}")
+        }
+        val partition = catalog.getPartition(table, partitionSpec)
+        if (isExtended) {
+          describeExtendedDetailPartitionInfo(table, metadata, partition, result)
+        } else if (isFormatted) {
+          describeFormattedDetailPartitionInfo(table, metadata, partition, result)
+          describeStorageInfo(metadata, result)
+        }
--- End diff --

Could you create another function for LOC 446 to LOC 460? Compared with the existing one, the new implementation looks a little bit cumbersome.
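
A minimal, self-contained sketch of the suggested extraction (hypothetical names and simplified stand-in types; `IllegalArgumentException` stands in for Spark's `AnalysisException`, whose constructor is not public):

```scala
import scala.collection.mutable.ArrayBuffer

object DescribeRefactorSketch {
  // Stand-in for CatalogTable: just the bits the checks need.
  case class Metadata(isView: Boolean, isDatasourceTable: Boolean, identifier: String)

  def run(
      metadata: Metadata,
      partitionSpec: Map[String, String],
      isExtended: Boolean,
      result: ArrayBuffer[String]): Unit = {
    if (partitionSpec.isEmpty) {
      if (isExtended) result += "extended table info"
    } else {
      // The cumbersome branch (LOC 446-460 above) collapses into one call.
      describeDetailedPartitionInfo(metadata, partitionSpec, isExtended, result)
    }
  }

  private def describeDetailedPartitionInfo(
      metadata: Metadata,
      partitionSpec: Map[String, String],
      isExtended: Boolean,
      result: ArrayBuffer[String]): Unit = {
    if (metadata.isView) {
      throw new IllegalArgumentException(
        s"DESC PARTITION is not allowed on a view: ${metadata.identifier}")
    }
    if (metadata.isDatasourceTable) {
      throw new IllegalArgumentException(
        s"DESC PARTITION is not allowed on a datasource table: ${metadata.identifier}")
    }
    result += s"detailed partition info for $partitionSpec (extended=$isExtended)"
  }
}
```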






[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80418906
  
--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/SQLQuerySuite.scala ---
@@ -341,6 +341,74 @@ class SQLQuerySuite extends QueryTest with SQLTestUtils with TestHiveSingleton {
     }
   }
 
+  test("describe partition") {
+    withTable("partitioned_table", "datasource_table") {
+      sql("CREATE TABLE partitioned_table (a STRING, b INT) PARTITIONED BY (c STRING, d STRING)")
+      sql("ALTER TABLE partitioned_table ADD PARTITION (c='Us', d=1)")
+
+      checkKeywordsExist(sql("DESC partitioned_table PARTITION (c='Us', d=1)"),
+        "# Partition Information",
+        "# col_name")
+
+      checkKeywordsExist(sql("DESC EXTENDED partitioned_table PARTITION (c='Us', d=1)"),
+        "# Partition Information",
+        "# col_name",
+        "Detailed Partition Information CatalogPartition(",
+        "Partition Values: [Us, 1]",
+        "Storage(Location:",
+        "Partition Parameters")
+
+      checkKeywordsExist(sql("DESC FORMATTED partitioned_table PARTITION (c='Us', d=1)"),
+        "# Partition Information",
+        "# col_name",
+        "# Detailed Partition Information",
+        "Partition Value:",
+        "Database:",
+        "Table:",
+        "Location:",
+        "Partition Parameters:",
+        "# Storage Information")
+
+      val m = intercept[NoSuchPartitionException] {
+        sql("DESC partitioned_table PARTITION (c='Us', d=2)")
+      }.getMessage()
+      assert(m.contains("Partition not found in table"))
+
+      val m2 = intercept[AnalysisException] {
+        sql("DESC partitioned_table PARTITION (c='Us')")
+      }.getMessage()
+      assert(m2.contains("Partition spec is invalid"))
+
+      val m3 = intercept[ParseException] {
+        sql("DESC partitioned_table PARTITION (c='Us', d)")
+      }.getMessage()
+      assert(m3.contains("Unsupported SQL statement"))
+
+      spark
+        .range(1).select('id as 'a, 'id as 'b, 'id as 'c, 'id as 'd).write
+        .partitionBy("d")
+        .saveAsTable("datasource_table")
+      val m4 = intercept[AnalysisException] {
+        sql("DESC datasource_table PARTITION (d=2)")
+      }.getMessage()
+      assert(m4.contains("DESC PARTITION is not allowed on a datasource table"))
+
+      val m5 = intercept[AnalysisException] {
+        spark.range(10).select('id as 'a, 'id as 'b).createTempView("view1")
+        sql("DESC view1 PARTITION (c='Us', d=1)")
+      }.getMessage()
+      assert(m5.contains("DESC PARTITION is not allowed on a temporary view"))
+
+      withView("permanent_view") {
+        val m = intercept[AnalysisException] {
+          sql("CREATE VIEW permanent_view AS SELECT * FROM partitioned_table")
+          sql("DESC permanent_view PARTITION (c='Us', d=1)")
+        }.getMessage()
+        assert(m.contains("DESC PARTITION is not allowed on a view"))
+      }
+    }
+  }
--- End diff --

Could you split the test case into two? One covering the positive cases and another covering the negative cases. We normally do not like large test cases.
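
A condensed sketch of the suggested split, reusing the statements from the diff above (the remaining keyword checks and negative cases would move into the matching test):

```scala
// Positive cases: DESC / DESC EXTENDED / DESC FORMATTED on a valid partition.
test("describe partition") {
  withTable("partitioned_table") {
    sql("CREATE TABLE partitioned_table (a STRING, b INT) PARTITIONED BY (c STRING, d STRING)")
    sql("ALTER TABLE partitioned_table ADD PARTITION (c='Us', d=1)")
    checkKeywordsExist(sql("DESC partitioned_table PARTITION (c='Us', d=1)"),
      "# Partition Information",
      "# col_name")
    // ... DESC EXTENDED and DESC FORMATTED keyword checks as above ...
  }
}

// Negative cases: missing partitions, invalid specs, views, datasource tables.
test("describe partition - error handling") {
  withTable("partitioned_table") {
    sql("CREATE TABLE partitioned_table (a STRING, b INT) PARTITIONED BY (c STRING, d STRING)")
    sql("ALTER TABLE partitioned_table ADD PARTITION (c='Us', d=1)")
    val m = intercept[NoSuchPartitionException] {
      sql("DESC partitioned_table PARTITION (c='Us', d=2)")
    }.getMessage()
    assert(m.contains("Partition not found in table"))
    // ... remaining intercept checks (invalid spec, views, datasource table) as above ...
  }
}
```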





[GitHub] spark pull request #15231: [SPARK-17658][SPARKR] read.df/write.df API taking...

2016-09-26 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/15231#discussion_r80417904
  
--- Diff: R/pkg/R/utils.R ---
@@ -698,6 +698,21 @@ isSparkRShell <- function() {
   grepl(".*shell\\.R$", Sys.getenv("R_PROFILE_USER"), perl = TRUE)
 }
 
+captureJVMException <- function(e) {
+  stacktrace <- as.character(e)
+  if (any(grep("java.lang.IllegalArgumentException: ", stacktrace))) {
--- End diff --

Are there cases where the IllegalArgumentException should be checked on the R side first, to avoid the exception in the first place?





[GitHub] spark pull request #15231: [SPARK-17658][SPARKR] read.df/write.df API taking...

2016-09-26 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/15231#discussion_r80417016
  
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2624,6 +2624,15 @@ setMethod("except",
 setMethod("write.df",
   signature(df = "SparkDataFrame"),
   function(df, path = NULL, source = NULL, mode = "error", ...) {
+if (!is.character(path) && !is.null(path)) {
+  stop("path should be charactor, null or omitted.")
--- End diff --

"character"?





[GitHub] spark pull request #15231: [SPARK-17658][SPARKR] read.df/write.df API taking...

2016-09-26 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/15231#discussion_r80418015
  
--- Diff: R/pkg/R/utils.R ---
@@ -698,6 +698,21 @@ isSparkRShell <- function() {
   grepl(".*shell\\.R$", Sys.getenv("R_PROFILE_USER"), perl = TRUE)
 }
 
+captureJVMException <- function(e) {
--- End diff --

It would be great to add some tests that trigger the tryCatch and this function.





[GitHub] spark pull request #15231: [SPARK-17658][SPARKR] read.df/write.df API taking...

2016-09-26 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/15231#discussion_r80417594
  
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2624,6 +2624,15 @@ setMethod("except",
 setMethod("write.df",
   signature(df = "SparkDataFrame"),
   function(df, path = NULL, source = NULL, mode = "error", ...) {
+if (!is.character(path) && !is.null(path)) {
+  stop("path should be charactor, null or omitted.")
--- End diff --

Minor point: is it more efficient to flip the checks, i.e. `if (!is.null(path) && !is.character(path))`, since `path` defaults to `NULL`?





[GitHub] spark pull request #15231: [SPARK-17658][SPARKR] read.df/write.df API taking...

2016-09-26 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/15231#discussion_r80417225
  
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2624,6 +2624,15 @@ setMethod("except",
 setMethod("write.df",
   signature(df = "SparkDataFrame"),
   function(df, path = NULL, source = NULL, mode = "error", ...) {
+if (!is.character(path) && !is.null(path)) {
+  stop("path should be charactor, null or omitted.")
+}
+if (!is.character(source) && !is.null(source)) {
+  stop("source should be charactor, null or omitted. It is 
'parquet' by default.")
--- End diff --

Strictly speaking, it's the `spark.sql.sources.default` property; `parquet` is only the fallback when that property is not set.
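
For reference, a small sketch of how the effective default could be checked from Scala (assumes a local SparkSession; not part of this PR):

```scala
import org.apache.spark.sql.SparkSession

object DefaultSourceSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("default-source").getOrCreate()
    // read.df/write.df fall back to this property; "parquet" applies only
    // when spark.sql.sources.default has not been set to something else.
    val defaultSource = spark.conf.get("spark.sql.sources.default", "parquet")
    println(s"Effective default data source: $defaultSource")
    spark.stop()
  }
}
```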





[GitHub] spark pull request #15231: [SPARK-17658][SPARKR] read.df/write.df API taking...

2016-09-26 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/15231#discussion_r80417020
  
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2624,6 +2624,15 @@ setMethod("except",
 setMethod("write.df",
   signature(df = "SparkDataFrame"),
   function(df, path = NULL, source = NULL, mode = "error", ...) {
+if (!is.character(path) && !is.null(path)) {
+  stop("path should be charactor, null or omitted.")
--- End diff --

same below





[GitHub] spark pull request #15107: [SPARK-17551][SQL] complete the NULL ordering sup...

2016-09-26 Thread xwu0226
Github user xwu0226 closed the pull request at:

https://github.com/apache/spark/pull/15107





[GitHub] spark issue #15230: [SPARK-17657] [SQL] Disallow Users to Change Table Type

2016-09-26 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/15230
  
**[Test build #65900 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65900/consoleFull)** for PR 15230 at commit [`93cd988`](https://github.com/apache/spark/commit/93cd988a5a4ce492f9a9c8028701e986c80aaf59).





[GitHub] spark issue #15232: [SPARK-17499][SPARKR][FOLLOWUP] Check null first for lay...

2016-09-26 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue:

https://github.com/apache/spark/pull/15232
  
Hm. Actually, I was wondering whether it really works, and I am still looking here and there.

My local `testthat` version is `1.0.2`, but it does not seem to work, as shown below:

```r
> packageVersion("testthat")
[1] ‘1.0.2’
> library(testthat)
> test_that('warning bug', expect_error({options(warn = 2); warning('toto')}) )
Error: Test failed: 'warning bug'
* {
...
} did not throw an error.
```

If it is really fixed, I think this should not fail. Hm, am I missing something?





[GitHub] spark issue #15232: [SPARK-17499][SPARKR][FOLLOWUP] Check null first for lay...

2016-09-26 Thread felixcheung
Github user felixcheung commented on the issue:

https://github.com/apache/spark/pull/15232
  
According to this fix 
https://github.com/hadley/testthat/commit/64036395cf0f2556cdd181dddb219e498f4be370
this is fixed in testthat v1.0.2? I recall we are running 1.1.0 in Jenkins?





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/15168
  
**[Test build #65899 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65899/consoleFull)** for PR 15168 at commit [`d171bf6`](https://github.com/apache/spark/commit/d171bf64beb3c91add03454400c24570fc714b51).





[GitHub] spark issue #15211: [SPARK-14709][ML] [WIP] spark.ml API for linear SVM

2016-09-26 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/15211
  
**[Test build #65898 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65898/consoleFull)** for PR 15211 at commit [`73b8011`](https://github.com/apache/spark/commit/73b8011ff9edef4fc96fdf68951b990f2fa340f6).





[GitHub] spark issue #13599: [SPARK-13587] [PYSPARK] Support virtualenv in pyspark

2016-09-26 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/13599
  
**[Test build #65897 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65897/consoleFull)** for PR 13599 at commit [`7db55fa`](https://github.com/apache/spark/commit/7db55faa70ebbd157e9ef0ee678bd972c2b5c32a).





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on the issue:

https://github.com/apache/spark/pull/15168
  
I see. I'll prevent that for persistent views, too.





[GitHub] spark issue #11119: [SPARK-10780][ML] Add an initial model to kmeans

2016-09-26 Thread dbtsai
Github user dbtsai commented on the issue:

https://github.com/apache/spark/pull/11119
  
Ping @yinxusen for an update. We would like to have it merged soon so we can work on the LiR and LoR parts. Thanks.





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on the issue:

https://github.com/apache/spark/pull/15168
  
I think we do not need an extra metastore call if we already know we do not support partitions over views. You can check all the other DDL implementations; we did the same thing there.
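
A minimal sketch of that fail-fast ordering (hypothetical helper; `IllegalArgumentException` stands in for Spark's `AnalysisException`):

```scala
// Reject the unsupported combination using state we already have,
// before paying for a metastore round trip such as catalog.getPartition.
def checkDescPartitionOnView(isView: Boolean, partitionSpecNonEmpty: Boolean, identifier: String): Unit = {
  if (isView && partitionSpecNonEmpty) {
    throw new IllegalArgumentException(
      s"DESC PARTITION is not allowed on a view: $identifier")
  }
}
```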





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80413543
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -499,6 +516,35 @@ case class DescribeTableCommand(table: TableIdentifier, isExtended: Boolean, isF
     }
   }
 
+  private def describeExtendedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "Detailed Partition Information " + partition.toString, "", "")
+  }
+
+  private def describeFormattedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "# Detailed Partition Information", "", "")
+    append(buffer, "Partition Value:", s"[${partition.spec.values.mkString(", ")}]", "")
+    append(buffer, "Database:", table.database, "")
+    append(buffer, "Table:", tableIdentifier.table, "")
+    append(buffer, "Create Time:", "UNKNOWN", "")
+    append(buffer, "Last Access Time:", "UNKNOWN", "")
+    append(buffer, "Protect Mode:", "None", "")
--- End diff --

Yep. I agree! Let's save the bits.





[GitHub] spark issue #15148: [SPARK-5992][ML] Locality Sensitive Hashing

2016-09-26 Thread Yunni
Github user Yunni commented on the issue:

https://github.com/apache/spark/pull/15148
  
Thanks, @karlhigley! All of your comments are very helpful. I made some changes to make it work. :)





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80413281
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -499,6 +516,35 @@ case class DescribeTableCommand(table: TableIdentifier, isExtended: Boolean, isF
     }
   }
 
+  private def describeExtendedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "Detailed Partition Information " + partition.toString, "", "")
+  }
+
+  private def describeFormattedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "# Detailed Partition Information", "", "")
+    append(buffer, "Partition Value:", s"[${partition.spec.values.mkString(", ")}]", "")
+    append(buffer, "Database:", table.database, "")
+    append(buffer, "Table:", tableIdentifier.table, "")
+    append(buffer, "Create Time:", "UNKNOWN", "")
+    append(buffer, "Last Access Time:", "UNKNOWN", "")
+    append(buffer, "Protect Mode:", "None", "")
--- End diff --

IMO, if it does not offer any useful info to external users, we should remove it and keep the output concise.





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80412806
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -499,6 +516,35 @@ case class DescribeTableCommand(table: TableIdentifier, isExtended: Boolean, isF
     }
   }
 
+  private def describeExtendedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "Detailed Partition Information " + partition.toString, "", "")
+  }
+
+  private def describeFormattedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "# Detailed Partition Information", "", "")
+    append(buffer, "Partition Value:", s"[${partition.spec.values.mkString(", ")}]", "")
+    append(buffer, "Database:", table.database, "")
+    append(buffer, "Table:", tableIdentifier.table, "")
+    append(buffer, "Create Time:", "UNKNOWN", "")
+    append(buffer, "Last Access Time:", "UNKNOWN", "")
+    append(buffer, "Protect Mode:", "None", "")
--- End diff --

Yep, right. But it's a placeholder to keep the output similar to the existing behavior. Of course, the column names are changed to be consistent with Spark 2.0, e.g. `CreateTime` -> `Create Time`.
Should we remove them entirely?





[GitHub] spark pull request #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTIT...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/15168#discussion_r80412638
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala ---
@@ -499,6 +516,35 @@ case class DescribeTableCommand(table: TableIdentifier, isExtended: Boolean, isF
     }
   }
 
+  private def describeExtendedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "Detailed Partition Information " + partition.toString, "", "")
+  }
+
+  private def describeFormattedDetailPartitionInfo(
+      tableIdentifier: TableIdentifier,
+      table: CatalogTable,
+      partition: CatalogTablePartition,
+      buffer: ArrayBuffer[Row]): Unit = {
+    append(buffer, "", "", "")
+    append(buffer, "# Detailed Partition Information", "", "")
+    append(buffer, "Partition Value:", s"[${partition.spec.values.mkString(", ")}]", "")
+    append(buffer, "Database:", table.database, "")
+    append(buffer, "Table:", tableIdentifier.table, "")
+    append(buffer, "Create Time:", "UNKNOWN", "")
+    append(buffer, "Last Access Time:", "UNKNOWN", "")
+    append(buffer, "Protect Mode:", "None", "")
--- End diff --

We do not need to output these fields if their values are always `UNKNOWN` or `None`.
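
A self-contained sketch of the trimmed output (hypothetical standalone form of the helper; the always-`UNKNOWN`/`None` rows are simply dropped):

```scala
import scala.collection.mutable.ArrayBuffer

object TrimmedPartitionInfoSketch {
  // Mirrors the three-column append helper used in tables.scala.
  private def append(buffer: ArrayBuffer[(String, String, String)],
                     column: String, value: String, comment: String): Unit = {
    buffer += ((column, value, comment))
  }

  def describeFormatted(
      database: String,
      table: String,
      partitionValues: Seq[String],
      buffer: ArrayBuffer[(String, String, String)]): Unit = {
    append(buffer, "", "", "")
    append(buffer, "# Detailed Partition Information", "", "")
    append(buffer, "Partition Value:", s"[${partitionValues.mkString(", ")}]", "")
    append(buffer, "Database:", database, "")
    append(buffer, "Table:", table, "")
    // Create Time / Last Access Time / Protect Mode rows dropped: they never
    // carried real values in this code path.
  }
}
```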





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on the issue:

https://github.com/apache/spark/pull/15168
  
For example, like the following:
```
scala> sql("desc view1 partition (c='Us')").show
org.apache.spark.sql.AnalysisException: Partition spec is invalid. The spec (c) must match the partition spec () defined in table '`default`.`view1`';
```





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on the issue:

https://github.com/apache/spark/pull/15168
  
If we are talking about Hive views (non-partitioned persistent views), there is no partition info because a view looks like a normal table. So my PR handles them the same way as normal tables.





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on the issue:

https://github.com/apache/spark/pull/15168
  
We do not need to support partitioned views, I think. Your PR is unable to handle them without a code change in `HiveClientImpl`.

For a non-partitioned persistent view, you should also issue an exception, right?

Generally, I think we just need to issue an exception if the table is a view, no matter whether it is temporary or persistent.





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread dongjoon-hyun
Github user dongjoon-hyun commented on the issue:

https://github.com/apache/spark/pull/15168
  
Do you mean `PartitionedView`, as you mentioned in #15233?





[GitHub] spark issue #15168: [SPARK-17612][SQL] Support `DESCRIBE table PARTITION` SQ...

2016-09-26 Thread gatorsmile
Github user gatorsmile commented on the issue:

https://github.com/apache/spark/pull/15168
  
How about persistent views? I think we do not support them either, right?





[GitHub] spark issue #14897: [SPARK-17338][SQL] add global temp view

2016-09-26 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/14897
  
Merged build finished. Test FAILed.





[GitHub] spark issue #14897: [SPARK-17338][SQL] add global temp view

2016-09-26 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/14897
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/65896/
Test FAILed.





[GitHub] spark issue #14897: [SPARK-17338][SQL] add global temp view

2016-09-26 Thread SparkQA
Github user SparkQA commented on the issue:

https://github.com/apache/spark/pull/14897
  
**[Test build #65896 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/65896/consoleFull)** for PR 14897 at commit [`67e459a`](https://github.com/apache/spark/commit/67e459a48c82ef2b13ffedbc23f3921db0721204).
 * This patch **fails MiMa tests**.
 * This patch merges cleanly.
 * This patch adds the following public classes _(experimental)_:
   * `class GlobalTempViewManager(val database: String) `
   * `case class CreateTempViewUsing(`





[GitHub] spark pull request #15224: [SPARK-17650] malformed url's throw exceptions be...

2016-09-26 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/15224




