[spark] branch branch-3.0 updated (baa2e6d -> 6e338c3)

2021-02-27 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git.


from baa2e6d  Revert "[SPARK-34543][SQL] Respect the `spark.sql.caseSensitive` config while resolving partition spec in v1 `SET LOCATION`"
 add 6e338c3  [SPARK-34543][SQL][3.0] Respect the `spark.sql.caseSensitive` config while resolving partition spec in v1 `SET LOCATION`

No new revisions were added by this update.

Summary of changes:
 .../scala/org/apache/spark/sql/execution/command/ddl.scala    |  7 ++++++-
 .../org/apache/spark/sql/execution/command/DDLSuite.scala     | 11 +++++++++++
 2 files changed, 17 insertions(+), 1 deletion(-)
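
For context, here is a minimal sketch (runnable in spark-shell) of the behavior this backport targets; the table, column, and path names are made up for illustration. With the default spark.sql.caseSensitive=false, a partition spec written in a different case should resolve against the table's partition column in the v1 `SET LOCATION` command:

```
// Illustrative only: tbl, part, and the location path are hypothetical.
// With spark.sql.caseSensitive=false (the default), `PART` in the spec should match
// the `part` partition column once spec resolution honors the config.
spark.sql("CREATE TABLE tbl (id INT) USING parquet PARTITIONED BY (part INT)")
spark.sql("ALTER TABLE tbl ADD PARTITION (part = 1)")
spark.sql("ALTER TABLE tbl PARTITION (PART = 1) SET LOCATION '/tmp/tbl/part1'")
```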


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (397b843 -> 54c053a)

2021-02-27 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 397b843  [SPARK-34415][ML] Randomization in hyperparameter optimization
 add 54c053a  [SPARK-34479][SQL] Add zstandard codec to Avro compression codec list

No new revisions were added by this update.

Summary of changes:
 .../avro/src/main/scala/org/apache/spark/sql/avro/AvroUtils.scala    | 4 ++--
 .../avro/src/test/scala/org/apache/spark/sql/avro/AvroSuite.scala    | 5 +++++
 .../src/main/scala/org/apache/spark/sql/internal/SQLConf.scala       | 4 ++--
 3 files changed, 9 insertions(+), 4 deletions(-)
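
As a hedged usage sketch (not taken from the patch), the new value plugs into the existing spark.sql.avro.compression.codec setting; this assumes the external spark-avro module is on the classpath and uses a hypothetical output path:

```
// Runnable in spark-shell with the spark-avro package loaded.
// "zstandard" joins the values accepted by spark.sql.avro.compression.codec.
spark.conf.set("spark.sql.avro.compression.codec", "zstandard")

Seq((1, "a"), (2, "b")).toDF("id", "value")
  .write
  .format("avro")
  .save("/tmp/avro-zstd-example")  // hypothetical path
```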





[spark] branch master updated (1aeafb4 -> 397b843)

2021-02-27 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 1aeafb4  [SPARK-34559][BUILD] Upgrade to ZSTD JNI 1.4.8-6
 add 397b843  [SPARK-34415][ML] Randomization in hyperparameter optimization

No new revisions were added by this update.

Summary of changes:
 docs/ml-tuning.md  |  36 -
 ...elSelectionViaRandomHyperparametersExample.java |  83 ++
 ...lSelectionViaRandomHyperparametersExample.scala |  79 ++
 .../spark/ml/tuning/ParamRandomBuilder.scala   | 160 
 .../spark/ml/tuning/ParamRandomBuilderSuite.scala  | 123 +++
 .../apache/spark/ml/tuning/RandomRangesSuite.scala | 168 +
 python/docs/source/reference/pyspark.ml.rst|   1 +
 python/pyspark/ml/tests/test_tuning.py | 106 -
 python/pyspark/ml/tuning.py|  48 +-
 python/pyspark/ml/tuning.pyi   |   5 +
 10 files changed, 806 insertions(+), 3 deletions(-)
 create mode 100644 examples/src/main/java/org/apache/spark/examples/ml/JavaModelSelectionViaRandomHyperparametersExample.java
 create mode 100644 examples/src/main/scala/org/apache/spark/examples/ml/ModelSelectionViaRandomHyperparametersExample.scala
 create mode 100644 mllib/src/main/scala/org/apache/spark/ml/tuning/ParamRandomBuilder.scala
 create mode 100644 mllib/src/test/scala/org/apache/spark/ml/tuning/ParamRandomBuilderSuite.scala
 create mode 100644 mllib/src/test/scala/org/apache/spark/ml/tuning/RandomRangesSuite.scala
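
The gist of the change: instead of exhaustively enumerating a parameter grid, hyperparameter values are drawn at random from user-specified ranges. The sketch below hand-rolls that idea with the long-standing ParamGridBuilder API; the new ParamRandomBuilder added by this commit packages it up, and its exact API is in the files listed above.

```
// A rough, illustrative equivalent of random search using existing APIs only.
import scala.util.Random

import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.tuning.{CrossValidator, ParamGridBuilder}

val lr = new LogisticRegression()

// Draw 5 regParam values log-uniformly from [1e-4, 1e-1].
val rng = new Random(42)
val regParams = Array.fill(5)(math.pow(10, -4 + 3 * rng.nextDouble()))

val paramMaps = new ParamGridBuilder()
  .addGrid(lr.regParam, regParams)
  .addGrid(lr.elasticNetParam, Array(0.0, 0.5, 1.0))
  .build()

val cv = new CrossValidator()
  .setEstimator(lr)
  .setEvaluator(new BinaryClassificationEvaluator())
  .setEstimatorParamMaps(paramMaps)
  .setNumFolds(3)
// cv.fit(trainingDF) would then select the best sampled setting (trainingDF not shown here).
```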





[spark] branch branch-3.1 updated: [SPARK-34392][SQL] Support ZoneOffset +h:mm in DateTimeUtils.getZoneId

2021-02-27 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.1 by this push:
 new daeae50  [SPARK-34392][SQL] Support ZoneOffset +h:mm in DateTimeUtils.getZoneId
daeae50 is described below

commit daeae5095a6202bfc7afa19cafde6c4b86a3613c
Author: ShiKai Wang 
AuthorDate: Fri Feb 26 11:03:20 2021 -0600

[SPARK-34392][SQL] Support ZoneOffset +h:mm in DateTimeUtils.getZoneId

### What changes were proposed in this pull request?
Support the +8:00 zone offset format in Spark 3 when executing SQL such as
`select to_utc_timestamp("2020-02-07 16:00:00", "GMT+8:00")`.

### Why are the changes needed?
The +8:00 format is supported in PostgreSQL, Hive, and Presto, but not in Spark 3.
https://issues.apache.org/jira/browse/SPARK-34392

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Unit test.

Closes #31624 from Karl-WangSK/zone.

Lead-authored-by: ShiKai Wang 
Co-authored-by: Karl-WangSK 
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/sql/catalyst/util/DateTimeUtils.scala  |  5 ++++-
 .../apache/spark/sql/catalyst/util/DateTimeUtilsSuite.scala | 13 +++++++++++++
 .../scala/org/apache/spark/sql/internal/SQLConfSuite.scala  |  5 ++---
 3 files changed, 19 insertions(+), 4 deletions(-)
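
A hedged sketch of what this enables, runnable in spark-shell on a build that includes the fix; the replaceFirst call mirrors the normalization added to getZoneId in the diff below:

```
// The added normalization pads a one-digit offset hour: "+8:" -> "+08:".
val normalized = "GMT+8:00".replaceFirst("(\\+|\\-)(\\d):", "$10$2:")
// normalized == "GMT+08:00", which java.time.ZoneId.of accepts.
java.time.ZoneId.of(normalized, java.time.ZoneId.SHORT_IDS)

// The query from the PR description should now run and shift the wall clock from GMT+8 to UTC:
spark.sql("""SELECT to_utc_timestamp("2020-02-07 16:00:00", "GMT+8:00")""").show(false)
// Expected: 2020-02-07 08:00:00, matching "GMT+08:00".
```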

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
index 87cf3c9..89cb67c 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
@@ -50,7 +50,10 @@ object DateTimeUtils {
 
   val TIMEZONE_OPTION = "timeZone"
 
-  def getZoneId(timeZoneId: String): ZoneId = ZoneId.of(timeZoneId, ZoneId.SHORT_IDS)
+  def getZoneId(timeZoneId: String): ZoneId = {
+    // To support the (+|-)h:mm format because it was supported before Spark 3.0.
+    ZoneId.of(timeZoneId.replaceFirst("(\\+|\\-)(\\d):", "$10$2:"), ZoneId.SHORT_IDS)
+  }
   def getTimeZone(timeZoneId: String): TimeZone = TimeZone.getTimeZone(getZoneId(timeZoneId))
 
   /**
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeUtilsSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeUtilsSuite.scala
index 3d841f3..fb2d511 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeUtilsSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeUtilsSuite.scala
@@ -471,6 +471,13 @@ class DateTimeUtilsSuite extends SparkFunSuite with Matchers with SQLHelper {
     test("2011-12-25 09:00:00.123456", JST.getId, "2011-12-25 18:00:00.123456")
     test("2011-12-25 09:00:00.123456", LA.getId, "2011-12-25 01:00:00.123456")
     test("2011-12-25 09:00:00.123456", "Asia/Shanghai", "2011-12-25 17:00:00.123456")
+    test("2011-12-25 09:00:00.123456", "-7", "2011-12-25 02:00:00.123456")
+    test("2011-12-25 09:00:00.123456", "+8:00", "2011-12-25 17:00:00.123456")
+    test("2011-12-25 09:00:00.123456", "+8:00:00", "2011-12-25 17:00:00.123456")
+    test("2011-12-25 09:00:00.123456", "+0800", "2011-12-25 17:00:00.123456")
+    test("2011-12-25 09:00:00.123456", "-071020", "2011-12-25 01:49:40.123456")
+    test("2011-12-25 09:00:00.123456", "-07:10:20", "2011-12-25 01:49:40.123456")
+
   }
 }
 
@@ -496,6 +503,12 @@ class DateTimeUtilsSuite extends SparkFunSuite with Matchers with SQLHelper {
     test("2011-12-25 18:00:00.123456", JST.getId, "2011-12-25 09:00:00.123456")
     test("2011-12-25 01:00:00.123456", LA.getId, "2011-12-25 09:00:00.123456")
     test("2011-12-25 17:00:00.123456", "Asia/Shanghai", "2011-12-25 09:00:00.123456")
+    test("2011-12-25 02:00:00.123456", "-7", "2011-12-25 09:00:00.123456")
+    test("2011-12-25 17:00:00.123456", "+8:00", "2011-12-25 09:00:00.123456")
+    test("2011-12-25 17:00:00.123456", "+8:00:00", "2011-12-25 09:00:00.123456")
+    test("2011-12-25 17:00:00.123456", "+0800", "2011-12-25 09:00:00.123456")
+    test("2011-12-25 01:49:40.123456", "-071020", "2011-12-25 09:00:00.123456")
+    test("2011-12-25 01:49:40.123456", "-07:10:20", "2011-12-25 09:00:00.123456")
   }
 }
 
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala
index 1ea2d4f..f5d1dc2 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala
@@ -414,13 +414,12 @@ class SQLConfSuite extends QueryTest with SharedSparkSession {

[GitHub] [spark-website] srowen closed pull request #318: Fix a sbt example for generating dependency graphs

2021-02-27 Thread GitBox


srowen closed pull request #318:
URL: https://github.com/apache/spark-website/pull/318


   



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org






[spark-website] branch asf-site updated: Fix a sbt example for generating dependency graphs

2021-02-27 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 0732fba  Fix a sbt example for generating dependency graphs
0732fba is described below

commit 0732fbafce4c92449c759a880d8b0885e243d5f3
Author: Takeshi Yamamuro 
AuthorDate: Sat Feb 27 07:23:49 2021 -0600

Fix a sbt example for generating dependency graphs

This PR intends to fix an error below:
```
$ ./build/sbt dependency-tree
[error] Not a valid command: dependency-tree
[error] Not a valid project ID: dependency-tree
[error] Expected ':'
[error] Not a valid key: dependency-tree (similar: dependencyTree, dependencyOverrides, sbtDependency)
[error] dependency-tree
[error]                ^
```

Author: Takeshi Yamamuro 

Closes #318 from maropu/DepTree.
---
 developer-tools.md| 2 +-
 site/developer-tools.html | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/developer-tools.md b/developer-tools.md
index bf13ed3..3b929ac 100644
--- a/developer-tools.md
+++ b/developer-tools.md
@@ -374,7 +374,7 @@ $ git checkout origin/pr/112 -b new-branch
 
 ```
 $ # sbt
-$ build/sbt dependency-tree
+$ build/sbt dependencyTree
  
 $ # Maven
 $ build/mvn -DskipTests install
diff --git a/site/developer-tools.html b/site/developer-tools.html
index 8b38f34..2f0d3bb 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -547,7 +547,7 @@ $ git checkout origin/pr/112 -b new-branch
 Generating Dependency Graphs
 
 $ # sbt
-$ build/sbt dependency-tree
+$ build/sbt dependencyTree
  
 $ # Maven
 $ build/mvn -DskipTests install





[GitHub] [spark-website] maropu commented on pull request #318: Fix a sbt example for generating dependency graphs

2021-02-27 Thread GitBox


maropu commented on pull request #318:
URL: https://github.com/apache/spark-website/pull/318#issuecomment-787069582


   cc: @srowen 









[GitHub] [spark-website] maropu opened a new pull request #318: Fix a sbt example for generating dependency graphs

2021-02-27 Thread GitBox


maropu opened a new pull request #318:
URL: https://github.com/apache/spark-website/pull/318


   This PR intends to fix an error below:
   ```
   $ ./build/sbt dependency-tree
   [error] Not a valid command: dependency-tree
   [error] Not a valid project ID: dependency-tree
   [error] Expected ':'
   [error] Not a valid key: dependency-tree (similar: dependencyTree, dependencyOverrides, sbtDependency)
   [error] dependency-tree
   [error]                ^
   ```
   









[spark] branch master updated (d758210 -> 1aeafb4)

2021-02-27 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from d758210  [SPARK-34557][BUILD] Exclude Avro's transitive zstd-jni dependency
 add 1aeafb4  [SPARK-34559][BUILD] Upgrade to ZSTD JNI 1.4.8-6

No new revisions were added by this update.

Summary of changes:
 dev/deps/spark-deps-hadoop-2.7-hive-2.3 | 2 +-
 dev/deps/spark-deps-hadoop-3.2-hive-2.3 | 2 +-
 pom.xml | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

