[spark] branch branch-3.0 updated (2831c62 -> 190c57b)

2021-05-08 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 2831c62  Revert "[SPARK-35321][SQL][3.0] Don't register Hive permanent functions when creating Hive client"
 add 190c57b  [SPARK-34795][SPARK-35192][SPARK-35293][SPARK-35327][SQL][TESTS][3.0] Adds a new job in GitHub Actions to check the output of TPC-DS queries

No new revisions were added by this update.

Summary of changes:
 .github/workflows/build_and_test.yml               |   64 +
 .../resources/tpcds-query-results/v1_4/q1.sql.out  |  105 +
 .../resources/tpcds-query-results/v1_4/q10.sql.out |   11 +
 .../resources/tpcds-query-results/v1_4/q11.sql.out |   99 +
 .../resources/tpcds-query-results/v1_4/q12.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q13.sql.out |    6 +
 .../tpcds-query-results/v1_4/q14a.sql.out          |  105 +
 .../tpcds-query-results/v1_4/q14b.sql.out          |  105 +
 .../resources/tpcds-query-results/v1_4/q15.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q16.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q17.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q18.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q19.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q2.sql.out  | 2518 +++
 .../resources/tpcds-query-results/v1_4/q20.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q21.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q22.sql.out |  105 +
 .../tpcds-query-results/v1_4/q23a.sql.out          |    6 +
 .../tpcds-query-results/v1_4/q23b.sql.out          |    6 +
 .../tpcds-query-results/v1_4/q24a.sql.out          |    6 +
 .../tpcds-query-results/v1_4/q24b.sql.out          |    6 +
 .../resources/tpcds-query-results/v1_4/q25.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q26.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q27.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q28.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q29.sql.out |    7 +
 .../resources/tpcds-query-results/v1_4/q3.sql.out  |   88 +
 .../resources/tpcds-query-results/v1_4/q30.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q31.sql.out |   66 +
 .../resources/tpcds-query-results/v1_4/q32.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q33.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q34.sql.out |  223 ++
 .../resources/tpcds-query-results/v1_4/q35.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q36.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q37.sql.out |    7 +
 .../resources/tpcds-query-results/v1_4/q38.sql.out |    6 +
 .../tpcds-query-results/v1_4/q39a.sql.out          |  214 ++
 .../tpcds-query-results/v1_4/q39b.sql.out          |   15 +
 .../resources/tpcds-query-results/v1_4/q4.sql.out  |    7 +
 .../resources/tpcds-query-results/v1_4/q40.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q41.sql.out |   10 +
 .../resources/tpcds-query-results/v1_4/q42.sql.out |   16 +
 .../resources/tpcds-query-results/v1_4/q43.sql.out |   11 +
 .../resources/tpcds-query-results/v1_4/q44.sql.out |   15 +
 .../resources/tpcds-query-results/v1_4/q45.sql.out |   25 +
 .../resources/tpcds-query-results/v1_4/q46.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q47.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q48.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q49.sql.out |   37 +
 .../resources/tpcds-query-results/v1_4/q5.sql.out  |  105 +
 .../resources/tpcds-query-results/v1_4/q50.sql.out |   11 +
 .../resources/tpcds-query-results/v1_4/q51.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q52.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q53.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q54.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q55.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q56.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q57.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q58.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q59.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q60.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q61.sql.out |    6 +
 .../resources/tpcds-query-results/v1_4/q62.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q63.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q64.sql.out |   14 +
 .../resources/tpcds-query-results/v1_4/q65.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q66.sql.out |   10 +
 .../resources/tpcds-query-results/v1_4/q67.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q68.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q69.sql.out |  105 +
 .../resources/tpcds-query-results/v1_4/q7.sql.out  |  105 +
 .../resources/tpcds-query-results/v1_4/q70.sql.out |    8 +
 .../resources/tpcds-query-results/v1_4/q71.sql.out | 1195 

[spark] branch master updated: [SPARK-35347][SQL] Use MethodUtils for looking up methods in Invoke and StaticInvoke

2021-05-08 Thread dongjoon

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 5b65d8a  [SPARK-35347][SQL] Use MethodUtils for looking up methods in Invoke and StaticInvoke
5b65d8a is described below

commit 5b65d8a129a63ac5c9ad482842901a1a0d1420ad
Author: Liang-Chi Hsieh 
AuthorDate: Sat May 8 15:17:30 2021 -0700

[SPARK-35347][SQL] Use MethodUtils for looking up methods in Invoke and 
StaticInvoke

### What changes were proposed in this pull request?

This patch proposes to use `MethodUtils` for looking up methods in `Invoke` and `StaticInvoke` expressions.

### Why are the changes needed?

Currently, `Invoke` and `StaticInvoke` expressions implement their own logic for looking up methods. It is tricky to cover all the cases ourselves, and there is already an existing utility package for this purpose. We should reuse that utility package.

### Does this PR introduce _any_ user-facing change?

No, internal change only.

### How was this patch tested?

Existing tests.

Closes #32474 from viirya/invoke-util.

Authored-by: Liang-Chi Hsieh 
Signed-off-by: Dongjoon Hyun 
---
 .../sql/catalyst/expressions/objects/objects.scala | 31 +-
 1 file changed, 7 insertions(+), 24 deletions(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala
index 5d79774..a967dc4 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala
@@ -24,6 +24,8 @@ import scala.collection.mutable.{Builder, WrappedArray}
 import scala.reflect.ClassTag
 import scala.util.{Properties, Try}
 
+import org.apache.commons.lang3.reflect.MethodUtils
+
 import org.apache.spark.{SparkConf, SparkEnv}
 import org.apache.spark.serializer._
 import org.apache.spark.sql.Row
@@ -147,30 +149,11 @@ trait InvokeLike extends Expression with NonSQLExpression {
   }
 
   final def findMethod(cls: Class[_], functionName: String, argClasses: Seq[Class[_]]): Method = {
-    // Looking with function name + argument classes first.
-    try {
-      cls.getMethod(functionName, argClasses: _*)
-    } catch {
-      case _: NoSuchMethodException =>
-        // For some cases, e.g. arg class is Object, `getMethod` cannot find the method.
-        // We look at function name + argument length
-        val m = cls.getMethods.filter { m =>
-          m.getName == functionName && m.getParameterCount == arguments.length
-        }
-        if (m.isEmpty) {
-          sys.error(s"Couldn't find $functionName on $cls")
-        } else if (m.length > 1) {
-          // More than one matched method signature. Exclude synthetic one, e.g. generic one.
-          val realMethods = m.filter(!_.isSynthetic)
-          if (realMethods.length > 1) {
-            // Ambiguous case, we don't know which method to choose, just fail it.
-            sys.error(s"Found ${realMethods.length} $functionName on $cls")
-          } else {
-            realMethods.head
-          }
-        } else {
-          m.head
-        }
+    val method = MethodUtils.getMatchingAccessibleMethod(cls, functionName, argClasses: _*)
+    if (method == null) {
+      sys.error(s"Couldn't find $functionName on $cls")
+    } else {
+      method
     }
   }
 }
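The motivation behind the diff above is that `Class.getMethod` requires an exact parameter-type match, whereas commons-lang3's `MethodUtils.getMatchingAccessibleMethod` also accepts assignment-compatible types. A JDK-only sketch of that difference (the `Target` class and `findCompatible` helper below are illustrative, not Spark code, and the real MethodUtils additionally handles boxing, varargs, and ambiguity resolution):

```java
import java.lang.reflect.Method;

public class MethodLookupDemo {
    // Hypothetical target class, standing in for the classes that
    // Invoke/StaticInvoke reflect over.
    public static class Target {
        public String greet(CharSequence name) { return "hello " + name; }
    }

    // Assignability-based lookup: a minimal sketch of what
    // MethodUtils.getMatchingAccessibleMethod does.
    public static Method findCompatible(Class<?> cls, String name, Class<?>... argTypes) {
        for (Method m : cls.getMethods()) {
            if (!m.getName().equals(name) || m.getParameterCount() != argTypes.length) {
                continue;
            }
            Class<?>[] params = m.getParameterTypes();
            boolean ok = true;
            for (int i = 0; i < params.length; i++) {
                // Accept any argument type the declared parameter can hold.
                if (!params[i].isAssignableFrom(argTypes[i])) {
                    ok = false;
                    break;
                }
            }
            if (ok) {
                return m;
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        // Exact lookup fails: greet is declared with CharSequence, not String.
        try {
            Target.class.getMethod("greet", String.class);
            System.out.println("exact lookup succeeded");
        } catch (NoSuchMethodException e) {
            System.out.println("exact lookup failed");  // this branch runs
        }
        // Assignability-aware lookup succeeds, since CharSequence is
        // assignable from String.
        Method m = findCompatible(Target.class, "greet", String.class);
        System.out.println(m.invoke(new Target(), "spark"));  // hello spark
    }
}
```

Delegating this matching to a maintained utility is exactly the simplification the commit message argues for.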

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-3.0 updated: Revert "[SPARK-35321][SQL][3.0] Don't register Hive permanent functions when creating Hive client"

2021-05-08 Thread dongjoon

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new 2831c62  Revert "[SPARK-35321][SQL][3.0] Don't register Hive permanent functions when creating Hive client"
2831c62 is described below

commit 2831c621b5e472af6daf1d520439ae96989d62a9
Author: Dongjoon Hyun 
AuthorDate: Sat May 8 13:02:08 2021 -0700

Revert "[SPARK-35321][SQL][3.0] Don't register Hive permanent functions 
when creating Hive client"

This reverts commit 5268a38878921d47d51cc04ab1863aed250bf06e.
---
 .../org/apache/spark/sql/hive/client/HiveClientImpl.scala |  4 ++--
 .../scala/org/apache/spark/sql/hive/client/HiveShim.scala | 11 ---
 2 files changed, 2 insertions(+), 13 deletions(-)

diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
index 6b28066..f311836 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
@@ -258,7 +258,7 @@ private[hive] class HiveClientImpl(
     if (clientLoader.cachedHive != null) {
       clientLoader.cachedHive.asInstanceOf[Hive]
     } else {
-      val c = shim.getHive(conf)
+      val c = Hive.get(conf)
       clientLoader.cachedHive = c
       c
     }
@@ -286,7 +286,7 @@ private[hive] class HiveClientImpl(
     // Set the thread local metastore client to the client associated with this HiveClientImpl.
     Hive.set(client)
     // Replace conf in the thread local Hive with current conf
-    shim.getHive(conf)
+    Hive.get(conf)
     // setCurrentSessionState will use the classLoader associated
     // with the HiveConf in `state` to override the context class loader of the current
     // thread.
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
index 0714f3c..d11bf94 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
@@ -173,8 +173,6 @@ private[client] sealed abstract class Shim {
 
   def getMSC(hive: Hive): IMetaStoreClient
 
-  def getHive(hiveConf: HiveConf): Hive
-
   protected def findMethod(klass: Class[_], name: String, args: Class[_]*): Method = {
     klass.getMethod(name, args: _*)
   }
@@ -197,8 +195,6 @@ private[client] class Shim_v0_12 extends Shim with Logging {
     getMSCMethod.invoke(hive).asInstanceOf[IMetaStoreClient]
   }
 
-  override def getHive(hiveConf: HiveConf): Hive = Hive.get(hiveConf)
-
   private lazy val startMethod =
     findStaticMethod(
       classOf[SessionState],
@@ -1237,13 +1233,6 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
   override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
     alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
   }
-
-  // HIVE-10319 introduced a new HMS thrift API `get_all_functions` which is used by
-  // `Hive.get` since version 2.1.0, when it loads all Hive permanent functions during
-  // initialization. This breaks compatibility with HMS server of lower versions.
-  // To mitigate here we use `Hive.getWithFastCheck` instead which skips loading the permanent
-  // functions and therefore avoids calling `get_all_functions`.
-  override def getHive(hiveConf: HiveConf): Hive = Hive.getWithFastCheck(hiveConf, false)
 }
 
 private[client] class Shim_v2_2 extends Shim_v2_1
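The `findMethod` context line in the hunk above shows the shim pattern `HiveShim` relies on: resolve a method reflectively once, then invoke it later, so one Spark build can talk to several Hive versions. A small stand-alone Java sketch of that pattern (the `ShimDemo` class is illustrative, not Hive or Spark code):

```java
import java.lang.reflect.Method;

public class ShimDemo {
    // Mirrors the shim's findMethod helper: resolve a method reflectively,
    // turning the checked NoSuchMethodException into an unchecked error.
    public static Method findMethod(Class<?> klass, String name, Class<?>... args) {
        try {
            return klass.getMethod(name, args);
        } catch (NoSuchMethodException e) {
            throw new IllegalStateException(name + " not found on " + klass, e);
        }
    }

    public static void main(String[] args) throws Exception {
        // Resolve once (the shims cache these in lazy vals), invoke later.
        Method toUpper = findMethod(String.class, "toUpperCase");
        System.out.println(toUpper.invoke("spark"));  // SPARK
    }
}
```

Reverting the `getHive` override means this revert trades the HMS compatibility workaround away; the shim machinery for everything else stays in place.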




[spark] branch branch-3.1 updated: Revert "[SPARK-35321][SQL][3.1] Don't register Hive permanent functions when creating Hive client"

2021-05-08 Thread dongjoon

dongjoon pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.1 by this push:
 new 53a93fc  Revert "[SPARK-35321][SQL][3.1] Don't register Hive permanent functions when creating Hive client"
53a93fc is described below

commit 53a93fc04402f0ff7eb64371d869c8f1bb177c25
Author: Dongjoon Hyun 
AuthorDate: Sat May 8 13:01:45 2021 -0700

Revert "[SPARK-35321][SQL][3.1] Don't register Hive permanent functions 
when creating Hive client"

This reverts commit 6fbea6a38ddd0c95d54a71c850e0d901727ed842.
---
 .../org/apache/spark/sql/hive/client/HiveClientImpl.scala |  4 ++--
 .../scala/org/apache/spark/sql/hive/client/HiveShim.scala | 11 ---
 2 files changed, 2 insertions(+), 13 deletions(-)

diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
index e9ab3af..0d45af2 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
@@ -254,7 +254,7 @@ private[hive] class HiveClientImpl(
     if (clientLoader.cachedHive != null) {
       clientLoader.cachedHive.asInstanceOf[Hive]
     } else {
-      val c = shim.getHive(conf)
+      val c = Hive.get(conf)
       clientLoader.cachedHive = c
       c
     }
@@ -282,7 +282,7 @@ private[hive] class HiveClientImpl(
     // Set the thread local metastore client to the client associated with this HiveClientImpl.
     Hive.set(client)
     // Replace conf in the thread local Hive with current conf
-    shim.getHive(conf)
+    Hive.get(conf)
     // setCurrentSessionState will use the classLoader associated
     // with the HiveConf in `state` to override the context class loader of the current
     // thread.
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
index 8d0f3e8..db67480 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
@@ -177,8 +177,6 @@ private[client] sealed abstract class Shim {
 
   def getMSC(hive: Hive): IMetaStoreClient
 
-  def getHive(hiveConf: HiveConf): Hive
-
   protected def findMethod(klass: Class[_], name: String, args: Class[_]*): Method = {
     klass.getMethod(name, args: _*)
   }
@@ -201,8 +199,6 @@ private[client] class Shim_v0_12 extends Shim with Logging {
     getMSCMethod.invoke(hive).asInstanceOf[IMetaStoreClient]
   }
 
-  override def getHive(hiveConf: HiveConf): Hive = Hive.get(hiveConf)
-
   private lazy val startMethod =
     findStaticMethod(
       classOf[SessionState],
@@ -1293,13 +1289,6 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
   override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
     alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
   }
-
-  // HIVE-10319 introduced a new HMS thrift API `get_all_functions` which is used by
-  // `Hive.get` since version 2.1.0, when it loads all Hive permanent functions during
-  // initialization. This breaks compatibility with HMS server of lower versions.
-  // To mitigate here we use `Hive.getWithFastCheck` instead which skips loading the permanent
-  // functions and therefore avoids calling `get_all_functions`.
-  override def getHive(hiveConf: HiveConf): Hive = Hive.getWithFastCheck(hiveConf, false)
 }
 
 private[client] class Shim_v2_2 extends Shim_v2_1




[spark] branch master updated: Revert "[SPARK-35321][SQL] Don't register Hive permanent functions when creating Hive client"

2021-05-08 Thread dongjoon

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new e31bef1  Revert "[SPARK-35321][SQL] Don't register Hive permanent functions when creating Hive client"
e31bef1 is described below

commit e31bef1ed4744de9e83bac1887cbfaad7597d78f
Author: Dongjoon Hyun 
AuthorDate: Sat May 8 13:01:17 2021 -0700

Revert "[SPARK-35321][SQL] Don't register Hive permanent functions when 
creating Hive client"

This reverts commit b4ec9e230484db88c6220c27e43e3db11f3bdeef.
---
 .../org/apache/spark/sql/hive/client/HiveClientImpl.scala |  4 ++--
 .../scala/org/apache/spark/sql/hive/client/HiveShim.scala | 11 ---
 2 files changed, 2 insertions(+), 13 deletions(-)

diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
index 9bb3f45..e9728b8 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
@@ -273,7 +273,7 @@ private[hive] class HiveClientImpl(
     if (clientLoader.cachedHive != null) {
       clientLoader.cachedHive.asInstanceOf[Hive]
     } else {
-      val c = shim.getHive(conf)
+      val c = Hive.get(conf)
       clientLoader.cachedHive = c
       c
     }
@@ -303,7 +303,7 @@ private[hive] class HiveClientImpl(
     // with the side-effect of Hive.get(conf) to avoid using out-of-date HiveConf.
     // See discussion in https://github.com/apache/spark/pull/16826/files#r104606859
     // for more details.
-    shim.getHive(conf)
+    Hive.get(conf)
     // setCurrentSessionState will use the classLoader associated
     // with the HiveConf in `state` to override the context class loader of the current
     // thread.
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
index b0a877d..2f7fe96 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
@@ -177,8 +177,6 @@ private[client] sealed abstract class Shim {
 
   def getMSC(hive: Hive): IMetaStoreClient
 
-  def getHive(hiveConf: HiveConf): Hive
-
   protected def findMethod(klass: Class[_], name: String, args: Class[_]*): Method = {
     klass.getMethod(name, args: _*)
   }
@@ -201,8 +199,6 @@ private[client] class Shim_v0_12 extends Shim with Logging {
     getMSCMethod.invoke(hive).asInstanceOf[IMetaStoreClient]
   }
 
-  override def getHive(hiveConf: HiveConf): Hive = Hive.get(hiveConf)
-
   private lazy val startMethod =
     findStaticMethod(
       classOf[SessionState],
@@ -1320,13 +1316,6 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
   override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
     alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
   }
-
-  // HIVE-10319 introduced a new HMS thrift API `get_all_functions` which is used by
-  // `Hive.get` since version 2.1.0, when it loads all Hive permanent functions during
-  // initialization. This breaks compatibility with HMS server of lower versions.
-  // To mitigate here we use `Hive.getWithFastCheck` instead which skips loading the permanent
-  // functions and therefore avoids calling `get_all_functions`.
-  override def getHive(hiveConf: HiveConf): Hive = Hive.getWithFastCheck(hiveConf, false)
 }
 
 private[client] class Shim_v2_2 extends Shim_v2_1




svn commit: r47590 - in /dev/spark/v2.4.8-rc4-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2021-05-08 Thread viirya
Author: viirya
Date: Sat May  8 15:12:23 2021
New Revision: 47590

Log:
Apache Spark v2.4.8-rc4 docs


[This commit notification would consist of 1461 parts, 
which exceeds the limit of 50 ones, so it was shortened to the summary.]




svn commit: r47588 - /dev/spark/v2.4.8-rc4-bin/

2021-05-08 Thread viirya
Author: viirya
Date: Sat May  8 13:45:32 2021
New Revision: 47588

Log:
Apache Spark v2.4.8-rc4

Added:
dev/spark/v2.4.8-rc4-bin/
dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz   (with props)
dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.asc
dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.sha512
dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz   (with props)
dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.asc
dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.sha512
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-hadoop2.6.tgz   (with props)
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-hadoop2.6.tgz.asc
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-hadoop2.6.tgz.sha512
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-hadoop2.7.tgz   (with props)
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-hadoop2.7.tgz.asc
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-hadoop2.7.tgz.sha512
    dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-without-hadoop-scala-2.12.tgz   (with props)
    dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-without-hadoop-scala-2.12.tgz.asc
    dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-without-hadoop-scala-2.12.tgz.sha512
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-without-hadoop.tgz   (with props)
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-without-hadoop.tgz.asc
dev/spark/v2.4.8-rc4-bin/spark-2.4.8-bin-without-hadoop.tgz.sha512
dev/spark/v2.4.8-rc4-bin/spark-2.4.8.tgz   (with props)
dev/spark/v2.4.8-rc4-bin/spark-2.4.8.tgz.asc
dev/spark/v2.4.8-rc4-bin/spark-2.4.8.tgz.sha512

Added: dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.asc
==
--- dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.asc (added)
+++ dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.asc Sat May  8 13:45:32 2021
@@ -0,0 +1,14 @@
+-BEGIN PGP SIGNATURE-
+Version: GnuPG v1
+
+iQGcBAABAgAGBQJglm3LAAoJEGU8IwH+pJPudY0L/A1MwKT1tFU05GZb4h30qC+h
+lny0Bw2u4RbzemUW8u2Na+iEOWqa0tgl29S46BWr+welKh9wyy4cTBxU3KZWz7Dr
+H7RlSJ0mw2aAKYm2Yq0ZexCknkgWOD0eIgo6DqORybzC3iUtqtmzrqlGz5xyHp6a
+VPZEvZjXCxd1jv6k47eSEQXltApbY3FvPM2VUXYh8ZUiFXNkEapwwd2Uo8kT7m3f
+em22BDo8IwhI4p807ZnwGf69QmsgN22bzUJ2gY0Nvwour6nHxkewPzCGZynAyWYc
+0WcX68yXlUuBkg1qU3Lc2nH+vKFfxqkwIcPIn49/Ekc7nfBgzgUgSaVacvxAGysI
+a7yEA9y0BktYuyh6mUy6Lc1KyJyLU93VlOh0TZbRro/tJ+08PIkuIgHsCPT/wGfH
+ewE631xaW/r95rdPO8b4THPM9kPlVZWFYQO1yLxEoPF1m5KL1V8GOdHjeOUfVJT1
+jrhzu/j9ewa2SokthYsM0gXMp8WpQ6d/6Lgi6tVU8Q==
+=IG0g
+-END PGP SIGNATURE-

Added: dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.sha512
==
--- dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.sha512 (added)
+++ dev/spark/v2.4.8-rc4-bin/SparkR_2.4.8.tar.gz.sha512 Sat May  8 13:45:32 2021
@@ -0,0 +1,3 @@
+SparkR_2.4.8.tar.gz: 82FA1C4F F3775594 54AF8C65 4F42DFFC 31E96D7B 893DBF5E
+ 93FBDB04 F57E4CD9 5F057DBC BAA090E8 AD304C4D 87F087B5
+ D5271AD3 4E6B73C7 6F266E43 36CB6F77
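The `.sha512` files above use grouped uppercase hex (similar to what `gpg --print-md SHA512` emits) rather than the single-line `sha512sum` format. A small JDK sketch of producing that layout, so a downloaded artifact's digest can be compared against the published text (the `Sha512Grouped` class name is illustrative, not part of the release tooling):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class Sha512Grouped {
    // Computes SHA-512 and formats it as uppercase hex in 8-character groups,
    // i.e. one group per 4 digest bytes, matching the layout shown above.
    public static String sha512(byte[] data) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-512").digest(data);
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < digest.length; i++) {
            if (i > 0 && i % 4 == 0) {
                sb.append(' ');
            }
            sb.append(String.format("%02X", digest[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // To verify a release artifact, hash the downloaded file's bytes and
        // compare the result against the published .sha512 content.
        System.out.println(sha512("".getBytes(StandardCharsets.UTF_8)));
    }
}
```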

Added: dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.asc
==
--- dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.asc (added)
+++ dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.asc Sat May  8 13:45:32 2021
@@ -0,0 +1,14 @@
+-BEGIN PGP SIGNATURE-
+Version: GnuPG v1
+
+iQGcBAABAgAGBQJglliFAAoJEGU8IwH+pJPuF64MAJN18U8WU3SJJ6OKfqYGYEy5
+IJgzIrGgNkGEQe1l9IESa56Ftf/LMgwwJ88D2eTZDKKR5ctb8uINTNLDkitz9d5a
+hlswTNt2PikHPFPmLcvG/OYQIBoKNIwJeOXFbC+E+PUWUxS1RjgnoFCzGvU9ms2X
+a9mfcMY5QMO7Togp8sKhasoiwKOkhWl98ynZmpK4Bbza1SBJTGJnlWDZVbL5tyIa
+bHAw2/YQc/JOD7qlH98m24u4m+pQLREMIX1TyO8ajpbX2T7wlc9SDCfnPneCPd7p
+gHLXugH1IFXbMiC6qMPili0MLfqNVMfOdYO87gQ44frk1drN7krzl3rQ6u7nAmiI
+If3MlmsTLksdL5spmfqgvg8R7ouTfNpWoMHMvR4KRiYDiW7VpBkC6FpYL67F/w49
+13eZkSw9R5dNM1YsVe6e6vTbEuO9q6prj7EAmzUkHNFzdjAVxhy4QQt3UfIYgvVz
+OVSu86FKDkgCyDsHSOqD1rUV3427u9pyuDSEeKwLpQ==
+=08+o
+-END PGP SIGNATURE-

Added: dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.sha512
==
--- dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.sha512 (added)
+++ dev/spark/v2.4.8-rc4-bin/pyspark-2.4.8.tar.gz.sha512 Sat May  8 13:45:32 
2021
@@ -0,0 +1,3 @@
+pyspark-2.4.8.tar.gz: 537274DB DAB1CAE7 F04F4FC5 

[spark] branch master updated (b025780 -> 06c4009)

2021-05-08 Thread yamamuro

yamamuro pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from b025780  [SPARK-35331][SQL] Support resolving missing attrs for distribute/cluster by/repartition hint
 add 06c4009  [SPARK-35327][SQL][TESTS] Filters out the TPC-DS queries that can cause flaky test results

No new revisions were added by this update.

Summary of changes:
 .../resources/tpcds-query-results/v1_4/q6.sql.out  |  51 --
 .../resources/tpcds-query-results/v1_4/q75.sql.out | 105 -
 .../scala/org/apache/spark/sql/TPCDSBase.scala |   2 +-
 .../org/apache/spark/sql/TPCDSQueryTestSuite.scala |   6 ++
 4 files changed, 7 insertions(+), 157 deletions(-)
 delete mode 100644 sql/core/src/test/resources/tpcds-query-results/v1_4/q6.sql.out
 delete mode 100644 sql/core/src/test/resources/tpcds-query-results/v1_4/q75.sql.out




[spark] branch master updated (323a6e8 -> b025780)

2021-05-08 Thread dongjoon

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 323a6e8  [SPARK-35232][SQL] Nested column pruning should retain column metadata
 add b025780  [SPARK-35331][SQL] Support resolving missing attrs for distribute/cluster by/repartition hint

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/sql/catalyst/analysis/Analyzer.scala|  9 +
 .../src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala  | 12 
 2 files changed, 21 insertions(+)
