This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 8efe367  [SPARK-30756][SQL] Fix `ThriftServerWithSparkContextSuite` on spark-branch-3.0-test-sbt-hadoop-2.7-hive-2.3
8efe367 is described below

commit 8efe367a4ee862b8a85aee8881b0335b34cbba70
Author: HyukjinKwon <[email protected]>
AuthorDate: Tue Feb 11 15:50:03 2020 +0900

    [SPARK-30756][SQL] Fix `ThriftServerWithSparkContextSuite` on spark-branch-3.0-test-sbt-hadoop-2.7-hive-2.3
    
    ### What changes were proposed in this pull request?
    
    This PR tries the approach suggested in #26710 (comment) to fix the test.
    
    ### Why are the changes needed?
    
    To make the tests pass.
    
    ### Does this PR introduce any user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Jenkins will test it first, and then the `spark-branch-3.0-test-sbt-hadoop-2.7-hive-2.3` build will verify the fix.
    
    Closes #27513 from HyukjinKwon/test-SPARK-30756.
    
    Authored-by: HyukjinKwon <[email protected]>
    Signed-off-by: HyukjinKwon <[email protected]>
---
 project/SparkBuild.scala | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 9d0af3a..1c5c36e 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -478,7 +478,8 @@ object SparkParallelTestGrouping {
     "org.apache.spark.sql.hive.thriftserver.ThriftServerQueryTestSuite",
     "org.apache.spark.sql.hive.thriftserver.SparkSQLEnvSuite",
     "org.apache.spark.sql.hive.thriftserver.ui.ThriftServerPageSuite",
-    "org.apache.spark.sql.hive.thriftserver.ui.HiveThriftServer2ListenerSuite"
+    "org.apache.spark.sql.hive.thriftserver.ui.HiveThriftServer2ListenerSuite",
+    "org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextSuite"
   )
 
   private val DEFAULT_TEST_GROUP = "default_test_group"

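For context, here is a minimal sketch of how a list such as `testsWhichShouldRunInTheirOwnDedicatedJvm` can be wired into sbt so that each listed suite, including the newly added `ThriftServerWithSparkContextSuite`, runs in its own forked JVM. This is an illustration only, not the actual `SparkParallelTestGrouping` implementation; the object name and the grouping helper below are assumptions.

```scala
// Sketch of sbt test grouping driven by a "dedicated JVM" suite list.
// Not the real SparkBuild.scala code; names other than the suite/class
// names taken from the diff above are hypothetical.
import sbt._
import sbt.Keys._

object ParallelTestGroupingSketch {
  private val testsWhichShouldRunInTheirOwnDedicatedJvm = Set(
    "org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextSuite"
  )

  private val DEFAULT_TEST_GROUP = "default_test_group"

  // Suites in the set get a group named after themselves; everything else
  // shares the default group.
  private def testNameToGroupName(name: String): String =
    if (testsWhichShouldRunInTheirOwnDedicatedJvm.contains(name)) name
    else DEFAULT_TEST_GROUP

  lazy val settings: Seq[Setting[_]] = Seq(
    testGrouping in Test := {
      (definedTests in Test).value
        .groupBy(test => testNameToGroupName(test.name))
        .map { case (groupName, tests) =>
          // Each group runs in its own forked JVM, so a heavyweight suite
          // such as ThriftServerWithSparkContextSuite does not share JVM
          // state with the other suites.
          new Tests.Group(
            name = groupName,
            tests = tests,
            runPolicy = Tests.SubProcess(ForkOptions()))
        }.toSeq
    }
  )
}
```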

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
