This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.3 by this push:
     new aa53fcad4fc [SPARK-39856][SQL][TESTS][FOLLOW-UP] Increase the number of partitions in TPC-DS build to avoid out-of-memory
aa53fcad4fc is described below

commit aa53fcad4fc1622c18b14d22fc909f7b349f7931
Author: yangjie01 <[email protected]>
AuthorDate: Mon Jul 25 22:08:11 2022 +0900

    [SPARK-39856][SQL][TESTS][FOLLOW-UP] Increase the number of partitions in TPC-DS build to avoid out-of-memory
    
    ### What changes were proposed in this pull request?
    
    This PR further increases the number of partitions (see also https://github.com/apache/spark/pull/37270).
    
    ### Why are the changes needed?
    
    To make the TPC-DS build pass without running out of memory.
    
    ### Does this PR introduce _any_ user-facing change?
    No, test and dev-only.
    
    ### How was this patch tested?
    
    It's tested in https://github.com/LuciferYang/spark/runs/7497163716?check_suite_focus=true
    
    Closes #37273 from LuciferYang/SPARK-39856-FOLLOWUP.
    
    Authored-by: yangjie01 <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 sql/core/src/test/scala/org/apache/spark/sql/TPCDSQueryTestSuite.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/sql/core/src/test/scala/org/apache/spark/sql/TPCDSQueryTestSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/TPCDSQueryTestSuite.scala
index 92cf574781f..f3eaa898e59 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/TPCDSQueryTestSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/TPCDSQueryTestSuite.scala
@@ -62,7 +62,7 @@ class TPCDSQueryTestSuite extends QueryTest with TPCDSBase with SQLQueryTestHelp
 
   // To make output results deterministic
   override protected def sparkConf: SparkConf = super.sparkConf
-    .set(SQLConf.SHUFFLE_PARTITIONS.key, 4.toString)
+    .set(SQLConf.SHUFFLE_PARTITIONS.key, 16.toString)
 
   protected override def createSparkSession: TestSparkSession = {
    new TestSparkSession(new SparkContext("local[1]", this.getClass.getSimpleName, sparkConf))
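For context, the suite sets this value programmatically via SQLConf.SHUFFLE_PARTITIONS, whose public configuration key is spark.sql.shuffle.partitions. Outside the test harness, the same setting can be supplied as a plain configuration entry, e.g. in conf/spark-defaults.conf (a sketch mirroring the value this commit adopts; raising the partition count shrinks each shuffle partition, reducing per-task memory pressure at the cost of more tasks):

```
# Equivalent runtime setting to the test-suite override in this commit.
# Default in Spark is 200; this suite previously used 4 and now uses 16.
spark.sql.shuffle.partitions  16
```

The same key can also be set per-session with spark.conf.set("spark.sql.shuffle.partitions", "16") or per-submit with --conf.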


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
