This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 041883f6b90 [SPARK-42154][K8S][TESTS] Enable `Volcano` unit and integration tests in GitHub Action
041883f6b90 is described below

commit 041883f6b904e4c40ab670a168c8bcbd6125b99d
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Sun Jan 22 15:29:29 2023 -0800

    [SPARK-42154][K8S][TESTS] Enable `Volcano` unit and integration tests in GitHub Action
    
    ### What changes were proposed in this pull request?
    
    This PR aims to enable `Volcano` test coverage for both unit tests and integration tests, starting with Apache Spark 3.4.
    
    ### Why are the changes needed?
    
    To protect against regressions in the `Volcano` feature.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs and check the GitHub Action logs manually.
    
    - https://github.com/dongjoon-hyun/spark/actions/runs/3978081926/jobs/6825017532
    ```
    [info] VolcanoFeatureStepSuite:
    [info] - SPARK-36061: Driver Pod with Volcano PodGroup (416 milliseconds)
    [info] - SPARK-36061: Executor Pod with Volcano PodGroup (3 milliseconds)
    [info] - SPARK-38455: Support driver podgroup template (473 milliseconds)
    [info] - SPARK-38503: return empty for executor pre resource (3 milliseconds)
    ```
    
    - https://github.com/dongjoon-hyun/spark/actions/runs/3978081926/jobs/6825017770
    ```
    [info] VolcanoSuite:
    [info] - Run SparkPi with no resources (17 seconds, 420 milliseconds)
    [info] - Run SparkPi with no resources & statefulset allocation (17 seconds, 225 milliseconds)
    [info] - Run SparkPi with a very long application name. (17 seconds, 628 milliseconds)
    [info] - Use SparkLauncher.NO_RESOURCE (19 seconds, 794 milliseconds)
    [info] - Run SparkPi with a master URL without a scheme. (18 seconds, 228 milliseconds)
    [info] - Run SparkPi with an argument. (18 seconds, 206 milliseconds)
    [info] - Run SparkPi with custom labels, annotations, and environment variables. (17 seconds, 999 milliseconds)
    [info] - All pods have the same service account by default (17 seconds, 958 milliseconds)
    [info] - Run extraJVMOptions check on driver (9 seconds, 144 milliseconds)
    [info] - Run SparkRemoteFileTest using a remote data file (18 seconds, 2 milliseconds)
    [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (18 seconds, 552 milliseconds)
    [info] - Run SparkPi with env and mount secrets. (35 seconds, 631 milliseconds)
    [info] - Run PySpark on simple pi.py example (20 seconds, 407 milliseconds)
    [info] - Run PySpark to test a pyfiles example (24 seconds, 605 milliseconds)
    [info] - Run PySpark with memory customization (19 seconds, 169 milliseconds)
    [info] - Run in client mode. (36 seconds, 688 milliseconds)
    [info] - Start pod creation from template (18 seconds, 996 milliseconds)
    [info] - SPARK-38398: Schedule pod creation from template (19 seconds, 239 milliseconds)
    [info] - PVs with local hostpath storage on statefulsets (25 seconds, 285 milliseconds)
    [info] - PVs with local hostpath and storageClass on statefulsets (24 seconds, 979 milliseconds)
    [info] - PVs with local storage (39 seconds, 906 milliseconds)
    [info] - Launcher client dependencies (46 seconds, 230 milliseconds)
    [info] - SPARK-40817: Check that remote files do not get discarded in spark.files (43 seconds, 76 milliseconds)
    [info] - SPARK-33615: Launcher client archives (43 seconds, 525 milliseconds)
    [info] - SPARK-33748: Launcher python client respecting PYSPARK_PYTHON (47 seconds, 945 milliseconds)
    [info] - SPARK-33748: Launcher python client respecting spark.pyspark.python and spark.pyspark.driver.python (49 seconds, 467 milliseconds)
    [info] - Launcher python client dependencies using a zip file (50 seconds, 89 milliseconds)
    [info] - Test basic decommissioning (1 minute, 5 seconds)
    [info] - Test basic decommissioning with shuffle cleanup (1 minute, 6 seconds)
    [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 59 seconds)
    [info] - Test decommissioning timeouts (1 minute, 6 seconds)
    [info] - SPARK-37576: Rolling decommissioning (1 minute, 11 seconds)
    [info] - Run SparkR on simple dataframe.R example (25 seconds, 828 milliseconds)
    [info] - Run SparkPi with volcano scheduler (19 seconds, 192 milliseconds)
    [info] - SPARK-38187: Run SparkPi Jobs with minCPU (1 minute, 1 second)
    [info] - SPARK-38187: Run SparkPi Jobs with minMemory (1 minute, 1 second)
    [info] - SPARK-38188: Run SparkPi jobs with 2 queues (only 1 enabled) (25 seconds, 244 milliseconds)
    [info] - SPARK-38188: Run SparkPi jobs with 2 queues (all enabled) (38 seconds, 315 milliseconds)
    [info] - SPARK-38423: Run driver job to validate priority order (38 seconds, 520 milliseconds)
    ```
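    
    As an editorial aside (not part of the original commit message), a minimal sketch of running the same suites locally, assuming a dev checkout with a running minikube cluster and Volcano v1.7.0 installed; the `testOnly` pattern is illustrative:
    
    ```
    # Unit tests: the Volcano suites only run with the -Pvolcano profile enabled.
    build/sbt -Pkubernetes -Pvolcano "kubernetes/testOnly *VolcanoFeatureStepSuite"
    
    # Integration tests: mirrors the GitHub Action command in the diff below;
    # volcanoMaxConcurrencyJobNum=1 limits concurrent batch jobs on small runners.
    build/sbt -Psparkr -Pkubernetes -Pvolcano -Pkubernetes-integration-tests \
      -Dspark.kubernetes.test.driverRequestCores=0.5 \
      -Dspark.kubernetes.test.executorRequestCores=0.2 \
      -Dspark.kubernetes.test.volcanoMaxConcurrencyJobNum=1 \
      "kubernetes-integration-tests/test"
    ```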
    
    Closes #39697 from dongjoon-hyun/SPARK-42154.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .github/workflows/build_and_test.yml | 4 ++--
 dev/sparktestsupport/modules.py      | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 31baf46dbe4..46cb7fe27db 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -952,9 +952,9 @@ jobs:
           export PVC_TESTS_VM_PATH=$PVC_TMP_DIR
           minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} --gid=0 --uid=185 &
           kubectl create clusterrolebinding serviceaccounts-cluster-admin --clusterrole=cluster-admin --group=system:serviceaccounts || true
+          kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml || true
           eval $(minikube docker-env)
-          # - Exclude Volcano test (-Pvolcano), batch jobs need more CPU resource
-          build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests -Dspark.kubernetes.test.driverRequestCores=0.5 -Dspark.kubernetes.test.executorRequestCores=0.2 "kubernetes-integration-tests/test"
+          build/sbt -Psparkr -Pkubernetes -Pvolcano -Pkubernetes-integration-tests -Dspark.kubernetes.test.driverRequestCores=0.5 -Dspark.kubernetes.test.executorRequestCores=0.2 -Dspark.kubernetes.test.volcanoMaxConcurrencyJobNum=1 "kubernetes-integration-tests/test"
       - name: Upload Spark on K8S integration tests log files
         if: failure()
         uses: actions/upload-artifact@v3
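
As a hedged aside on the hunk above: after the added `kubectl apply` step, the Volcano installation can be sanity-checked before the tests run. This sketch assumes Volcano's default manifests, which deploy into the `volcano-system` namespace:

```
# Verify the Volcano controller, scheduler, and admission pods are up.
kubectl get pods -n volcano-system

# Verify the CRDs the tests rely on (podgroups, queues, ...) are registered.
kubectl get crd | grep volcano.sh
```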
diff --git a/dev/sparktestsupport/modules.py b/dev/sparktestsupport/modules.py
index 634be286065..f95432c658d 100644
--- a/dev/sparktestsupport/modules.py
+++ b/dev/sparktestsupport/modules.py
@@ -824,7 +824,7 @@ kubernetes = Module(
     name="kubernetes",
     dependencies=[],
     source_file_regexes=["resource-managers/kubernetes"],
-    build_profile_flags=["-Pkubernetes"],
+    build_profile_flags=["-Pkubernetes", "-Pvolcano"],
     sbt_test_goals=["kubernetes/test"],
 )
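
A hedged note on the `modules.py` change: `dev/run-tests` applies a selected module's `build_profile_flags` to the build, so the `kubernetes` unit tests are now built with `-Pvolcano` as well. A minimal sketch of exercising that path (the `--modules` flag is from Spark's dev tooling; treat the invocation as illustrative):

```
# Run only the kubernetes module's tests; with this change the sbt build
# is invoked with both -Pkubernetes and -Pvolcano.
dev/run-tests --modules kubernetes
```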
 


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
