Yikun commented on pull request #35819:
URL: https://github.com/apache/spark/pull/35819#issuecomment-1066105700


   All tests passed with Volcano release-1.5 (v1.5.1-beta.0 images) on x86 and arm64.
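
   For anyone reproducing this, the Volcano deployment can be sanity-checked before kicking off the suite. A minimal sketch, assuming the default `volcano-system` namespace created by the release-1.5 installer YAMLs:

   ```
   # Confirm the Volcano components are up before running the integration tests.
   # (Assumes the default volcano-system namespace from the release-1.5 installer.)
   kubectl get pods -n volcano-system
   # Expect the volcano-admission, volcano-controllers and volcano-scheduler pods to be Running.
   ```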
   
   <details><summary>test on arm64 details</summary>
   
   ```
   # arm64
   kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/release-1.5/installer/volcano-development-arm64.yaml

   build/sbt -Pvolcano -Pkubernetes -Pkubernetes-integration-tests \
     -Dtest.exclude.tags=minikube,r -Dtest.include.tags=volcano \
     -Dspark.kubernetes.test.namespace=default \
     "kubernetes-integration-tests/testOnly"
   
   [info] VolcanoSuite:
   [info] - Run SparkPi with no resources (12 seconds, 370 milliseconds)
   [info] - Run SparkPi with no resources & statefulset allocation (11 seconds, 520 milliseconds)
   [info] - Run SparkPi with a very long application name. (11 seconds, 927 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (11 seconds, 1 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (10 seconds, 899 milliseconds)
   [info] - Run SparkPi with an argument. (11 seconds, 943 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (11 seconds, 979 milliseconds)
   [info] - All pods have the same service account by default (10 seconds, 962 milliseconds)
   [info] - Run extraJVMOptions check on driver (6 seconds, 105 milliseconds)
   [info] - Run SparkRemoteFileTest using a remote data file (12 seconds, 17 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (19 seconds, 699 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (22 seconds, 314 milliseconds)
   [info] - Run PySpark on simple pi.py example (13 seconds, 57 milliseconds)
   [info] - Run PySpark to test a pyfiles example (15 seconds, 144 milliseconds)
   [info] - Run PySpark with memory customization (12 seconds, 943 milliseconds)
   [info] - Run in client mode. (8 seconds, 259 milliseconds)
   [info] - Start pod creation from template (13 seconds, 95 milliseconds)
   [info] - SPARK-38398: Schedule pod creation from template (11 seconds, 987 milliseconds)
   [info] - Test basic decommissioning (46 seconds, 368 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (46 seconds, 699 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 46 seconds)
   [info] - Test decommissioning timeouts (47 seconds, 204 milliseconds)
   [info] - SPARK-37576: Rolling decommissioning (1 minute, 8 seconds)
   [info] - Run SparkPi with volcano scheduler (13 seconds, 27 milliseconds)
   [info] - SPARK-38187: Run SparkPi Jobs with minCPU (38 seconds, 825 milliseconds)
   [info] - SPARK-38187: Run SparkPi Jobs with minMemory (38 seconds, 911 milliseconds)
   [info] - SPARK-38188: Run SparkPi jobs with 2 queues (only 1 enabled) (19 seconds, 139 milliseconds)
   [info] - SPARK-38188: Run SparkPi jobs with 2 queues (all enabled) (34 seconds, 263 milliseconds)
   [info] - SPARK-38423: Run driver job to validate priority order (18 seconds, 491 milliseconds)
   [info] Run completed in 13 minutes, 16 seconds.
   [info] Total number of tests run: 29
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 29, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   [success] Total time: 829 s (13:49), completed 2022-3-13 20:29:27
   ```
   </details>
   
   <details><summary>test on x86 details</summary>
   
   ```
   # x86
   kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/release-1.5/installer/volcano-development.yaml

   build/sbt -Pvolcano -Pkubernetes -Pkubernetes-integration-tests \
     -Dtest.exclude.tags=minikube,r -Dtest.include.tags=volcano \
     -Dspark.kubernetes.test.namespace=default \
     "kubernetes-integration-tests/testOnly"
   
   [info] VolcanoSuite:
   [info] - Run SparkPi with no resources (12 seconds, 171 milliseconds)
   [info] - Run SparkPi with no resources & statefulset allocation (11 seconds, 741 milliseconds)
   [info] - Run SparkPi with a very long application name. (12 seconds, 794 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (11 seconds, 740 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (11 seconds, 711 milliseconds)
   [info] - Run SparkPi with an argument. (12 seconds, 789 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (11 seconds, 848 milliseconds)
   [info] - All pods have the same service account by default (12 seconds, 756 milliseconds)
   [info] - Run extraJVMOptions check on driver (6 seconds, 661 milliseconds)
   [info] - Run SparkRemoteFileTest using a remote data file (12 seconds, 881 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (19 seconds, 192 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (20 seconds, 721 milliseconds)
   [info] - Run PySpark on simple pi.py example (13 seconds, 776 milliseconds)
   [info] - Run PySpark to test a pyfiles example (15 seconds, 767 milliseconds)
   [info] - Run PySpark with memory customization (13 seconds, 738 milliseconds)
   [info] - Run in client mode. (9 seconds, 176 milliseconds)
   [info] - Start pod creation from template (11 seconds, 792 milliseconds)
   [info] - SPARK-38398: Schedule pod creation from template (12 seconds, 831 milliseconds)
   [info] - Test basic decommissioning (47 seconds, 65 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (48 seconds, 200 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 46 seconds)
   [info] - Test decommissioning timeouts (47 seconds, 423 milliseconds)
   [info] - SPARK-37576: Rolling decommissioning (1 minute, 10 seconds)
   [info] - Run SparkPi with volcano scheduler (11 seconds, 695 milliseconds)
   [info] - SPARK-38187: Run SparkPi Jobs with minCPU (34 seconds, 328 milliseconds)
   [info] - SPARK-38187: Run SparkPi Jobs with minMemory (32 seconds, 235 milliseconds)
   [info] - SPARK-38188: Run SparkPi jobs with 2 queues (only 1 enabled) (18 seconds, 230 milliseconds)
   [info] - SPARK-38188: Run SparkPi jobs with 2 queues (all enabled) (29 seconds, 370 milliseconds)
   [info] - SPARK-38423: Run driver job to validate priority order (19 seconds, 213 milliseconds)
   [info] Run completed in 13 minutes, 13 seconds.
   [info] Total number of tests run: 29
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 29, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   [success] Total time: 837 s (13:57), completed Mar 13, 2022 8:29:45 PM
   ```
   </details>
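
   If a single case needs to be re-run while iterating, ScalaTest's substring filter can be passed through `testOnly`. A sketch, assuming the same flags as above; the `*VolcanoSuite` glob and the `-z minCPU` filter are illustrative, not taken from the logs:

   ```
   build/sbt -Pvolcano -Pkubernetes -Pkubernetes-integration-tests \
     -Dtest.exclude.tags=minikube,r -Dtest.include.tags=volcano \
     -Dspark.kubernetes.test.namespace=default \
     "kubernetes-integration-tests/testOnly *VolcanoSuite -- -z minCPU"
   ```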
   
   Ready to go!

