Yikun commented on pull request #35773:
URL: https://github.com/apache/spark/pull/35773#issuecomment-1063653384
```
[info] VolcanoSuite:
[info] - Run SparkPi with no resources (11 seconds, 363 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (12 seconds, 181 milliseconds)
[info] - Run SparkPi with a very long application name. (10 seconds, 876 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (11 seconds, 932 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (10 seconds, 756 milliseconds)
[info] - Run SparkPi with an argument. (10 seconds, 989 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (13 seconds, 562 milliseconds)
[info] - All pods have the same service account by default (10 seconds, 703 milliseconds)
[info] - Run extraJVMOptions check on driver (5 seconds, 625 milliseconds)
[info] - Run SparkRemoteFileTest using a remote data file (10 seconds, 795 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (16 seconds, 211 milliseconds)
[info] - Run SparkPi with env and mount secrets. (19 seconds, 830 milliseconds)
[info] - Run PySpark on simple pi.py example (11 seconds, 677 milliseconds)
[info] - Run PySpark to test a pyfiles example (16 seconds, 518 milliseconds)
[info] - Run PySpark with memory customization (11 seconds, 920 milliseconds)
[info] - Run in client mode. (10 seconds, 330 milliseconds)
[info] - Start pod creation from template (13 seconds, 8 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (12 seconds, 59 milliseconds)
[info] - Test basic decommissioning (45 seconds, 509 milliseconds)
[info] - Test basic decommissioning with shuffle cleanup (44 seconds, 664 milliseconds)
[info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 43 seconds)
[info] - Test decommissioning timeouts (47 seconds, 531 milliseconds)
[info] - SPARK-37576: Rolling decommissioning (1 minute, 7 seconds)
[info] - Run SparkPi with volcano scheduler (10 seconds, 844 milliseconds)
[info] - SPARK-38187: Run SparkPi Jobs with minCPU (32 seconds, 654 milliseconds)
[info] - SPARK-38187: Run SparkPi Jobs with minMemory (32 seconds, 610 milliseconds)
[info] - SPARK-38188: Run SparkPi jobs with 2 queues (only 1 enabled) (14 seconds, 323 milliseconds)
[info] - SPARK-38188: Run SparkPi jobs with 2 queues (all enabled) (26 seconds, 385 milliseconds)
[info] - SPARK-38423: Run SparkPi Jobs with priorityClassName (20 seconds, 209 milliseconds)
[info] - SPARK-38423: Run driver job to validate priority order (17 seconds, 427 milliseconds)
[info] Run completed in 12 minutes, 58 seconds.
[info] Total number of tests run: 30
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 30, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
$ k get node -oyaml
capacity:
  cpu: "6"
  memory: 9159716Ki
```
All tests passed on the 6U9G cluster (6 CPUs, ~9 GiB of memory).
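For anyone cross-checking the "6U9G" label against the `kubectl` output above (my reading, not stated in the comment: "6U" = the `cpu: "6"` capacity, "9G" = the memory capacity rounded up), the `Ki` figure converts like this:

```shell
# Sanity-check: convert the reported node memory capacity (9159716Ki)
# to GiB. 9159716 KiB / 1024 / 1024 ~= 8.74 GiB, i.e. roughly "9G".
awk 'BEGIN { printf "%.2f GiB\n", 9159716 / 1024 / 1024 }'
```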
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]