dcoliversun commented on PR #36333: URL: https://github.com/apache/spark/pull/36333#issuecomment-1107792294
Passed the IT tests:

```plain
$ build/sbt -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=r -Dspark.kubernetes.test.imageRepo=kubespark "kubernetes-integration-tests/test"
[info] KubernetesSuite:
[info] - Run SparkPi with no resources (16 seconds, 928 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (15 seconds, 302 milliseconds)
[info] - Run SparkPi with a very long application name. (15 seconds, 495 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (15 seconds, 706 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (13 seconds, 751 milliseconds)
[info] - Run SparkPi with an argument. (13 seconds, 437 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (13 seconds, 311 milliseconds)
[info] - All pods have the same service account by default (13 seconds, 899 milliseconds)
[info] - Run extraJVMOptions check on driver (7 seconds, 497 milliseconds)
[info] - Run SparkRemoteFileTest using a remote data file (18 seconds, 99 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (16 seconds, 843 milliseconds)
[info] - Run SparkPi with env and mount secrets. (29 seconds, 539 milliseconds)
[info] - Run PySpark on simple pi.py example (23 seconds, 127 milliseconds)
[info] - Run PySpark to test a pyfiles example (19 seconds, 417 milliseconds)
[info] - Run PySpark with memory customization (17 seconds, 704 milliseconds)
[info] - Run in client mode. (10 seconds, 484 milliseconds)
[info] - Start pod creation from template (15 seconds, 88 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (26 seconds, 416 milliseconds)
[info] - PVs with local hostpath storage on statefulsets (17 seconds, 321 milliseconds)
[info] - PVs with local hostpath and storageClass on statefulsets (17 seconds, 712 milliseconds)
[info] - PVs with local storage (18 seconds, 933 milliseconds)
[info] - Launcher client dependencies (2 minutes, 34 seconds)
[info] - SPARK-33615: Launcher client archives (1 minute, 50 seconds)
[info] - SPARK-33748: Launcher python client respecting PYSPARK_PYTHON (1 minute, 41 seconds)
[info] - SPARK-33748: Launcher python client respecting spark.pyspark.python and spark.pyspark.driver.python (1 minute, 40 seconds)
[info] - Launcher python client dependencies using a zip file (1 minute, 41 seconds)
[info] - Test basic decommissioning (52 seconds, 329 milliseconds)
[info] - Test basic decommissioning with shuffle cleanup (49 seconds, 25 milliseconds)
[info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 52 seconds)
[info] - Test decommissioning timeouts (48 seconds, 306 milliseconds)
[info] - SPARK-37576: Rolling decommissioning (1 minute, 9 seconds)
[info] Run completed in 27 minutes, 27 seconds.
[info] Total number of tests run: 31
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 31, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 2159 s (35:59), completed 2022-4-24 16:41:04
```

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
