attilapiros opened a new pull request #33233:
URL: https://github.com/apache/spark/pull/33233


   ### What changes were proposed in this pull request?
   
   Upgrading the kubernetes-client to 5.5.0
   
   ### Why are the changes needed?
   
   This release contains [several bugfixes](https://github.com/fabric8io/kubernetes-client/releases/tag/v5.5.0), but the main reason for upgrading is that 5.5.0 adds [support for HTTP operation retry with exponential backoff (for status codes >= 500)](https://github.com/fabric8io/kubernetes-client/issues/3087); a hedged sketch of the new client-side knobs is shown below.
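   
   For illustration only, a minimal sketch of how the retry/backoff behaviour could be enabled when building a fabric8 client directly. The `withRequestRetryBackoffLimit`/`withRequestRetryBackoffInterval` builder methods are my reading of the linked issue and are assumptions, not the exact configuration Spark wires up in this PR:
   
   ```scala
   // Illustrative sketch, not the Spark code path: configure the fabric8
   // kubernetes-client retry-with-backoff behaviour introduced around 5.5.0.
   // The two builder methods below are assumptions based on issue #3087.
   import io.fabric8.kubernetes.client.{ConfigBuilder, DefaultKubernetesClient}
   
   val config = new ConfigBuilder()
     .withRequestRetryBackoffLimit(3)        // assumed: max retries on HTTP >= 500
     .withRequestRetryBackoffInterval(1000)  // assumed: initial backoff in milliseconds
     .build()
   
   val client = new DefaultKubernetesClient(config)
   ```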
   
   ### Does this PR introduce _any_ user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   By running the integration tests including `persistentVolume` tests:
   
   ```
   ./resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh \
     --spark-tgz $TARBALL_TO_TEST --hadoop-profile $HADOOP_PROFILE \
     --exclude-tags r --include-tags persistentVolume
   ...
   [INFO] --- scalatest-maven-plugin:2.0.0:test (integration-test) @ spark-kubernetes-integration-tests_2.12 ---
   Discovery starting.
   Discovery completed in 413 milliseconds.
   Run starting. Expected test count is: 26
   KubernetesSuite:
   - Run SparkPi with no resources
   - Run SparkPi with a very long application name.
   - Use SparkLauncher.NO_RESOURCE
   - Run SparkPi with a master URL without a scheme.
   - Run SparkPi with an argument.
   - Run SparkPi with custom labels, annotations, and environment variables.
   - All pods have the same service account by default
   - Run extraJVMOptions check on driver
   - Run SparkRemoteFileTest using a remote data file
   - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j.properties
   - Run SparkPi with env and mount secrets.
   - Run PySpark on simple pi.py example
   - Run PySpark to test a pyfiles example
   - Run PySpark with memory customization
   - Run in client mode.
   - Start pod creation from template
   - PVs with local storage
   - Launcher client dependencies
   - SPARK-33615: Launcher client archives
   - SPARK-33748: Launcher python client respecting PYSPARK_PYTHON
   - SPARK-33748: Launcher python client respecting spark.pyspark.python and spark.pyspark.driver.python
   - Launcher python client dependencies using a zip file
   - Test basic decommissioning
   - Test basic decommissioning with shuffle cleanup
   - Test decommissioning with dynamic allocation & shuffle cleanups
   - Test decommissioning timeouts
   Run completed in 18 minutes, 34 seconds.
   Total number of tests run: 26
   Suites: completed 2, aborted 0
   Tests: succeeded 26, failed 0, canceled 0, ignored 0, pending 0
   All tests passed.
   ```
   

