This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new fcfbf8e660db [SPARK-50199][PYTHON][TESTS] Use Spark 3.4.4 instead of 3.0.1 in `test_install_spark`
fcfbf8e660db is described below

commit fcfbf8e660db0396096e4534d59595efdf358058
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Oct 31 23:42:06 2024 -0700

    [SPARK-50199][PYTHON][TESTS] Use Spark 3.4.4 instead of 3.0.1 in `test_install_spark`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to use Spark 3.4.4 instead of 3.0.1 in `test_install_spark`.
    
    Since Spark 3.4.4 is the End-Of-Life release, it will remain in the `dlcdn`, `archive`, and `dist` channels until the Apache Spark 4.0 release. Previously, 3.0.1 existed only in `archive`, which caused flaky failures.
    
    ### Why are the changes needed?
    
    To reduce the flakiness.
    
    - https://github.com/apache/spark/actions/runs/11623974780/job/32371883850
    **BEFORE**
    ```
    urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
    ERROR
    test_package_name (pyspark.tests.test_install_spark.SparkInstallationTestCase) ... Trying to download Spark spark-3.0.1 from [https://dlcdn.apache.org/, https://archive.apache.org/dist, https://dist.apache.org/repos/dist/release]
    Downloading spark-3.0.1 for Hadoop hadoop3.2 from:
    - https://dlcdn.apache.org//spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz
    Failed to download spark-3.0.1 for Hadoop hadoop3.2 from https://dlcdn.apache.org//spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz:
    Downloading spark-3.0.1 for Hadoop hadoop3.2 from:
    - https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz
    Failed to download spark-3.0.1 for Hadoop hadoop3.2 from https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz:
    Downloading spark-3.0.1 for Hadoop hadoop3.2 from:
    - https://dist.apache.org/repos/dist/release/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz
    Failed to download spark-3.0.1 for Hadoop hadoop3.2 from https://dist.apache.org/repos/dist/release/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz:
    ok
    ```
    
    **AFTER**
    ```
    test_install_spark (pyspark.tests.test_install_spark.SparkInstallationTestCase) ... Trying to download Spark spark-3.4.4 from [https://dlcdn.apache.org/, https://archive.apache.org/dist, https://dist.apache.org/repos/dist/release]
    Downloading spark-3.4.4 for Hadoop hadoop3 from:
    - https://dlcdn.apache.org//spark/spark-3.4.4/spark-3.4.4-bin-hadoop3.tgz
    Downloaded 1048576 of 388988563 bytes (0.27%)
    ...
    ```
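The log output above shows the installer trying three Apache mirror roots in order. As a minimal illustrative sketch (not pyspark's actual implementation; `candidate_urls` is a hypothetical helper name), the candidate tarball URLs could be built like this:

```python
# Sketch only: builds one candidate download URL per mirror root, in the
# same order as the log output above. Not pyspark's real code.
MIRROR_ROOTS = [
    "https://dlcdn.apache.org/",
    "https://archive.apache.org/dist",
    "https://dist.apache.org/repos/dist/release",
]


def candidate_urls(spark_version: str, hadoop_version: str) -> list:
    """Build one candidate tarball URL per mirror root."""
    package = f"{spark_version}-bin-{hadoop_version}.tgz"
    return [f"{root}/spark/{spark_version}/{package}" for root in MIRROR_ROOTS]


print(candidate_urls("spark-3.4.4", "hadoop3"))
```

Note the doubled slash in the first URL (`dlcdn.apache.org//spark/...`): the first mirror root ends with `/`, which matches the log lines above.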
    
    Since Spark 3.4.4 is the EOL version, it will remain on `download.apache.org` until the Apache Spark 4.0.0 release.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #48733 from dongjoon-hyun/SPARK-50199.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/tests/test_install_spark.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/pyspark/tests/test_install_spark.py b/python/pyspark/tests/test_install_spark.py
index dee28af9a407..d46d55e02284 100644
--- a/python/pyspark/tests/test_install_spark.py
+++ b/python/pyspark/tests/test_install_spark.py
@@ -32,7 +32,7 @@ class SparkInstallationTestCase(unittest.TestCase):
     def test_install_spark(self):
         # Test only one case. Testing this is expensive because it needs to download
         # the Spark distribution.
-        spark_version, hadoop_version, hive_version = checked_versions("3.0.1", "3", "2.3")
+        spark_version, hadoop_version, hive_version = checked_versions("3.4.4", "3", "2.3")
 
         with tempfile.TemporaryDirectory(prefix="test_install_spark") as tmp_dir:
             install_spark(
