[ https://issues.apache.org/jira/browse/SPARK-50199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun closed SPARK-50199.
---------------------------------
> Use Spark 3.4.4 instead of 3.0.1 in `test_install_spark`
> --------------------------------------------------------
>
> Key: SPARK-50199
> URL: https://issues.apache.org/jira/browse/SPARK-50199
> Project: Spark
> Issue Type: Sub-task
> Components: PySpark, Tests
> Affects Versions: 3.5.3, 4.0.0
> Reporter: Dongjoon Hyun
> Assignee: Dongjoon Hyun
> Priority: Minor
> Labels: pull-request-available
> Fix For: 3.5.4, 4.0.0
>
>
> [https://github.com/apache/spark/actions/runs/11623974780/job/32371883850]
>
> {code}
> urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
> ERROR
> test_package_name (pyspark.tests.test_install_spark.SparkInstallationTestCase) ...
> Trying to download Spark spark-3.0.1 from [https://dlcdn.apache.org/, https://archive.apache.org/dist, https://dist.apache.org/repos/dist/release]
> Downloading spark-3.0.1 for Hadoop hadoop3.2 from:
> - https://dlcdn.apache.org//spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz
> Failed to download spark-3.0.1 for Hadoop hadoop3.2 from https://dlcdn.apache.org//spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz:
> Downloading spark-3.0.1 for Hadoop hadoop3.2 from:
> - https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz
> Failed to download spark-3.0.1 for Hadoop hadoop3.2 from https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz:
> Downloading spark-3.0.1 for Hadoop hadoop3.2 from:
> - https://dist.apache.org/repos/dist/release/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz
> Failed to download spark-3.0.1 for Hadoop hadoop3.2 from https://dist.apache.org/repos/dist/release/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz:
> ok
> {code}
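For context, the timeouts in the log above come from the test walking a list of Apache mirror sites and falling back to the next one when a download fails. A minimal sketch of that mirror-fallback pattern follows; the mirror list matches the log, but the function names and signatures are illustrative only, not the actual `pyspark.install` API:

```python
import urllib.error
import urllib.request

# Mirror list as printed in the log above, tried in order.
MIRRORS = [
    "https://dlcdn.apache.org/",
    "https://archive.apache.org/dist",
    "https://dist.apache.org/repos/dist/release",
]

def candidate_urls(version, hadoop_version, mirrors=MIRRORS):
    """Build the tarball URL tried at each mirror, in order.
    (Hypothetical helper; not part of pyspark.install.)"""
    package = f"spark-{version}-bin-{hadoop_version}.tgz"
    return [f"{m}/spark/spark-{version}/{package}" for m in mirrors]

def download_spark(version, hadoop_version, dest,
                   opener=urllib.request.urlretrieve):
    """Try each mirror in turn; raise only if every mirror fails."""
    last_error = None
    for url in candidate_urls(version, hadoop_version):
        print(f"Downloading spark-{version} for Hadoop "
              f"{hadoop_version} from:\n- {url}")
        try:
            opener(url, dest)   # network fetch; raises URLError on timeout
            return url          # first mirror that works wins
        except urllib.error.URLError as e:
            print(f"Failed to download spark-{version} from {url}: {e.reason}")
            last_error = e
    raise RuntimeError(f"All mirrors failed for spark-{version}") from last_error
```

Old releases are eventually removed from the primary dlcdn mirror once archived, which is presumably why fetching spark-3.0.1 fell through to the slower archive hosts and timed out; pinning the test to a still-distributed release such as 3.4.4 keeps the first mirror viable.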
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]