Qian Sun created SPARK-40969:
--------------------------------

             Summary: Unable to download spark 3.3.0 tarball after 3.3.1 release in spark-docker
                 Key: SPARK-40969
                 URL: https://issues.apache.org/jira/browse/SPARK-40969
             Project: Spark
          Issue Type: Bug
          Components: Spark Docker
    Affects Versions: 3.3.1
            Reporter: Qian Sun


Unable to download the Spark 3.3.0 tarball in spark-docker after the 3.3.1 release.


{code:sh}
#7 0.229 + wget -nv -O spark.tgz https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz
#7 1.061 https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz:
#7 1.061 2022-10-31 02:59:20 ERROR 404: Not Found.
------
executor failed running [/bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     cd ..;     rm -rf "$SPARK_TMP";]: exit code: 8
{code}
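
The likely cause is that dlcdn.apache.org only serves the latest releases: once 3.3.1 was published, the 3.3.0 artifacts were dropped from the CDN, while they remain available on archive.apache.org. Below is a minimal sketch of a download step with an archive fallback; SPARK_VERSION, HADOOP_VERSION and TGZ are illustrative variable names, not the Dockerfile's own build args.

{code:sh}
# Sketch only: try the CDN first, then fall back to the Apache archive,
# which keeps old releases after they leave the mirrors.
SPARK_VERSION=3.3.0
HADOOP_VERSION=3
TGZ="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz"

wget -nv -O spark.tgz "https://dlcdn.apache.org/spark/spark-${SPARK_VERSION}/${TGZ}" \
  || wget -nv -O spark.tgz "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${TGZ}"

# The .asc signature download would need the same fallback so that
# gpg --batch --verify still succeeds.
{code}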
Meanwhile, the Spark 3.3.1 Docker build works fine:

{code:sh}
 => [4/9] RUN set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "https://dlcdn.apache.org/spark/spark-3.3.1/spark-3.3.1-bin-hadoop3.tgz";     wget -nv -O spark.tgz.asc "https://downlo  77.8s
 => [5/9] COPY entrypoint.sh /opt/
{code}
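
For reference, a quick illustrative way to check where a given tarball is still served (HEAD requests only; the exact status lines are not reproduced here):

{code:sh}
# Print the HTTP status line from the CDN and from the archive.
for base in https://dlcdn.apache.org/spark https://archive.apache.org/dist/spark; do
  echo "$base:"
  curl -sI "$base/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz" | head -n 1
done
{code}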



