This is an automated email from the ASF dual-hosted git repository.

yao pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new c6fb226d71 Update docker links on the download page (#522)
c6fb226d71 is described below

commit c6fb226d7197e6c2e3a3922c04c506cd0cc6cee1
Author: Kent Yao <[email protected]>
AuthorDate: Tue Jun 11 18:52:26 2024 +0800

    Update docker links on the download page (#522)
---
 downloads.md        | 6 ++++--
 site/downloads.html | 6 ++++--
 2 files changed, 8 insertions(+), 4 deletions(-)

diff --git a/downloads.md b/downloads.md
index 6598534668..43312a2895 100644
--- a/downloads.md
+++ b/downloads.md
@@ -41,9 +41,11 @@ Spark artifacts are [hosted in Maven 
Central](https://search.maven.org/search?q=
<a href="https://pypi.org/project/pyspark/">PySpark</a> is now available in pypi. To install just run `pip install pyspark`.
 
 
-### Convenience Docker Container Images
+### Installing with Docker
 
-[Spark Docker Container images are available from 
DockerHub](https://hub.docker.com/r/apache/spark-py/tags), these images contain 
non-ASF software and may be subject to different license terms.
+Spark Docker images are available from Docker Hub under the accounts of both [The Apache Software Foundation](https://hub.docker.com/r/apache/spark/) and [Official Images](https://hub.docker.com/_/spark).
+
+Note that these images contain non-ASF software and may be subject to different license terms. Please check their [Dockerfiles](https://github.com/apache/spark-docker) to verify whether they are compatible with your deployment.
 
 ### Release notes for stable releases
 
diff --git a/site/downloads.html b/site/downloads.html
index 77baa1a1fe..fad86b7f58 100644
--- a/site/downloads.html
+++ b/site/downloads.html
@@ -182,9 +182,11 @@ version: 3.5.1
 <h3 id="installing-with-pypi">Installing with PyPi</h3>
<p><a href="https://pypi.org/project/pyspark/">PySpark</a> is now available in pypi. To install just run <code class="language-plaintext highlighter-rouge">pip install pyspark</code>.</p>
 
-<h3 id="convenience-docker-container-images">Convenience Docker Container 
Images</h3>
+<h3 id="installing-with-docker">Installing with Docker</h3>
 
-<p><a href="https://hub.docker.com/r/apache/spark-py/tags";>Spark Docker 
Container images are available from DockerHub</a>, these images contain non-ASF 
software and may be subject to different license terms.</p>
+<p>Spark Docker images are available from Docker Hub under the accounts of both <a href="https://hub.docker.com/r/apache/spark/">The Apache Software Foundation</a> and <a href="https://hub.docker.com/_/spark">Official Images</a>.</p>
+
+<p>Note that these images contain non-ASF software and may be subject to different license terms. Please check their <a href="https://github.com/apache/spark-docker">Dockerfiles</a> to verify whether they are compatible with your deployment.</p>
 
 <h3 id="release-notes-for-stable-releases">Release notes for stable 
releases</h3>
 


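For reference, the Docker Hub links added by this commit can be exercised from the command line roughly as follows. This is a sketch only: the tag shown is an assumption (the commit does not name one), and the `/opt/spark` path is the conventional SPARK_HOME in the apache/spark images, not something stated in this patch.

```shell
# Pull the ASF-published Spark image from Docker Hub.
# Tag is an assumption; see https://hub.docker.com/r/apache/spark/tags
# for the actually published releases.
docker pull apache/spark:3.5.1

# Start an interactive Spark shell from the pulled image.
# /opt/spark is the conventional SPARK_HOME inside apache/spark images.
docker run -it apache/spark:3.5.1 /opt/spark/bin/spark-shell
```

The same commands work against the Docker Official Images repository by substituting `spark:3.5.1` for `apache/spark:3.5.1`.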
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
