This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.5
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.5 by this push:
new 1ddf4a9f3393 [MINOR][DOCS][3.5] Fix specified java versions in `install.rst`
1ddf4a9f3393 is described below
commit 1ddf4a9f33937f3cabcdb352837bb42dcaf94250
Author: d.vorst <[email protected]>
AuthorDate: Mon Nov 4 13:44:17 2024 -0800
[MINOR][DOCS][3.5] Fix specified java versions in `install.rst`
### What changes were proposed in this pull request?
Documentation change: correct the Java versions listed on the PySpark 3.5 Getting Started page
(https://spark.apache.org/docs/3.5.3/api/python/getting_started/install.html#dependencies).
### Why are the changes needed?
The original statement "PySpark requires Java 8 or later" is incorrect: Spark 3.5 no
longer supports Java 8 releases prior to 8u371, and the latest supported version is 17.
The downloading page (https://spark.apache.org/docs/3.5.3/#downloading) already states
this correctly, so this patch aligns the Java versions mentioned in `install.rst`.
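The supported range described above (Java 8 from update 371, or Java 11 or 17) can be sketched as a small version check. This helper, its name, and its parsing logic are illustrative only and are not part of Spark:

```python
import re

def java_supported(version: str) -> bool:
    """Check a `java -version`-style string against PySpark 3.5's
    documented range: Java 8 (8u371 or later), 11, or 17.
    Hypothetical helper for illustration, not part of Spark."""
    # Legacy Java 8 format, e.g. "1.8.0_371"
    m = re.match(r"1\.8\.0_(\d+)", version)
    if m:
        return int(m.group(1)) >= 371
    # Modern format, e.g. "11.0.2" or "17.0.9": check the major version
    m = re.match(r"(\d+)", version)
    return bool(m) and int(m.group(1)) in (11, 17)
```

For example, `java_supported("1.8.0_292")` is false (before 8u371), while `java_supported("17.0.9")` is true.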
### Does this PR introduce _any_ user-facing change?
Yes, a documentation fix.
### How was this patch tested?
Manually
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #48411 from dvorst/branch-3.5.
Authored-by: d.vorst <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
python/docs/source/getting_started/install.rst | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/python/docs/source/getting_started/install.rst b/python/docs/source/getting_started/install.rst
index e97632a8b384..287e65400861 100644
--- a/python/docs/source/getting_started/install.rst
+++ b/python/docs/source/getting_started/install.rst
@@ -164,7 +164,7 @@ Package Supported version Note
`googleapis-common-protos` ==1.56.4 Required for Spark Connect
========================== ========================= ======================================================================================
-Note that PySpark requires Java 8 or later with ``JAVA_HOME`` properly set.
+Note that PySpark requires Java 8 (except prior to 8u371), 11 or 17 with ``JAVA_HOME`` properly set.
If using JDK 11, set ``-Dio.netty.tryReflectionSetAccessible=true`` for Arrow related features and refer to |downloading|_.
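For context, the JDK 11 flag mentioned in the context line above is a JVM system property. One way to pass it is via `spark-submit`'s standard `extraJavaOptions` configuration; the `JAVA_HOME` path and application name below are hypothetical:

```shell
# Sketch: enabling the Netty reflection flag on JDK 11 (paths are illustrative).
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk   # hypothetical install location
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true" \
  --conf "spark.executor.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true" \
  app.py  # hypothetical PySpark application
```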
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]