This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-2.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.3 by this push:
     new e6a008f  [MINOR][DOC] Use `Java 8` instead of `Java 8+` as a running environment
e6a008f is described below

commit e6a008f5cd39f5259cb4e289bc564a8554fdb277
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Aug 15 11:22:57 2019 -0700

    [MINOR][DOC] Use `Java 8` instead of `Java 8+` as a running environment
    
    Once Apache Spark 3.0.0 officially supports JDK11, people may try JDK11 on old Spark releases (especially 2.4.4/2.3.4) in the same way, because our documentation says `Java 8+`. We had better avoid that misleading situation.
    
    This PR aims to remove `+` from `Java 8+` in the documentation (master/2.4/2.3), especially for the 2.4.4 and 2.3.4 releases (cc kiszk).
    
    On the master branch, we will add JDK11 after [SPARK-24417](https://issues.apache.org/jira/browse/SPARK-24417).
    
    This is a documentation only change.
    
    <img width="923" alt="java8" src="https://user-images.githubusercontent.com/9700541/63116589-e1504800-bf4e-11e9-8904-b160ec7a42c0.png">
    
    Closes #25466 from dongjoon-hyun/SPARK-DOC-JDK8.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit 123eb58d61ad7c7ebe540f1634088696a3cc85bc)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 docs/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/index.md b/docs/index.md
index 2f00941..09dce56 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -26,7 +26,7 @@ Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It's easy
 locally on one machine --- all you need is to have `java` installed on your system `PATH`,
 or the `JAVA_HOME` environment variable pointing to a Java installation.
 
-Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark {{site.SPARK_VERSION}}
+Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark {{site.SPARK_VERSION}}
 uses Scala {{site.SCALA_BINARY_VERSION}}. You will need to use a compatible Scala version
 ({{site.SCALA_BINARY_VERSION}}.x).
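The one-line change above pins branch-2.3's documented runtime to Java 8. As a hedged illustration (not part of this commit), a script could extract the Java major version from a `java -version` string before launching an old Spark release; the helper below is hypothetical and assumes the two common version schemes, pre-JDK9 `1.8.0_x` and post-JDK9 `11.x.y`:

```shell
#!/bin/sh
# Hypothetical helper (not from the commit): print the Java major version
# from a version string, handling both "1.8.0_222" (JDK 8 style) and
# "11.0.4" (JDK 9+ style, per the newer version scheme).
java_major() {
  echo "$1" | awk -F'[._]' '{ if ($1 == 1) print $2; else print $1 }'
}

java_major "1.8.0_222"   # JDK 8 style -> 8
java_major "11.0.4"      # JDK 9+ style -> 11
```

In practice the string would come from `java -version 2>&1`, whose exact format varies by vendor, so treat this parsing as a sketch rather than a robust check.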
 


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
