This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.2 by this push:
     new 86b7621  [MINOR][DOCS] Add Scala 2.13 to index.md and building-spark.md
86b7621 is described below

commit 86b76211007aa5db09d283eae2d98bd12846a0bd
Author: William Hyun <[email protected]>
AuthorDate: Thu Nov 11 18:51:43 2021 -0800

    [MINOR][DOCS] Add Scala 2.13 to index.md and building-spark.md
    
    ### What changes were proposed in this pull request?
    This PR aims to add Scala 2.13 to index.md and building-spark.md.
    
    ### Why are the changes needed?
    Since Spark 3.2, Scala 2.13 is supported.
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    Manually review the generated webpage.
    
    Closes #34562 from williamhyun/web.
    
    Authored-by: William Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit d018503c2d6604b4ef5d7906fa8d445b07e6d056)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 docs/building-spark.md | 2 +-
 docs/index.md          | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 1d6a96e..dede456 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -28,7 +28,7 @@ license: |
 
 The Maven-based build is the build of reference for Apache Spark.
 Building Spark using Maven requires Maven 3.6.3 and Java 8.
-Spark requires Scala 2.12; support for Scala 2.11 was removed in Spark 3.0.0.
+Spark requires Scala 2.12/2.13; support for Scala 2.11 was removed in Spark 3.0.0.
 
 ### Setting up Maven's Memory Usage
 
diff --git a/docs/index.md b/docs/index.md
index 2e6fc99..4784d73 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -39,7 +39,7 @@ source, visit [Building Spark](building-spark.html).
 
 Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a supported version of Java. This should include JVMs on x86_64 and ARM64. It's easy to run locally on one machine --- all you need is to have `java` installed on your system `PATH`, or the `JAVA_HOME` environment variable pointing to a Java installation.
 
-Spark runs on Java 8/11, Scala 2.12, Python 3.6+ and R 3.5+.
+Spark runs on Java 8/11, Scala 2.12/2.13, Python 3.6+ and R 3.5+.
 Python 3.6 support is deprecated as of Spark 3.2.0.
 Java 8 prior to version 8u201 support is deprecated as of Spark 3.2.0.
 For the Scala API, Spark {{site.SPARK_VERSION}}
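For context, the Scala 2.13 support this patch documents is driven through the Maven build described in docs/building-spark.md. A rough sketch of the documented steps to build Spark against Scala 2.13 (the script path and profile name are taken from the Spark build docs; exact flags may vary by version):

```shell
# Rewrite the POMs to use Scala 2.13 instead of the default 2.12.
./dev/change-scala-version.sh 2.13

# Build with the scala-2.13 profile enabled; skip tests for a faster package.
./build/mvn -Pscala-2.13 -DskipTests clean package
```

Running `./dev/change-scala-version.sh 2.12` switches the tree back to the default Scala version.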

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
