Repository: zeppelin
Updated Branches:
  refs/heads/master 8dde8fb9a -> c5ab10ddd
ZEPPELIN-1599 Remove support for some old versions of Spark.

### What is this PR for?
Removing support for old versions of Spark, including testing and building against them.

### What type of PR is it?
[Feature]

### Todos
* [x] - Remove old Spark entries from .travis.yml
* [x] - Remove old Spark profiles from pom.xml
* [x] - Remove some docs

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-1599

### How should this be tested?
No test. Check that the Travis matrix is simplified.

### Screenshots (if appropriate)

### Questions:
* Do the license files need to be updated? No
* Are there breaking changes for older versions? Yes, Spark 1.1 through 1.3 can no longer be used.
* Does this need documentation? Yes, some docs should be removed.

Removed some profiles concerning old versions of Spark

Author: Jongyoul Lee <[email protected]>

Closes #1578 from jongyoul/ZEPPELIN-1599 and squashes the following commits:

acf514f [Jongyoul Lee] Fixed the script so it no longer recognizes old versions
4bc11d6 [Jongyoul Lee] Added some docs on the deprecation of support for old versions of Spark
207502d [Jongyoul Lee] Removed some tests for old versions of Spark

Removed some profiles concerning old versions of Spark

Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/c5ab10dd
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/c5ab10dd
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/c5ab10dd

Branch: refs/heads/master
Commit: c5ab10ddd421869d99472a80f97563eb2fca233f
Parents: 8dde8fb
Author: Jongyoul Lee <[email protected]>
Authored: Thu Nov 3 18:48:29 2016 +0900
Committer: Mina Lee <[email protected]>
Committed: Sat Nov 5 00:02:11 2016 +0900

----------------------------------------------------------------------
 .travis.yml              | 12 ------------
 README.md                |  5 +----
 docs/install/upgrade.md  |  1 +
 spark/pom.xml            | 32 --------------------------------
 testing/downloadSpark.sh |  2 +-
 5 files changed, 3 insertions(+), 49 deletions(-)
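The practical effect on build invocations can be sketched with a small helper. This is a hypothetical illustration, not part of Zeppelin; the profile names are taken from the README diff in this commit.

```shell
#!/bin/bash
# Hypothetical helper (not part of Zeppelin) illustrating the effect of this
# change: the -Pspark-1.1, -Pspark-1.2, and -Pspark-1.3 profiles no longer
# exist, so only Spark 1.4 and later map to a valid build profile.
spark_profile() {
  case "$1" in
    1.4.*) echo "-Pspark-1.4" ;;
    1.5.*) echo "-Pspark-1.5" ;;
    1.6.*) echo "-Pspark-1.6" ;;
    2.0.*) echo "-Pspark-2.0" ;;
    *)     echo "unsupported"; return 1 ;;
  esac
}

spark_profile "1.6.1"           # -Pspark-1.6
spark_profile "1.3.1" || true   # unsupported (was -Pspark-1.3 before this commit)
```

A build would then be invoked as, e.g., `mvn package -Pspark-1.6 -Phadoop-2.3 -Ppyspark -DskipTests`, matching the Travis env strings below.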
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/c5ab10dd/.travis.yml
----------------------------------------------------------------------
diff --git a/.travis.yml b/.travis.yml
index 80d4c04..3097593 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -58,18 +58,6 @@ matrix:
     - jdk: "oraclejdk7"
       env: SCALA_VER="2.10" SPARK_VER="1.4.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.4 -Pr -Phadoop-2.3 -Ppyspark -Psparkr" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark,r -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
 
-    # Test spark module for 1.3.1
-    - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" SPARK_VER="1.3.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.3 -Phadoop-2.3 -Ppyspark" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
-
-    # Test spark module for 1.2.2
-    - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" SPARK_VER="1.2.2" HADOOP_VER="2.3" PROFILE="-Pspark-1.2 -Phadoop-2.3 -Ppyspark" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
-
-    # Test spark module for 1.1.1
-    - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" SPARK_VER="1.1.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.1 -Phadoop-2.3 -Ppyspark" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
-
     # Test selenium with spark module for 1.6.1
     - jdk: "oraclejdk7"
       env: TEST_SELENIUM="true" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.6 -Phadoop-2.3 -Ppyspark -Pexamples" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.AbstractFunctionalSuite -DfailIfNoTests=false"

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/c5ab10dd/README.md
----------------------------------------------------------------------
diff --git a/README.md b/README.md
index be53b55..35cc911 100644
--- a/README.md
+++ b/README.md
@@ -128,9 +128,6 @@ Available profiles are
 -Pspark-1.6
 -Pspark-1.5
 -Pspark-1.4
--Pspark-1.3
--Pspark-1.2
--Pspark-1.1
 -Pcassandra-spark-1.5
 -Pcassandra-spark-1.4
 -Pcassandra-spark-1.3
@@ -192,7 +189,7 @@ enable 3rd party vendor repository (cloudera)
 
 ##### `-Pmapr[version]` (optional)
 
-For the MapR Hadoop Distribution, these profiles will handle the Hadoop version. As MapR allows different versions of Spark to be installed, you should specify which version of Spark is installed on the cluster by adding a Spark profile (`-Pspark-1.2`, `-Pspark-1.3`, etc.) as needed.
+For the MapR Hadoop Distribution, these profiles will handle the Hadoop version. As MapR allows different versions of Spark to be installed, you should specify which version of Spark is installed on the cluster by adding a Spark profile (`-Pspark-1.6`, `-Pspark-2.0`, etc.) as needed.
 
 The correct Maven artifacts can be found for every version of MapR at http://doc.mapr.com
 
 Available profiles are

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/c5ab10dd/docs/install/upgrade.md
----------------------------------------------------------------------
diff --git a/docs/install/upgrade.md b/docs/install/upgrade.md
index c218b4c..a203f6f 100644
--- a/docs/install/upgrade.md
+++ b/docs/install/upgrade.md
@@ -52,3 +52,4 @@ So, copying `notebook` and `conf` directory should be enough.
  - From 0.7, we don't use `ZEPPELIN_JAVA_OPTS` as default value of `ZEPPELIN_INTP_JAVA_OPTS` and also the same for `ZEPPELIN_MEM`/`ZEPPELIN_INTP_MEM`. If user want to configure the jvm opts of interpreter process, please set `ZEPPELIN_INTP_JAVA_OPTS` and `ZEPPELIN_INTP_MEM` explicitly. If you don't set `ZEPPELIN_INTP_MEM`, Zeppelin will set it to `-Xms1024m -Xmx1024m -XX:MaxPermSize=512m` by default.
  - Mapping from `%jdbc(prefix)` to `%prefix` is no longer available. Instead, you can use %[interpreter alias] with multiple interpreter setttings on GUI.
  - Usage of `ZEPPELIN_PORT` is not supported in ssl mode. Instead use `ZEPPELIN_SSL_PORT` to configure the ssl port. Value from `ZEPPELIN_PORT` is used only when `ZEPPELIN_SSL` is set to `false`.
+ - The support on Spark 1.1.x to 1.3.x is deprecated.
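The upgrade note above asks users to set the interpreter JVM options explicitly. A hypothetical `conf/zeppelin-env.sh` fragment might look like the following; the `ZEPPELIN_INTP_MEM` value is the documented default, while the `ZEPPELIN_INTP_JAVA_OPTS` value is purely illustrative.

```shell
# Hypothetical conf/zeppelin-env.sh fragment following the 0.7 upgrade note.
# ZEPPELIN_JAVA_OPTS/ZEPPELIN_MEM no longer flow through to the interpreter
# process, so the interpreter-side variables must be set explicitly.
export ZEPPELIN_INTP_JAVA_OPTS="-Dfile.encoding=UTF-8"                  # example only
export ZEPPELIN_INTP_MEM="-Xms1024m -Xmx1024m -XX:MaxPermSize=512m"     # documented default
```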
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/c5ab10dd/spark/pom.xml
----------------------------------------------------------------------
diff --git a/spark/pom.xml b/spark/pom.xml
index efb7452..46a46f1 100644
--- a/spark/pom.xml
+++ b/spark/pom.xml
@@ -451,38 +451,6 @@
   <profiles>
     <profile>
-      <id>spark-1.1</id>
-      <dependencies>
-
-      </dependencies>
-      <properties>
-        <spark.version>1.1.1</spark.version>
-        <akka.version>2.2.3-shaded-protobuf</akka.version>
-      </properties>
-    </profile>
-
-    <profile>
-      <id>spark-1.2</id>
-      <dependencies>
-      </dependencies>
-      <properties>
-        <spark.version>1.2.1</spark.version>
-      </properties>
-    </profile>
-
-    <profile>
-      <id>spark-1.3</id>
-
-      <properties>
-        <spark.version>1.3.1</spark.version>
-      </properties>
-
-      <dependencies>
-      </dependencies>
-
-    </profile>
-
-    <profile>
       <id>spark-1.4</id>
       <properties>
         <spark.version>1.4.1</spark.version>

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/c5ab10dd/testing/downloadSpark.sh
----------------------------------------------------------------------
diff --git a/testing/downloadSpark.sh b/testing/downloadSpark.sh
index 0575284..45b0b36 100755
--- a/testing/downloadSpark.sh
+++ b/testing/downloadSpark.sh
@@ -76,7 +76,7 @@ if [[ ! -d "${SPARK_HOME}" ]]; then
     echo "${SPARK_CACHE} does not have ${SPARK_ARCHIVE} downloading ..."
 
     # download archive if not cached
-    if [[ "${SPARK_VERSION}" = "1.1.1" || "${SPARK_VERSION}" = "1.2.2" || "${SPARK_VERSION}" = "1.3.1" || "${SPARK_VERSION}" = "1.4.1" ]]; then
+    if [[ "${SPARK_VERSION}" = "1.4.1" ]]; then
       echo "${SPARK_VERSION} being downloaded from archives"
       # spark old versions are only available only on the archives (prior to 1.5.2)
       STARTTIME=`date +%s`
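The downloadSpark.sh change above can be condensed into a standalone sketch. This is a hypothetical helper, not the script's actual code: of the versions still tested after this commit, only 1.4.1 predates 1.5.2 and therefore has to come from the Apache archive server; the mirror URL in the else branch is illustrative only.

```shell
#!/bin/bash
# Hypothetical condensation of the version check in testing/downloadSpark.sh
# after this commit: only Spark 1.4.1 (the sole remaining pre-1.5.2 version
# under test) must be fetched from the archive server.
spark_download_base() {
  if [[ "$1" = "1.4.1" ]]; then
    echo "https://archive.apache.org/dist/spark"            # old releases live here
  else
    echo "https://www.apache.org/dyn/closer.lua/spark"      # mirror redirector (illustrative)
  fi
}

spark_download_base "1.4.1"   # https://archive.apache.org/dist/spark
```

Before this commit, the same archive branch was also taken for 1.1.1, 1.2.2, and 1.3.1; removing those cases is what shrinks the condition to a single comparison.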
