This is an automated email from the ASF dual-hosted git repository.

felixcheung pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/zeppelin.git
The following commit(s) were added to refs/heads/master by this push:
     new 7bf79d8  [ZEPPELIN-3932] spark_mesos Dockerfile should be updated (#3279)
7bf79d8 is described below

commit 7bf79d80f5beaaafa3c0e18a96654e43a20a41f9
Author: keineahnung2345 <mimifasosofamire1...@gmail.com>
AuthorDate: Sat Jan 5 13:15:03 2019 +0800

    [ZEPPELIN-3932] spark_mesos Dockerfile should be updated (#3279)

### What is this PR for?
The [spark_mesos example - Dockerfile](https://github.com/apache/zeppelin/blob/master/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile) has several issues:

1. The previously used SPARK_VERSION=2.1.2 is no longer available for download; update it to 2.4.0.
2. There is no package named libevent2-devel; correct it to libevent-devel.
3. Update the base image from centos6 to centos7.
4. Update from JDK 7 to JDK 8.

In the [spark_mesos example - entrypoint.sh](https://github.com/apache/zeppelin/blob/master/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh):

1. Following the instructions in [zeppelin on spark mesos mode - configure spark interpreter in zeppelin](https://zeppelin.apache.org/docs/0.8.0/setup/deployment/spark_cluster_mode.html#4-configure-spark-interpreter-in-zeppelin-1), add the two environment variables MASTER and MESOS_NATIVE_JAVA_LIBRARY.
2. Add the --hostname flag to the mesos-master command. This fixes the "Failed to connect to xx.xx.xx.xx:5050" error (see the screenshot below).

### What type of PR is it?
- Bug Fix
- Improvement

### What is the Jira issue?
* [ZEPPELIN-3932](https://issues.apache.org/jira/browse/ZEPPELIN-3932)

### How should this be tested?
* Follow the instructions here: [spark-on-mesos-mode](https://zeppelin.apache.org/docs/0.8.0/setup/deployment/spark_cluster_mode.html#spark-on-mesos-mode)

### Screenshots (if appropriate)
![image](https://user-images.githubusercontent.com/18047300/50578759-fa513080-0e78-11e9-8459-3a2aa5881a2c.png)

* update mesos Dockerfile

  1. SPARK_VERSION 2.1.2 is not available anymore
  2. centos cannot find libevent2-devel, but libevent-devel
  3. update to centos7 and jdk8

* update entrypoint.sh according to tutorial

  1. add environment variables MASTER and MESOS_NATIVE_JAVA_LIBRARY (from https://zeppelin.apache.org/docs/0.8.0/setup/deployment/spark_cluster_mode.html#4-configure-spark-interpreter-in-zeppelin-1)
  2. add --hostname after mesos-master to solve the problem of "Failed to connect to xx.xx.xx.xx:5050"
---
 .../spark-cluster-managers/spark_mesos/Dockerfile | 26 +++++++++++-----------
 .../spark_mesos/entrypoint.sh                     |  4 +++-
 2 files changed, 16 insertions(+), 14 deletions(-)

diff --git a/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile b/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile
index 0afda57..0eb26ca 100644
--- a/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile
+++ b/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile
@@ -12,10 +12,10 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-FROM centos:centos6
+FROM centos:centos7

-ENV SPARK_PROFILE 2.1
-ENV SPARK_VERSION 2.1.2
+ENV SPARK_PROFILE 2.4
+ENV SPARK_VERSION 2.4.0
 ENV HADOOP_PROFILE 2.7
 ENV HADOOP_VERSION 2.7.0
@@ -29,15 +29,15 @@
 tar \
 curl \
 svn \
 cyrus-sasl-md5 \
-libevent2-devel \
+libevent-devel \
 && \
 yum clean all

 # Remove old jdk
 RUN yum remove java; yum remove jdk

-# install jdk7
-RUN yum install -y java-1.7.0-openjdk-devel
+# install jdk8
+RUN yum install -y java-1.8.0-openjdk-devel

 ENV JAVA_HOME /usr/lib/jvm/java
 ENV PATH $PATH:$JAVA_HOME/bin
@@ -45,14 +45,9 @@ ENV PATH $PATH:$JAVA_HOME/bin
 RUN curl -s http://www.apache.org/dist/spark/spark-$SPARK_VERSION/spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE.tgz | tar -xz -C /usr/local/
 RUN cd /usr/local && ln -s spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE spark

-# update boot script
-COPY entrypoint.sh /etc/entrypoint.sh
-RUN chown root.root /etc/entrypoint.sh
-RUN chmod 700 /etc/entrypoint.sh
-
 # install mesos
-RUN wget http://repos.mesosphere.com/el/6/x86_64/RPMS/mesos-1.0.0-2.0.89.centos65.x86_64.rpm
-RUN rpm -Uvh mesos-1.0.0-2.0.89.centos65.x86_64.rpm
+RUN wget http://repos.mesosphere.com/el/7/x86_64/RPMS/mesos-1.7.0-2.0.1.el7.x86_64.rpm
+RUN rpm -Uvh mesos-1.7.0-2.0.1.el7.x86_64.rpm

 #spark
 EXPOSE 8080 7077 7072 8081 8082
@@ -60,4 +55,9 @@ EXPOSE 8080 7077 7072 8081 8082
 #mesos
 EXPOSE 5050 5051

+# update boot script
+COPY entrypoint.sh /etc/entrypoint.sh
+RUN chown root.root /etc/entrypoint.sh
+RUN chmod 700 /etc/entrypoint.sh
+
 ENTRYPOINT ["/etc/entrypoint.sh"]
diff --git a/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh b/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh
index 28d76bf..2f9572b 100755
--- a/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh
+++ b/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh
@@ -20,6 +20,8 @@ export SPARK_MASTER_WEBUI_PORT=8080
 export SPARK_WORKER_PORT=8888
 export SPARK_WORKER_WEBUI_PORT=8081
 export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64/server/
+export MASTER=mesos://127.0.1.1:5050
+export MESOS_NATIVE_JAVA_LIBRARY=/usr/lib/libmesos.so

 # spark configuration
 cp $SPARK_HOME/conf/spark-env.sh.template $SPARK_HOME/conf/spark-env.sh
@@ -35,7 +37,7 @@ cd $SPARK_HOME/sbin
 ./start-slave.sh spark://`hostname`:$SPARK_MASTER_PORT

 # start mesos
-mesos-master --ip=0.0.0.0 --work_dir=/var/lib/mesos &> /var/log/mesos_master.log &
+mesos-master --ip=0.0.0.0 --work_dir=/var/lib/mesos --hostname=`hostname -I | cut -d' ' -f1` &> /var/log/mesos_master.log &
 mesos-slave --master=0.0.0.0:5050 --work_dir=/var/lib/mesos --launcher=posix &> /var/log/mesos_slave.log &

 CMD=${1:-"exit 0"}
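For context on the `--hostname` fix above: `hostname -I` prints all of the container's IP addresses separated by spaces, and `cut -d' ' -f1` keeps only the first one, so the Mesos master advertises a reachable address instead of 127.0.0.1. A minimal sketch of that extraction, using made-up example addresses in place of real `hostname -I` output:

```shell
#!/bin/sh
# Stand-in for `hostname -I` output inside a container (addresses are made up):
ips="172.17.0.2 10.0.0.5"

# Same extraction as in entrypoint.sh: keep only the first address.
first_ip=$(echo "$ips" | cut -d' ' -f1)
echo "$first_ip"   # prints 172.17.0.2
```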