I think you need the spark-1.2 and hadoop-2.4 profiles.

Please try:

mvn install -DskipTests -Pspark-1.2 -Dspark.version=1.2.1 -Phadoop-2.4 -Dhadoop.version=2.5.0
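
If your cluster is CDH 5.3, the CDH-specific Hadoop version string might also
be worth a try; this is just a sketch from my side, and it will only resolve
if the Cloudera Maven repository is reachable from your build:

mvn install -DskipTests -Pspark-1.2 -Dspark.version=1.2.1 -Phadoop-2.4 -Dhadoop.version=2.5.0-cdh5.3.0

Once the build succeeds, a trivial paragraph such as "%spark sc.version"
should tell you whether the interpreter can start at all, before trying
anything SQL-related.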

Thanks,
moon

On Fri, May 1, 2015 at 10:22 AM Sambit Tripathy (RBEI/EDS1) <
[email protected]> wrote:

>  Moon,
>
>
>
> This is what I have in my configuration
>
>
>
> export ZEPPELIN_INTERPRETERS=org.apache.zeppelin.spark.SparkInterpreter,org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkSqlInterpreter,org.apache.zeppelin.spark.DepInterpreter,org.apache.zeppelin.markdown.Markdown,org.apache.zeppelin.shell.ShellInterpreter,org.apache.zeppelin.hive.HiveInterpreter
>
> export ZEPPELIN_INTERPRETER_DIR=/home/sambit/incubator-zeppelin/interpreter
>
> export ZEPPELIN_PORT=8901
>
> export HADOOP_CONF_DIR=/usr/lib/hadoop/etc/hadoop
>
> export SPARK_YARN_JAR=/usr/lib/spark/lib/spark-assembly-1.2.0-cdh5.3.0-hadoop2.5.0-cdh5.3.0.jar
>
> export ZEPPELIN_NOTEBOOK_DIR=/home/sambit/zep-notebook-dir   # Where notebooks are saved
>
>
>
>
>
> I used this command
>
>
>
> mvn install -DskipTests -Dspark.version=1.2.1 -Dhadoop.version=2.5.0
>
> to build Zeppelin, as described on the website.
>
>
>
> That’s all.
>
>
>
> Should the -Dhadoop.version change to 2.5.0-cdh5.3.0?
>
>
>
> Regards,
>
> Sambit.
>
>
>
>
>
> From: moon soo Lee [mailto:[email protected]]
> Sent: Thursday, April 30, 2015 5:25 PM
> To: [email protected]
> Subject: Re: Scheduler already terminated error
>
>
>
> Hi,
>
>
>
> That error message can appear when Zeppelin fails to create a
> SparkContext. Could you check your Zeppelin configuration for the YARN
> cluster? How did you set up Zeppelin for your YARN cluster?
>
>
>
> For example: the Zeppelin build command against your Spark/Hadoop version,
> the Zeppelin interpreter settings, and the Hadoop/YARN configuration files.
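>
> For yarn-client mode, conf/zeppelin-env.sh typically ends up looking
> something like the sketch below (the paths here are placeholders, not
> taken from any real setup):
>
> # zeppelin-env.sh sketch for yarn-client mode; adjust paths to your cluster
> export MASTER=yarn-client
> export HADOOP_CONF_DIR=/etc/hadoop/conf   # must contain yarn-site.xml and core-site.xml
> export SPARK_YARN_JAR=/path/to/spark-assembly.jar   # location of the Spark assembly jar used on YARN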
>
>
>
> Thanks,
>
> moon
>
>
>
> On Fri, May 1, 2015 at 8:02 AM Sambit Tripathy (RBEI/EDS1) <
> [email protected]> wrote:
>
> Hi,
>
>
>
> After installation, I tried to run this simple Spark command and got this
> error. Any idea what it could be?
>
>
>
> Command: %spark val ctx = new org.apache.spark.sql.SqlContext(sc)
>
>
>
> Error:
>
>
>
>
>
>
>
> Scheduler already terminated
>
> org.apache.zeppelin.scheduler.RemoteScheduler.submit(RemoteScheduler.java:122)
> org.apache.zeppelin.notebook.Note.run(Note.java:271)
> org.apache.zeppelin.socket.NotebookServer.runParagraph(NotebookServer.java:531)
> org.apache.zeppelin.socket.NotebookServer.onMessage(NotebookServer.java:119)
> org.java_websocket.server.WebSocketServer.onWebsocketMessage(WebSocketServer.java:469)
> org.java_websocket.WebSocketImpl.decodeFrames(WebSocketImpl.java:368)
> org.java_websocket.WebSocketImpl.decode(WebSocketImpl.java:157)
> org.java_websocket.server.WebSocketServer$WebSocketWorker.run(WebSocketServer.java:657)
>
>
>
> ERROR
>
> What is the best way to verify that the Spark interpreter is working
> correctly? Is this a YARN error?
>
>
>
> PS: I am using YARN.
>
>
>
>
>
>
>
> Regards,
>
> Sambit.
>
>
>
>
>
>
>
