Re: Job failed: Implementing class

2016-12-07 Thread Jeff Zhang
Can you try building it with Spark 1.6 and Scala 2.10?
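
For reference, such a build would look roughly like this. This is a sketch
based on the Zeppelin 0.6.x build profiles (not verified against 0.6.1
exactly); pick the Hadoop profile that matches your cluster:

    # Run from a Zeppelin source checkout
    mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Pyarn -Pscala-2.10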

Facundo Bianco wrote on Thursday, December 8, 2016 at 8:32 AM:

> Jianfeng (Jeff) Zhang wrote:
> > Do you use the Zeppelin binary distribution, or did you build it yourself?
>
> I used the binary distribution with all interpreters.
>
>
> --
> Facundo Bianco
>


Re: Job failed: Implementing class

2016-12-07 Thread Jianfeng (Jeff) Zhang

Do you use the Zeppelin binary distribution, or did you build it yourself?


Best Regards,
Jeff Zhang


From: Facundo Bianco <facu...@grandata.com>
Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Date: Thursday, December 8, 2016 at 7:33 AM
To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Subject: Job failed: Implementing class
Subject: Job failed: Implementing class


Job failed: Implementing class

2016-12-07 Thread Facundo Bianco
Hi there,

On HDP 2.4 I've installed Zeppelin 0.6.1 with the Spark interpreter built with
Scala 2.10. (The Spark version is 1.6.1.)

All interpreters work well except the Spark interpreter, which fails. The
error in the log is:

> ERROR [2016-12-07 15:57:40,512] ({pool-2-thread-2} Job.java[run]:189) - Job failed
> java.lang.IncompatibleClassChangeError: Implementing class

(The full stack trace is here:
https://gist.github.com/vando/50bd0dbb970d0c2bd2fe13a6344109b8.)
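
For background: the JVM raises IncompatibleClassChangeError with the message
"Implementing class" when a class was compiled against an interface that is a
plain class on the runtime classpath, the classic symptom of mixing jars
built against different library versions. A minimal, self-contained
reproduction, using made-up class names and nothing but a JDK:

    # Version 1 of the "dependency": Shape is an interface.
    printf 'public interface Shape { double area(); }\n' > Shape.java
    printf 'public class Circle implements Shape { public double area() { return 3.14; } public static void main(String[] a) { System.out.println(new Circle().area()); } }\n' > Circle.java
    javac Shape.java Circle.java   # Circle.class now records "implements Shape"

    # Version 2 lands on the runtime classpath: Shape became a class.
    printf 'public class Shape { public double area() { return 0; } }\n' > Shape.java
    javac Shape.java               # recompile only the dependency

    java Circle
    # Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class

In a Zeppelin-on-HDP setup the mismatched jars are plausibly the Hadoop/Spark
classes bundled with the Zeppelin binary versus the ones HDP puts on the
classpath.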

In the zeppelin-env.sh file, the environment variables are:

> export MASTER=yarn-client
> export HADOOP_CONF_DIR="/etc/hadoop/conf"
> export ZEPPELIN_JAVA_OPTS="-Dhdp.version=2.4.2.0-258 -Dspark.yarn.queue=default"
> export SPARK_HOME="/usr/hdp/current/spark-client"
> export PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip"
> export SPARK_YARN_USER_ENV="PYTHONPATH=${PYTHONPATH}"
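
One thing worth checking with a setup like this is whether the Spark and
Scala versions on the cluster match what the Zeppelin binary bundles. A rough
sketch, assuming the paths above and the layout of the 0.6 binary
distribution ($ZEPPELIN_HOME is wherever it was unpacked):

    # Cluster side: prints the Spark version and the Scala it was built with
    "$SPARK_HOME"/bin/spark-submit --version
    # Zeppelin side: jars shipped with the Spark interpreter
    ls "$ZEPPELIN_HOME"/interpreter/spark "$ZEPPELIN_HOME"/interpreter/spark/dep 2>/dev/null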

Do you have any idea how to correct this error?

Thanks in advance.

Best,