Have you built Zeppelin against the versions of Hadoop & Spark you are
using?  It has to be built with the appropriate versions, as the build
pulls in the required libraries from Hadoop and Spark.  By default
Zeppelin will not work on YARN without doing this build.
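
For example, the build command looks something like this (the profile
names and version numbers below are assumptions, so substitute whatever
your cluster actually runs):

    # Build Zeppelin against a specific Spark/Hadoop combination.
    # The -Pspark-1.4, -Phadoop-2.6, and -Dhadoop.version values here
    # are examples; match them to the versions on your cluster.
    mvn clean package -Pspark-1.4 -Phadoop-2.6 -Dhadoop.version=2.6.0 \
        -Pyarn -DskipTests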

@deepujain posted a fairly comprehensive guide on the forum yesterday,
August 4th, with the steps to take to deploy under Hadoop and YARN.
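
Once you have a matching build, the two settings you mention are usually
set in conf/zeppelin-env.sh as well.  A minimal sketch, assuming a
typical install (the paths below are placeholders):

    # conf/zeppelin-env.sh (example values only; adjust for your cluster)
    export MASTER=yarn-client                 # same as the interpreter setting in the UI
    export HADOOP_CONF_DIR=/etc/hadoop/conf   # dir with core-site.xml, yarn-site.xml, etc.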

HTH.

-Todd


On Wed, Aug 5, 2015 at 2:35 AM, manya cancerian <manyacancer...@gmail.com>
wrote:

> Hi guys,
>
> I am trying to run Zeppelin using YARN as the resource manager. I have
> made the following changes:
>
> 1. I have specified master as 'yarn-client' in the interpreter settings
> using the UI.
> 2. I have specified HADOOP_CONF_DIR as the conf directory containing the
> Hadoop configuration files.
>
> In my scenario I have three machines:
> a. The client machine, where Zeppelin is installed
> b. The machine where the YARN resource manager is running, along with the
> Hadoop cluster namenode and a datanode
> c. A machine running a datanode
>
>
> When I submit a job from my client machine, it gets submitted to YARN but
> fails with the following exception:
>
>
> 15/08/04 15:08:05 ERROR yarn.ApplicationMaster: Uncaught exception:
> org.apache.spark.SparkException: Failed to connect to driver!
>       at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:424)
>       at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:284)
>       at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:146)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:575)
>       at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
>       at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>       at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:573)
>       at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:596)
>       at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
> 15/08/04 15:08:05 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Failed to connect to driver!)
>
>
>
> Any help is much appreciated!
>
>
> Regards
>
> Manya
>
>
