Please find the attached pom.xml. I am using Maven to build the fat jar and
am trying to run it on YARN with:

hadoop jar simple-yarn-app-master/target/simple-yarn-app-1.1.0-shaded.jar \
  com.hortonworks.simpleyarnapp.Client \
  hdfs://quickstart.cloudera:8020/simple-yarn-app-1.1.0-shaded.jar

Basically I am following the code below, with the Application Master
changed to run a Spark application class.

https://github.com/hortonworks/simple-yarn-app

It works for 1.3, the version installed in CDH, but throws an error for 1.4.
Since I am bundling Spark inside the jar, that shouldn't be the case, right?
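
To answer the question below directly: in the attached pom.xml the Spark
dependencies are included in the fat jar; nothing is marked as provided.
If I followed Adrian's approach instead, I believe the dependency would
look something like this (the same artifact as in my pom, with only the
scope line added, so that the shade plugin leaves Spark out of the jar):

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
    <scope>provided</scope>
  </dependency>

With provided scope, the Spark classes would have to come from a Spark 1.4
installation on the cluster at runtime rather than from the shaded jar.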



On Wed, Oct 21, 2015 at 5:11 PM, Adrian Tanase <atan...@adobe.com> wrote:

> The question is whether the spark dependency is marked as provided or is
> included in the fat jar.
>
> For example, we are compiling the spark distro separately for java 8 +
> scala 2.11 + hadoop 2.6 (with maven) and marking it as provided in sbt.
>
> -adrian
>
> From: Raghuveer Chanda
> Date: Wednesday, October 21, 2015 at 2:14 PM
> To: Jean-Baptiste Onofré
> Cc: "user@spark.apache.org"
> Subject: Re: Spark on Yarn
>
> Hi,
>
> So does this mean I can't run a Spark 1.4 fat jar on YARN without
> installing Spark 1.4?
>
> I am including Spark 1.4 in my pom.xml, so doesn't this mean it is
> compiled against 1.4?
>
>
> On Wed, Oct 21, 2015 at 4:38 PM, Jean-Baptiste Onofré <j...@nanthrax.net>
> wrote:
>
>> Hi
>>
>> The compiled version (master side) and the client version diverge on the
>> Spark network JavaUtils class. You should use the same/aligned version.
>>
>> Regards
>> JB
>>
>>
>>
>> -------- Original message --------
>> From: Raghuveer Chanda <raghuveer.cha...@gmail.com>
>> Date: 21/10/2015 12:33 (GMT+01:00)
>> To: user@spark.apache.org
>> Subject: Spark on Yarn
>>
>> Hi all,
>>
>> I am trying to run Spark on YARN in the Cloudera quickstart VM. It
>> already has Spark 1.3 and Hadoop 2.6.0-cdh5.4.0 installed. (I am not
>> using spark-submit, since I want to run a different version of Spark.)
>>
>> I am able to run Spark 1.3 on YARN, but I get the error below for Spark 1.4.
>>
>> The log shows it is running Spark 1.4, yet it fails on a method that is
>> present in 1.4 and not in 1.3, even though the fat jar contains the 1.4
>> class files.
>>
>> As far as running on YARN goes, the installed Spark version shouldn't
>> matter, but it is still picking up the other version.
>>
>>
>> *Hadoop Version:*
>> Hadoop 2.6.0-cdh5.4.0
>> Subversion http://github.com/cloudera/hadoop -r
>> c788a14a5de9ecd968d1e2666e8765c5f018c271
>> Compiled by jenkins on 2015-04-21T19:18Z
>> Compiled with protoc 2.5.0
>> From source with checksum cd78f139c66c13ab5cee96e15a629025
>> This command was run using
>> /usr/lib/hadoop/hadoop-common-2.6.0-cdh5.4.0.jar
>>
>> *Error:*
>> LogType:stderr
>> Log Upload Time:Tue Oct 20 21:58:56 -0700 2015
>> LogLength:2334
>> Log Contents:
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/10/simple-yarn-app-1.1.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> 15/10/20 21:58:50 INFO spark.SparkContext: *Running Spark version 1.4.0*
>> 15/10/20 21:58:53 INFO spark.SecurityManager: Changing view acls to: yarn
>> 15/10/20 21:58:53 INFO spark.SecurityManager: Changing modify acls to:
>> yarn
>> 15/10/20 21:58:53 INFO spark.SecurityManager: SecurityManager:
>> authentication disabled; ui acls disabled; users with view permissions:
>> Set(yarn); users with modify permissions: Set(yarn)
>> *Exception in thread "main" java.lang.NoSuchMethodError:
>> org.apache.spark.network.util.JavaUtils.timeStringAsSec(Ljava/lang/String;)J*
>> at org.apache.spark.util.Utils$.timeStringAsSeconds(Utils.scala:1027)
>> at org.apache.spark.SparkConf.getTimeAsSeconds(SparkConf.scala:194)
>> at
>> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:68)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>> at
>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
>> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>> at
>> org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
>> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
>> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
>> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
>> at
>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>> at com.hortonworks.simpleyarnapp.HelloWorld.main(HelloWorld.java:50)
>> 15/10/20 21:58:53 INFO util.Utils: Shutdown hook called
>>
>> Please help :)
>>
>> --
>> Regards and Thanks,
>> Raghuveer Chanda
>>
>
>
>
> --
> Regards,
> Raghuveer Chanda
> Computer Science and Engineering
> IIT Kharagpur
> +91-9475470374
>



-- 
Regards,
Raghuveer Chanda
Computer Science and Engineering
IIT Kharagpur
+91-9475470374
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.hortonworks</groupId>
  <artifactId>simple-yarn-app</artifactId>
  <version>1.1.0</version>
  <name>simple-yarn-app</name>
 
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-yarn-client</artifactId>
      <version>2.2.0</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.2.0</version>
    </dependency>
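    <!-- Note: the cluster in this thread runs Hadoop 2.6.0-cdh5.4.0
         (see the Hadoop version output above), while these client
         artifacts are 2.2.0. -->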
    
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-yarn_2.10</artifactId>
      <version>1.4.0</version>
    </dependency>

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.4.0</version>
    </dependency>
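    <!-- These Spark 1.4.0 artifacts are bundled into the shaded jar
         rather than marked as provided; the thread above is about which
         copy of these classes wins on the YARN classpath at runtime. -->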
  </dependencies>


  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <filters>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                  <resource>reference.conf</resource>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"/>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

</project>