I followed the documentation at 
http://spark.incubator.apache.org/docs/latest/running-on-yarn.html 
to launch a Spark app in yarn-client mode.

This is my command:

SPARK_JAR=/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar \
SPARK_YARN_APP_JAR=examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar \
./run-example org.apache.spark.examples.SparkPi yarn-client



Then I got an error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/root/spark/spark-0.8.0-incubating/examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/root/spark/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.3.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/24 15:39:28 INFO Slf4jEventHandler: Slf4jEventHandler started
13/12/24 15:39:29 INFO SparkEnv: Registering BlockManagerMaster
13/12/24 15:39:29 INFO MemoryStore: MemoryStore started with capacity 1256.9 MB.
13/12/24 15:39:29 INFO DiskStore: Created local directory at /tmp/spark-local-20131224153929-fec4
13/12/24 15:39:29 INFO ConnectionManager: Bound socket to port 42147 with id = ConnectionManagerId(kmHadoop3,42147)
13/12/24 15:39:29 INFO BlockManagerMaster: Trying to register BlockManager
13/12/24 15:39:29 INFO BlockManagerMaster: Registered BlockManager
13/12/24 15:39:30 INFO HttpBroadcast: Broadcast server started at http://111.111.11.11:54314
13/12/24 15:39:30 INFO SparkEnv: Registering MapOutputTracker
13/12/24 15:39:30 INFO HttpFileServer: HTTP File server directory is /tmp/spark-4bfef083-f05a-485c-8e1d-fa7518eb7116
13/12/24 15:39:32 INFO SparkUI: Started Spark Web UI at http://testSpark:4040
13/12/24 15:39:40 INFO SparkContext: Added JAR /root/spark/spark-0.8.0-incubating/examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar at http://111.111.11.11:38654/jars/spark-examples-assembly-0.8.0-incubating.jar with timestamp 1387870780009
13/12/24 15:39:40 WARN SparkContext: Master yarn-client does not match expected format, parsing as Mesos URL
Failed to load native Mesos library from /usr/java/jdk1.6.0_31/jre/lib/amd64/server:/usr/java/jdk1.6.0_31/jre/lib/amd64:/usr/java/jdk1.6.0_31/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1738)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1028)
        at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:52)
        at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:64)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:216)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)


Look at the WARN line in the log above: "Master yarn-client does not match 
expected format, parsing as Mesos URL".

Why does yarn-client not match the expected format? That is exactly what 
the documentation suggests, right?
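My guess from the warning is that the master string is matched against a fixed set of patterns, and anything unrecognized falls through to a Mesos branch. Here is a minimal sketch of that kind of dispatch (hypothetical names and patterns, not the actual Spark 0.8.0 source) showing why an unknown string like "yarn-client" would end up being parsed as a Mesos URL:

```scala
// Hypothetical sketch of master-string dispatch: each known form is tried
// in turn, and any string that matches none of them falls through to the
// Mesos branch. (Names and patterns are illustrative, not Spark's source.)
object MasterDispatch {
  private val LocalN   = """local\[([0-9]+)\]""".r
  private val SparkUrl = """spark://(.*)""".r

  def dispatch(master: String): String = master match {
    case "local"           => "local scheduler"
    case LocalN(threads)   => s"local scheduler with $threads threads"
    case SparkUrl(addr)    => s"standalone scheduler at $addr"
    case "yarn-standalone" => "YARN scheduler"
    case other             => s"parsing as Mesos URL: $other" // fall-through
  }

  def main(args: Array[String]): Unit = {
    // "yarn-client" matches no pattern, so it hits the Mesos fall-through
    println(MasterDispatch.dispatch("yarn-client"))
  }
}
```

If that is how it works, the error would mean my Spark build simply does not recognize "yarn-client" as a master string, rather than anything being wrong with the command itself.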

My Spark version is 0.8.0, built with Hadoop (-hadoop2.0.0-cdh4.3.0.jar), 
as the log above shows.

Any help or reply is very much appreciated! Thanks very much.




-- 
You received this message because you are subscribed to the Google Groups 
"Spark Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.
