The latter one sounds to me like it was built by mvn?

Best Regards,
Raymond Liu

From: [email protected] [mailto:[email protected]] 
Sent: Thursday, December 12, 2013 2:02 PM
To: user
Subject: Re: Re: I need some help

What will be cleaned if I compile Spark with sbt/sbt clean assembly?
Actually I found a problem in my build output directory 
sparkhome/assembly/target/scala-2.9.3 : there are two jars, 
spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.2.1.jar and 
spark-assembly_2.9.3-0.8.0-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar , and 
compute-classpath.sh builds the classpath with ASSEMBLY_JAR=`ls 
"$FWDIR"/assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*.jar` , so 
both jars end up in the classpath, separated by a space.
Has anybody encountered the same problem?
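That glob does look like the culprit: with two matching jars in the directory, the backquoted `ls` returns both paths, and they are joined into one classpath value. A minimal sketch of the behaviour, using freshly created empty files with the same names as the jars in this thread (the temp directory is made up for illustration):

```shell
# Minimal sketch: two assembly jars match the same glob that
# compute-classpath.sh uses, so ASSEMBLY_JAR ends up holding both paths.
DEMO=$(mktemp -d)
mkdir -p "$DEMO/assembly/target/scala-2.9.3"
touch "$DEMO/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.2.1.jar"
touch "$DEMO/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar"

# Same pattern as compute-classpath.sh (with SCALA_VERSION=2.9.3):
ASSEMBLY_JAR=$(ls "$DEMO"/assembly/target/scala-2.9.3/spark-assembly*hadoop*.jar)

# Both paths come back in one variable:
echo "$ASSEMBLY_JAR" | wc -l    # 2 -- the classpath gets both, joined by whitespace
```

One fix is to delete the stale mr1 jar (or run sbt/sbt clean before reassembling) so that exactly one jar matches the glob.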
 
________________________________________
[email protected]
 
From: Paco Nathan
Date: 2013-12-12 04:25
To: user
Subject: Re: I need some help
did you try using: 

   sbt/sbt clean assembly

On Tue, Dec 10, 2013 at 10:23 PM, [email protected] <[email protected]> 
wrote:
I have deployed two Spark clusters .
The first is a simple standalone cluster, which is working well . ( sbt/sbt 
assembly )
But for the second cluster I built Spark against Hadoop 2.0.0-cdh4.2.1, and there 
seems to be a problem when I start the master ! ( 
SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1  sbt/sbt assembly  )
 
[lh1@ocnosql84 bin]$ start-master.sh
starting org.apache.spark.deploy.master.Master, logging to 
/home/lh1/spark_hadoopapp/spark-0.8.0-incubating-bin-cdh4/bin/../logs/spark-lh1-org.apache.spark.deploy.master.Master-1-ocnosql84.out
failed to launch org.apache.spark.deploy.master.Master:
  [Loaded java.lang.Shutdown from /home/lh1/app/jdk1.7.0/jre/lib/rt.jar]
  [Loaded java.lang.Shutdown$Lock from /home/lh1/app/jdk1.7.0/jre/lib/rt.jar]
full log in 
/home/lh1/spark_hadoopapp/spark-0.8.0-incubating-bin-cdh4/bin/../logs/spark-lh1-org.apache.spark.deploy.master.Master-1-ocnosql84.out
 
It seems that the Master failed to load, but the class 
org.apache.spark.deploy.master.Master does exist in 
spark-0.8.0-incubating-bin-cdh4/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.2.1.jar
 . I set the parameter SPARK_DAEMON_JAVA_OPTS = -verbose:class , and the logs 
 show :
 
Spark Command: /home/lh1/app/jdk1.7.0/bin/java -cp 
:/home/lh1/spark_hadoopapp/spark-0.8.0-incubating-bin-cdh4/conf:/home/lh1/spark_hadoopapp/spark-0.8.0-incubating-bin-cdh4/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.2.1.jar
/home/lh1/spark_hadoopapp/spark-0.8.0-incubating-bin-cdh4/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar
 -verbose:class -Djava.library.path= -Xms512m -Xmx512m 
org.apache.spark.deploy.master.Master --ip ocnosql84 --port 7077 --webui-port 
8080
========================================
 
......
 
[Loaded java.text.Format$Field from /home/lh1/app/jdk1.7.0/jre/lib/rt.jar]
[Loaded java.text.MessageFormat$Field from 
/home/lh1/app/jdk1.7.0/jre/lib/rt.jar]
Error: Could not find or load main class org.apache.spark.deploy.master.Master
[Loaded java.lang.Shutdown from /home/lh1/app/jdk1.7.0/jre/lib/rt.jar]
[Loaded java.lang.Shutdown$Lock from /home/lh1/app/jdk1.7.0/jre/lib/rt.jar]
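The Spark Command line above suggests why the class can't be found even though the jar contains it: the two jar paths are joined by whitespace into a single -cp value, and a classpath entry containing a space is treated as one (nonexistent) path, so the jar that actually holds the Master class is never opened. A tiny reproduction, assuming a JDK on the PATH (the Main class and temp directory below are hypothetical, purely to show the behaviour):

```shell
# Hypothetical demo: whitespace inside a classpath entry makes the whole
# entry an invalid path, producing "Could not find or load main class".
DEMO=$(mktemp -d) && cd "$DEMO"
cat > Main.java <<'EOF'
public class Main {
    public static void main(String[] args) { System.out.println("ok"); }
}
EOF
javac Main.java

java -cp "$DEMO" Main            # works: prints ok
java -cp "$DEMO extra" Main      # fails: the entry "$DEMO extra" is not a real path
```

That is consistent with the two-jar glob problem mentioned earlier in the thread: removing the extra jar so only one matches should make the classpath a single valid entry again.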
 
Thanks !
 
 
 
________________________________________
[email protected]
