I found the issue: my app was looking for the wrong Spark jar.
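For anyone hitting the same thing, a quick sketch for finding which assembly jar actually exists under the install root (the `/opt/spark-0.8.0` path is the one from this thread; adjust for your setup), so you can compare it against the jar name your app is pointing at:

```shell
# Sketch: list whichever assembly jar(s) actually exist under the Spark
# install root, to compare against the jar the executor classpath expects.
# SPARK_HOME below is the install path mentioned in this thread.
SPARK_HOME=/opt/spark-0.8.0
find "$SPARK_HOME/assembly" -name 'spark-assembly*.jar' 2>/dev/null
```

If the jar `find` prints differs from the one in the executor command's `-cp`, that mismatch is the likely cause of the ClassNotFound-style error.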

Thanks,
Hussam

From: Tathagata Das [mailto:[email protected]]
Sent: Monday, January 20, 2014 6:17 PM
To: [email protected]
Subject: Re: Error: Could not find or load main class 
org.apache.spark.executor.CoarseGrainedExecutorBackend

Hi Hussam,

Have you (1) generated the Spark jar using sbt/sbt assembly, and (2) distributed 
the Spark jar to the worker machines? It could be that the system expects the 
Spark jar to be present at 
/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar
 on one of the worker machines, but it's not finding the jar and hence not 
finding the necessary class. Can you double-check whether the jar exists at 
that location on all the worker nodes?

TD

On Mon, Jan 20, 2014 at 4:44 PM, 
<[email protected]<mailto:[email protected]>> wrote:
Hi,

I am using Spark 0.8.0 with Hadoop 1.2.1 in standalone cluster mode, with 3 
worker nodes and 1 master.

Can someone help me with this error I am getting when running my app on the 
Spark cluster?
Error: Could not find or load main class 
org.apache.spark.executor.CoarseGrainedExecutorBackend

The command on the worker node is:

Spark Executor Command: "java" "-cp" 
":/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar"
 "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" 
"-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" 
"-Xms49152M" "-Xmx49152M" 
"org.apache.spark.executor.CoarseGrainedExecutorBackend" 
"akka://spark@poc1:54483/user/CoarseGrainedScheduler" "2" "poc3" "16"

I checked the logs on the Spark master as well as the Spark workers, but there 
is not much info beyond the above error.

Thanks,
Hussam
