I've run into the same problem. Did you find a solution?

I tried to run SparkPi, but the executors all get killed.

./bin/run-example org.apache.spark.examples.SparkPi spark://cse-node1:7077

There's no error message in stderr or stdout other than the java launch command.
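Aaron's suggestion below (checking the spark/work/ directory on the slaves) can be scripted. Here's a minimal sketch; the helper name is mine, and it assumes the default standalone layout of work/<app-id>/<executor-id>/{stdout,stderr}:

```shell
#!/bin/sh
# Dump stderr/stdout of the most recently launched application's executors
# from a standalone worker's work/ directory.
dump_latest_executor_logs() {
  work_dir="$1"
  # Newest app directory by modification time, e.g. app-20131119180313-0001
  app_dir="$work_dir/$(ls -t "$work_dir" | head -n 1)"
  for f in "$app_dir"/*/stderr "$app_dir"/*/stdout; do
    [ -f "$f" ] && { echo "== $f =="; cat "$f"; }
  done
}
```

For example: `dump_latest_executor_logs /homes/network/revtr/ujaved/incubator-spark/work` on each slave, run right after the job hangs, should surface any exception the executor printed before it was killed.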




On Thu, Dec 12, 2013 at 3:05 PM, Aaron Davidson <[email protected]> wrote:

> You might also check the spark/work/ directory for application (Executor)
> logs on the slaves.
>
>
> On Tue, Nov 19, 2013 at 6:13 PM, Umar Javed <[email protected]> wrote:
>
>> I have a scala script that I'm trying to run on a Spark standalone
>> cluster with just one worker (existing on the master node). But the
>> application just hangs. Here's the worker log output at the time of
>> starting the job:
>>
>> 13/11/19 18:03:13 INFO Worker: Asked to launch executor
>> app-20131119180313-0001/0 for job
>> 13/11/19 18:03:13 INFO ExecutorRunner: Launch command: "java" "-cp"
>> ":/homes/network/revtr/ujaved/incubator-spark/conf:/homes/network/revtr/ujaved/incubator-spark/assembly/target/scala\
>> -2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar"
>> "-Xms512M" "-Xmx512M" "org.apache.spark.executor.StandaloneExecutorBackend"
>> "akka://[email protected]:57653\
>> /user/StandaloneScheduler" "0" "drone.cs.washington.edu" "16"
>> "app-20131119180313-0001"
>> 13/11/19 18:03:13 INFO Worker: Asked to kill executor
>> app-20131119180313-0001/0
>> 13/11/19 18:03:13 INFO ExecutorRunner: Killing process!
>> 13/11/19 18:03:13 INFO Worker: Executor app-20131119180313-0001/0
>> finished with state KILLED
>> 13/11/19 18:03:13 INFO ExecutorRunner: Redirection to
>> /homes/network/revtr/ujaved/incubator-spark/work/app-20131119180313-0001/0/stdout
>> closed: Stream closed
>> 13/11/19 18:03:13 INFO ExecutorRunner: Redirection to
>> /homes/network/revtr/ujaved/incubator-spark/work/app-20131119180313-0001/0/stderr
>> closed: Bad file descriptor
>> 13/11/19 18:03:13 ERROR Worker: key not found: app-20131119180313-0001/0
>>
>>
>> Why is the worker killed as soon as it is started? I should mention I
>> don't have this problem when using pyspark.
>>
>> thanks!
>> Umar
>>
>
>


-- 
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A.
43210
