Not sure if this will help, or if you've already tried it, but setting the
log levels to debug may give you more information.
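
For example (a minimal sketch, assuming the stock
conf/log4j.properties.template that ships with Spark; your build may
differ): copy the template to conf/log4j.properties on the worker node,
raise the root level, and restart the worker:

    # conf/log4j.properties -- hypothetical edit based on Spark's shipped template
    # Change the root level from INFO to DEBUG to get more detail in the worker logs
    log4j.rootCategory=DEBUG, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n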

Hope this helps.

-Cesar


On Tue, Dec 10, 2013 at 8:40 PM, Umar Javed <[email protected]> wrote:

> Any help regarding this? Thanks.
>
>
> On Tue, Nov 19, 2013 at 6:13 PM, Umar Javed <[email protected]> wrote:
>
>> I have a scala script that I'm trying to run on a Spark standalone
>> cluster with just one worker (existing on the master node). But the
>> application just hangs. Here's the worker log output at the time of
>> starting the job:
>>
>> 13/11/19 18:03:13 INFO Worker: Asked to launch executor
>> app-20131119180313-0001/0 for job
>> 13/11/19 18:03:13 INFO ExecutorRunner: Launch command: "java" "-cp"
>> ":/homes/network/revtr/ujaved/incubator-spark/conf:/homes/network/revtr/ujaved/incubator-spark/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar"
>> "-Xms512M" "-Xmx512M" "org.apache.spark.executor.StandaloneExecutorBackend"
>> "akka://[email protected]:57653/user/StandaloneScheduler" "0"
>> "drone.cs.washington.edu" "16" "app-20131119180313-0001"
>> 13/11/19 18:03:13 INFO Worker: Asked to kill executor
>> app-20131119180313-0001/0
>> 13/11/19 18:03:13 INFO ExecutorRunner: Killing process!
>> 13/11/19 18:03:13 INFO Worker: Executor app-20131119180313-0001/0
>> finished with state KILLED
>> 13/11/19 18:03:13 INFO ExecutorRunner: Redirection to
>> /homes/network/revtr/ujaved/incubator-spark/work/app-20131119180313-0001/0/stdout
>> closed: Stream closed
>> 13/11/19 18:03:13 INFO ExecutorRunner: Redirection to
>> /homes/network/revtr/ujaved/incubator-spark/work/app-20131119180313-0001/0/stderr
>> closed: Bad file descriptor
>> 13/11/19 18:03:13 ERROR Worker: key not found: app-20131119180313-0001/0
>>
>>
>> Why is the executor killed as soon as it is launched? I should mention
>> that I don't have this problem when using pyspark.
>>
>> thanks!
>> Umar
>>
>
>
