Yes, this file is available at this path on the same machine where I'm
running Spark. I later copied the spark-1.4.1 folder to all the other
machines in my cluster, but I'm still facing the same issue.
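
For reference, with a file:/ URL the YARN NodeManagers localize the zip from
each node's local filesystem, so the path has to exist and be readable by the
YARN user on every node, not only on the machine where spark-submit runs.
Below is a minimal sketch of two things worth checking; slave1/slave2, the
/spark-libs HDFS directory, and my_streaming_job.py are placeholder names,
not anything taken from this thread:

    # Check that the zip exists at the same path on every node
    # (slave1/slave2 are placeholder hostnames for the two workers):
    for host in slave1 slave2; do
      ssh "$host" 'ls -l /home/hdfs/spark-1.4.1/python/lib/pyspark.zip'
    done

    # A commonly suggested alternative on YARN: upload the Python libs to
    # HDFS so every NodeManager localizes them from a shared location, and
    # hand them to spark-submit with the standard --py-files flag:
    hdfs dfs -mkdir -p /spark-libs
    hdfs dfs -put /home/hdfs/spark-1.4.1/python/lib/pyspark.zip /spark-libs/
    hdfs dfs -put /home/hdfs/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip /spark-libs/

    spark-submit \
      --master yarn-cluster \
      --py-files hdfs:///spark-libs/pyspark.zip,hdfs:///spark-libs/py4j-0.8.2.1-src.zip \
      my_streaming_job.py   # placeholder for the actual streaming script

Whether --py-files alone resolves the localization error depends on the
setup, so treat this as a checklist rather than a guaranteed fix.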


*Thanks*,
Ramkumar V
<https://in.linkedin.com/in/ramkumarcs31>


On Thu, Aug 13, 2015 at 1:17 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Just make sure this file is available:
>
> appattempt_1437639737006_3808_000002 exited with  exitCode: -1000 due to:
> File *file:/home/hdfs/spark-1.4.1/python/lib/pyspark.zip* does not exist
>
> Thanks
> Best Regards
>
> On Thu, Aug 13, 2015 at 12:20 PM, Ramkumar V <ramkumar.c...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I have a cluster with one master and two slaves. I'm running a Spark
>> Streaming job on the master and I want to utilize all the nodes in my
>> cluster. I have specified some parameters, such as driver memory and
>> executor memory, in my code. When I pass --deploy-mode cluster --master
>> yarn-cluster to spark-submit, it gives the following error.
>>
>> Log link: http://pastebin.com/kfyVWDGR
>>
>> How can I fix this issue? Please let me know if I'm doing something wrong.
>>
>>
>> *Thanks*,
>> Ramkumar V
>>
>>
>
