Here is my updated spark-submit command, still without any luck:

spark-submit --master yarn --deploy-mode cluster \
  --files /appl/common/ftp/conf.json,/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml \
  --num-executors 6 --executor-cores 3 \
  --driver-cores 3 --driver-memory 7g --executor-memory 7g \
  /appl/common/ftp/ftp_event_data.py /appl/common/ftp/conf.json 2021-05-10 7
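A minimal sketch of the likely fix, assuming the script parses the JSON config before building the SparkSession (the `load_conf` helper and the `"host"` key are hypothetical, not from the original script). In YARN cluster mode the driver runs on a cluster node, so a client-local path like /appl/common/ftp/conf.json does not exist there; files shipped with --files are placed in the container's working directory and should be opened by basename:

```python
import json
import os

def load_conf(path):
    """Load a JSON config, preferring the basename: a file shipped with
    --files lands in the YARN container's current working directory, so
    the client-local absolute path will not exist on the driver node in
    cluster mode."""
    candidate = os.path.basename(path)
    if os.path.exists(candidate):
        path = candidate
    with open(path) as f:
        return json.load(f)
```

With this, passing either `conf.json` or the full client-side path as the first script argument would work in both client and cluster mode.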

On Fri, May 14, 2021 at 6:19 PM KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:

> Sorry, my bad, it did not resolve the issue. I still have the same issue.
> Can anyone please guide me? I was still running in client mode instead of
> cluster mode.
>
> On Fri, May 14, 2021 at 5:05 PM KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
>
>> You are right. It worked but I still don't understand why I need to pass
>> that to all executors.
>>
>> On Fri, May 14, 2021 at 5:03 PM KhajaAsmath Mohammed <
>> mdkhajaasm...@gmail.com> wrote:
>>
>>> I am using the JSON only to read properties before creating the Spark
>>> session. I don't know why we need to pass it to all executors.
>>>
>>>
>>> On Fri, May 14, 2021 at 5:01 PM Longjiang.Yang <
>>> longjiang.y...@target.com> wrote:
>>>
>>>> Could you check whether this file is accessible on the executors? (Is it
>>>> in HDFS or in the client's local FS?)
>>>> /appl/common/ftp/conf.json
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> *From: *KhajaAsmath Mohammed <mdkhajaasm...@gmail.com>
>>>> *Date: *Friday, May 14, 2021 at 4:50 PM
>>>> *To: *"user @spark" <user@spark.apache.org>
>>>> *Subject: *[EXTERNAL] Urgent Help - Py Spark submit error
>>>>
>>>>
>>>>
>>>> /appl/common/ftp/conf.json
>>>>
>>>
