Hi,

once again, let's start with the requirement. Why are you trying to pass XML
and JSON files to Spark instead of reading them in Spark?
Generally, when people pass files along with a job, they are Python or JAR files.

Regards,
Gourav

On Sat, May 15, 2021 at 5:03 AM Amit Joshi <mailtojoshia...@gmail.com>
wrote:

> Hi KhajaAsmath,
>
> Client vs cluster: in client mode the driver runs on the machine from which
> you submit your job, whereas in cluster mode the driver runs on one of the
> worker nodes.
>
> I think you need to pass the conf file to your driver, since you use it in
> the driver code, which in cluster mode runs on one of the worker nodes.
> Use these options to pass it to the driver:
> *--files /appl/common/ftp/conf.json --conf
> spark.driver.extraJavaOptions="-Dconfig.file=conf.json"*
>
> And make sure you are able to access the file location from worker nodes.
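>
> A minimal driver-side sketch of that setup (the "app_name" key below is just
> a placeholder, not something from your config): with --files, the JSON lands
> in the driver's working directory in cluster mode, so it can be opened by
> its base name before the SparkSession is built.
>
> import json
> from pyspark.sql import SparkSession
>
> # conf.json was shipped with --files, so in cluster mode it is available
> # by its base name in the driver's working directory.
> with open("conf.json") as f:
>     props = json.load(f)
>
> spark = (
>     SparkSession.builder
>     .appName(props.get("app_name", "ftp_event_data"))  # placeholder key
>     .getOrCreate()
> )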
>
>
> Regards
> Amit Joshi
>
> On Sat, May 15, 2021 at 5:14 AM KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
>
>> Here is my updated spark-submit, still without any luck:
>>
>> spark-submit --master yarn --deploy-mode cluster --files
>> /appl/common/ftp/conf.json,/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
>> --num-executors 6 --executor-cores 3 --driver-cores 3 --driver-memory 7g
>> --executor-memory 7g /appl/common/ftp/ftp_event_data.py
>> /appl/common/ftp/conf.json 2021-05-10 7
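>>
>> This is roughly how the script reads that first argument (conf_arg,
>> conf_path and props are placeholder names, not the exact ones in
>> ftp_event_data.py): in cluster mode the client-local path may not exist on
>> the worker node running the driver, while the copy shipped with --files
>> does, under its base name in the working directory.
>>
>> import json
>> import os
>> import sys
>>
>> conf_arg = sys.argv[1]  # e.g. /appl/common/ftp/conf.json
>> # Fall back to the base name if the client-local path is not visible on
>> # the node running the driver; --files places the file in the working dir.
>> conf_path = conf_arg if os.path.exists(conf_arg) else os.path.basename(conf_arg)
>> with open(conf_path) as f:
>>     props = json.load(f)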
>>
>> On Fri, May 14, 2021 at 6:19 PM KhajaAsmath Mohammed <
>> mdkhajaasm...@gmail.com> wrote:
>>
>>> Sorry, my bad, it did not resolve the issue; I still have the same
>>> problem. Can anyone please guide me? I had still been running in client
>>> mode instead of cluster mode.
>>>
>>> On Fri, May 14, 2021 at 5:05 PM KhajaAsmath Mohammed <
>>> mdkhajaasm...@gmail.com> wrote:
>>>
>>>> You are right. It worked, but I still don't understand why I need to
>>>> pass that to all the executors.
>>>>
>>>> On Fri, May 14, 2021 at 5:03 PM KhajaAsmath Mohammed <
>>>> mdkhajaasm...@gmail.com> wrote:
>>>>
>>>>> I am using the JSON only to read properties before creating the Spark
>>>>> session. I don't know why we need to pass it to all the executors.
>>>>>
>>>>>
>>>>> On Fri, May 14, 2021 at 5:01 PM Longjiang.Yang <
>>>>> longjiang.y...@target.com> wrote:
>>>>>
>>>>>> Could you check whether this file is accessible on the executors? (Is
>>>>>> it in HDFS or on the client's local FS?)
>>>>>> /appl/common/ftp/conf.json
>>>>>>
>>>>>> *From: *KhajaAsmath Mohammed <mdkhajaasm...@gmail.com>
>>>>>> *Date: *Friday, May 14, 2021 at 4:50 PM
>>>>>> *To: *"user @spark" <user@spark.apache.org>
>>>>>> *Subject: *[EXTERNAL] Urgent Help - Py Spark submit error
>>>>>>
>>>>>> /appl/common/ftp/conf.json
>>>>>>
>>>>>
