Hi

On Mon, Oct 7, 2019 at 19:20 Amit Sharma <resolve...@gmail.com> wrote:

> Thanks Andrew, but I am asking specifically about driver memory, not
> executor memory. We have just one master, and if each job's driver.memory=4g
> and the master node's total memory is 16 GB, then we cannot execute more
> than 4 jobs at a time.
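
(As a sanity check on the arithmetic quoted above — a quick sketch, assuming
the 16 GB and 4 GB figures from the message and ignoring OS and other
overhead:)

```shell
# Assumed figures: 16 GB total on the master, 4 GB per driver JVM.
# OS and other overhead are ignored, so the real ceiling is a bit lower.
total_master_memory_gb=16
driver_memory_gb=4
max_concurrent_jobs=$(( total_master_memory_gb / driver_memory_gb ))
echo "$max_concurrent_jobs"   # prints 4
```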


I understand that. I think there's a misunderstanding with the terminology,
though. Are you running multiple separate Spark instances on a single
machine, or one instance with multiple jobs inside it?
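
For context, each job's driver memory is set at submit time, and the deploy
mode controls where that driver JVM lives. A sketch (the master host, class
name, and jar name below are placeholders):

```shell
# Each submitted job gets its own driver JVM, sized by --driver-memory.
# In "client" deploy mode the driver runs on the machine that invokes
# spark-submit; "cluster" mode places it on a worker instead, which
# relieves memory pressure on the master.
# (master-host, com.example.MyJob, and my-job.jar are placeholders.)
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --driver-memory 4g \
  --class com.example.MyJob \
  my-job.jar
```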


>
> On Monday, October 7, 2019, Andrew Melo <andrew.m...@gmail.com> wrote:
>
>> Hi Amit
>>
>> On Mon, Oct 7, 2019 at 18:33 Amit Sharma <resolve...@gmail.com> wrote:
>>
>>> Can you please help me understand this. I believe the driver program runs
>>> on the master node.
>>
>>> If we are running 4 Spark jobs and the driver memory config is 4g, then a
>>> total of 16 GB would be used on the master node.
>>
>>
>> This depends on what master/deploy mode you're using: if it's the "local"
>> master and "client" deploy mode, then yes, tasks execute in the same JVM as
>> the driver. Even in that case, though, the driver JVM uses however much
>> space is allocated for the driver, regardless of how many threads you have.
>>
>>
>>> So if we run more jobs, then we need more memory on the master. Please
>>> correct me if I am wrong.
>>>
>>
>> This depends on your application, but in general more threads will
>> require more memory.
>>
>>
>>
>>>
>>> Thanks
>>> Amit
>>>
>> --
>> It's dark in this basement.
>>
--
It's dark in this basement.
