I'm actually surprised your memory is that high. Spark only allocates
spark.storage.memoryFraction of each executor's heap for storing RDDs. This
defaults to 0.6, so 32 GB * 0.6 * 10 executors should come to a total of
192 GB.
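
Just to make that arithmetic concrete, here is a rough sketch in plain Scala
(the numbers are simply the ones from this thread, nothing here is measured):

val executorMemoryGb = 32.0      // --executor-memory 32g
val numExecutors     = 10        // executors reported in the UI
val memoryFraction   = 0.6       // spark.storage.memoryFraction default

val storagePerExecutorGb = executorMemoryGb * memoryFraction    // 19.2 GB
val expectedTotalGb      = storagePerExecutorGb * numExecutors  // 192 GB
println(f"Expected storage memory in the UI: $expectedTotalGb%.1f GB")

If you want a different share of the heap available for caching, you can
override spark.storage.memoryFraction yourself, e.g. via --conf on the
spark-shell command line or in spark-defaults.conf, depending on your Spark
version.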

-Sandy

On Sat, Sep 20, 2014 at 8:21 AM, Soumya Simanta <soumya.sima...@gmail.com>
wrote:

> There are 128 cores on each box. Yes, there are other applications running on
> the cluster. YARN is assigning two containers to my application. I'll
> investigate this a little more. PS: I'm new to YARN.
>
>
>
> On Fri, Sep 19, 2014 at 4:49 PM, Vipul Pandey <vipan...@gmail.com> wrote:
>
>> How many cores do you have in your boxes?
>> It looks like you are assigning 32 cores "per" executor - is that what you
>> want? Are there other applications running on the cluster? You might want
>> to check the YARN UI to see how many containers are getting allocated to
>> your application.
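>> For a standalone app (as opposed to the shell) you can also pin these knobs
>> through SparkConf rather than command-line flags. Just a sketch - the 4
>> cores below is a made-up number, not a recommendation, and exact property
>> support depends on your Spark version:
>>
>> import org.apache.spark.{SparkConf, SparkContext}
>>
>> val conf = new SparkConf()
>>   .setMaster("yarn-client")                 // same as --master yarn-client
>>   .setAppName("sizing-example")             // hypothetical app name
>>   .set("spark.executor.memory", "32g")      // same as --executor-memory 32g
>>   .set("spark.executor.cores", "4")         // instead of 32 cores per executor
>>   .set("spark.executor.instances", "8")     // --num-executors equivalent on YARN
>> val sc = new SparkContext(conf)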
>>
>>
>> On Sep 19, 2014, at 1:37 PM, Soumya Simanta <soumya.sima...@gmail.com>
>> wrote:
>>
>> I'm launching a Spark shell with the following parameters
>>
>> ./spark-shell --master yarn-client --executor-memory 32g --driver-memory
>> 4g --executor-cores 32 --num-executors 8
>>
>> but when I look at the Spark UI it shows only 209.3 GB total memory.
>>
>>
>> Executors (10)
>>
>>    - *Memory:* 55.9 GB Used (209.3 GB Total)
>>
>> This is a 10-node YARN cluster where each node has 48 GB of memory.
>>
>> Any idea what I'm missing here?
>>
>> Thanks
>> -Soumya
