Re: Num of executors and cores

2016-07-26 Thread Mail.com
On Tue, Jul 26, 2016 at 2:18 AM, Mail.com <pradeep.mi...@mail.com> wrote: > Hi All, > I have a directory which has 12 files. I want to read the entire file so I am reading it as wholeTextFiles(dirpath, numPartitions) ...

Re: Num of executors and cores

2016-07-26 Thread Jacek Laskowski
... <pradeep.mi...@mail.com> wrote: > Hi All, > I have a directory which has 12 files. I want to read the entire file so I am reading it as wholeTextFiles(dirpath, numPartitions). > I run spark-submit as --num-executors 12 --executor-cores 1 ...

Re: Num of executors and cores

2016-07-26 Thread Mail.com
> ... I have a directory which has 12 files. I want to read the entire file so I am reading it as wholeTextFiles(dirpath, numPartitions). > I run spark-submit as --num-executors 12 --executor-cores 1 and numPartitions 12. > However, when I run the job ...

Re: Num of executors and cores

2016-07-26 Thread Jacek Laskowski
... <pradeep.mi...@mail.com> wrote: > Hi All, > I have a directory which has 12 files. I want to read the entire file so I am reading it as wholeTextFiles(dirpath, numPartitions). > I run spark-submit as --num-executors 12 --executor-cores 1 and numPartitions 12. > However, when ...

Num of executors and cores

2016-07-25 Thread Mail.com
Hi All, I have a directory which has 12 files. I want to read the entire file so I am reading it as wholeTextFiles(dirpath, numPartitions). I run spark-submit as --num-executors 12 --executor-cores 1 and numPartitions 12. However, when I run the job I see that the stage which reads ...
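
A rough sketch of the setup being described, for reference (dirpath is a placeholder and sc is the SparkContext provided by spark-shell): the second argument to wholeTextFiles is only a minimum-partitions hint, and small files can be combined into fewer partitions than the number of files, so a 12-file directory does not necessarily produce 12 tasks.

  // spark-shell / Scala
  val dirpath = "/data/input12"                  // hypothetical directory holding the 12 files
  val files = sc.wholeTextFiles(dirpath, 12)     // RDD[(fileName, fileContent)]; 12 is a hint, not a guarantee
  println(files.getNumPartitions)                // may print fewer than 12 if the files are small
  val spread = files.repartition(12)             // one way to spread the work across 12 one-core executors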

Re: Executors and Cores

2016-05-16 Thread Mich Talebzadeh
http://talebzadehmich.wordpress.com On 15 May 2016 at 13:19, Mail.com <pradeep.mi...@mail.com> wrote: > Hi, > I have seen multiple videos on spark tuning which show how to determine # cores, #executors and memory size of the job. > In all that I have seen, it seems each job has to be given the max resources allowed in the cluster. ...

Re: Executors and Cores

2016-05-15 Thread Mail.com
> http://talebzadehmich.wordpress.com >> On 15 May 2016 at 13:19, Mail.com <pradeep.mi...@mail.com> wrote: >> Hi, >> I have seen multiple videos on spark tuning which show how to determine # cores, #executors and memory size of the job. ...

Re: Executors and Cores

2016-05-15 Thread Jacek Laskowski
On Sun, May 15, 2016 at 8:19 AM, Mail.com wrote: > In all that I have seen, it seems each job has to be given the max resources allowed in the cluster. Hi, I'm fairly sure it was because FIFO scheduling mode was used. You could change it to FAIR and make some ...
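
For reference, switching job scheduling from the default FIFO to FAIR is a configuration change; it controls how jobs inside a single Spark application share that application's executors. A minimal sketch follows: spark.scheduler.mode and spark.scheduler.allocation.file are standard Spark settings, but the pool definition below is only an illustrative example.

  # conf/spark-defaults.conf (or pass with --conf on spark-submit)
  spark.scheduler.mode              FAIR
  spark.scheduler.allocation.file   /path/to/fairscheduler.xml

  <!-- fairscheduler.xml: a hypothetical pool layout -->
  <allocations>
    <pool name="default">
      <schedulingMode>FAIR</schedulingMode>
      <weight>1</weight>
      <minShare>2</minShare>
    </pool>
  </allocations>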

Re: Executors and Cores

2016-05-15 Thread Mich Talebzadeh
http://talebzadehmich.wordpress.com On 15 May 2016 at 13:19, Mail.com <pradeep.mi...@mail.com> wrote: > Hi, > I have seen multiple videos on spark tuning which show how to determine # cores, #executors and memory size of the job. > In all that I have seen ...

Re: Executors and Cores

2016-05-15 Thread Ted Yu
For the last question, have you looked at: https://spark.apache.org/docs/latest/configuration.html#dynamic-allocation FYI. On Sun, May 15, 2016 at 5:19 AM, Mail.com <pradeep.mi...@mail.com> wrote: > Hi, > I have seen multiple videos on spark tuning which show how to determine ...
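
The linked page documents dynamic allocation, which lets Spark grow and shrink the executor count with the workload instead of fixing it per job. A minimal sketch of enabling it is below; the property names are real Spark settings, but the min/max counts, executor sizes, application class and jar are placeholders, and the external shuffle service must be set up on the cluster.

  spark-submit \
    --conf spark.dynamicAllocation.enabled=true \
    --conf spark.shuffle.service.enabled=true \
    --conf spark.dynamicAllocation.minExecutors=2 \
    --conf spark.dynamicAllocation.maxExecutors=12 \
    --executor-cores 1 \
    --executor-memory 2g \
    --class com.example.MyApp myapp.jar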

Executors and Cores

2016-05-15 Thread Mail.com
Hi, I have seen multiple videos on spark tuning which show how to determine # cores, #executors and memory size of the job. In all that I have seen, it seems each job has to be given the max resources allowed in the cluster. How do we factor in input size as well? I am processing a 1gb ...
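
As a rough, illustrative answer to the sizing question (all numbers are assumptions, not recommendations from this thread): input size mostly drives the number of tasks, e.g. about 1 GB split into 128 MB blocks yields roughly 8 input partitions, so a handful of modest executors is usually enough rather than the cluster maximum.

  # illustrative only: ~1 GB of input (~8 x 128 MB splits -> ~8 tasks); class and jar are placeholders
  spark-submit \
    --num-executors 4 \
    --executor-cores 2 \
    --executor-memory 4g \
    --class com.example.MyApp myapp.jar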