RE: used cores are less than total no. of cores

2015-02-25 Thread Somnath Pandeya
Thanks Akhil, it was a simple fix you suggested; I missed it. ☺

From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, February 25, 2015 12:48 PM
To: Somnath Pandeya
Cc: user@spark.apache.org
Subject: Re: used cores are less than total no. of cores

You can set the following in

Re: used cores are less than total no. of cores

2015-02-24 Thread Akhil Das
You can set the following in the conf while creating the SparkContext (if you are not using spark-submit):

.set("spark.cores.max", "32")

Thanks
Best Regards

On Wed, Feb 25, 2015 at 11:52 AM, Somnath Pandeya <somnath_pand...@infosys.com> wrote:
> Hi All,
>
> I am running a simple word co
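A minimal sketch of Akhil's suggestion in Spark's Scala API. The `spark.cores.max` key is the real standalone-mode setting; the app name and master URL below are placeholders, not values from the thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Cap the total number of cores this application may claim across the
// whole standalone cluster. Without this cap (or a cluster-wide
// spark.deploy.defaultCores), a standalone app can grab all free cores.
val conf = new SparkConf()
  .setAppName("WordCount")                 // placeholder app name
  .setMaster("spark://master-host:7077")   // placeholder master URL
  .set("spark.cores.max", "32")            // request up to 32 cores in total

val sc = new SparkContext(conf)
```

Setting this on the SparkConf only applies when the context is created in your own driver program; with spark-submit the equivalent is `--conf spark.cores.max=32`.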

Re: used cores are less than total no. of cores

2015-02-24 Thread VISHNU SUBRAMANIAN
Try adding --total-executor-cores 5, where 5 is the number of cores.

Thanks,
Vishnu

On Wed, Feb 25, 2015 at 11:52 AM, Somnath Pandeya <somnath_pand...@infosys.com> wrote:
> Hi All,
>
> I am running a simple word count example of spark (standalone cluster),
> In the UI it is showing
>
> F
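The same cap can be applied from the command line when launching with spark-submit. This is a hedged sketch for a standalone cluster: `--total-executor-cores` is a real spark-submit flag (standalone and Mesos modes), while the master URL, class name, and jar are placeholders:

```shell
# --total-executor-cores limits the cores the application takes across
# all of its executors combined; adjust the number to what you want used.
spark-submit \
  --master spark://master-host:7077 \
  --class com.example.WordCount \
  --total-executor-cores 32 \
  wordcount.jar
```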

used cores are less than total no. of cores

2015-02-24 Thread Somnath Pandeya
Hi All,

I am running a simple word count example on Spark (standalone cluster). In the UI, each worker shows 32 available cores, but while running the jobs only 5 cores are being used. What should I do to increase the number of cores used, or is this selected per job?

Thanks