Re: Databricks - number of executors, shuffle.partitions etc
Thanks Ayan, I wasn't aware of such a user group specifically for Databricks. Thanks for the input, much appreciated!

On Wed, May 15, 2019 at 10:07 PM ayan guha wrote:
> Well, it's a Databricks question, so it's better asked in their forum.

--
Regards,
Rishi Shah
Re: Databricks - number of executors, shuffle.partitions etc
Well, it's a Databricks question, so it's better asked in their forum.

You can set cluster-level params when you create a new cluster, or add them later: go to the cluster page, open a cluster, expand the additional config section, and add your params there as key-value pairs, with each key separated from its value by a space.

On Thu, 16 May 2019 at 11:46 am, Rishi Shah wrote:
> Hi All,
> Any idea?

--
Best Regards,
Ayan Guha
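For illustration, entries in that Spark config section look like the following (the keys are standard Spark properties; the values are made-up examples, not recommendations):

```
spark.sql.shuffle.partitions 64
spark.executor.memory 8g
```

Each line is one property: the key, a single space, then the value. Settings entered here are applied when the cluster starts, which is why they take effect where a notebook-level spark.conf.set may not.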
Re: Databricks - number of executors, shuffle.partitions etc
Hi All,

Any idea?

Thanks,
-Rishi

On Tue, May 14, 2019 at 11:52 PM Rishi Shah wrote:
> How can we set spark conf parameter in databricks notebook?

--
Regards,
Rishi Shah
Databricks - number of executors, shuffle.partitions etc
Hi All,

How can we set Spark conf parameters in a Databricks notebook? My cluster doesn't take into account any spark.conf.set properties... it creates 8 worker nodes (i.e., executors) but doesn't honor the supplied conf parameters. Any idea?

--
Regards,
Rishi Shah
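The behavior described here is expected Spark behavior, not Databricks-specific: spark.conf.set in a notebook only changes session-scoped runtime settings (mostly spark.sql.* properties such as spark.sql.shuffle.partitions), while resource sizing such as executor count and memory is fixed when the cluster starts and must go in the cluster's Spark config. A minimal pure-Python sketch of sorting desired properties into those two buckets (the prefix check and the helper name `split_conf` are illustrative assumptions, not an exhaustive rule):

```python
# Split desired Spark properties into those that spark.conf.set can change at
# runtime (session-scoped, mostly spark.sql.*) and those that must be supplied
# in the cluster's Spark config before startup. The prefix heuristic is only
# an illustration; consult the Spark configuration docs for the real rules.

RUNTIME_PREFIXES = ("spark.sql.",)

def split_conf(desired: dict) -> tuple[dict, dict]:
    runtime, cluster_level = {}, {}
    for key, value in desired.items():
        bucket = runtime if key.startswith(RUNTIME_PREFIXES) else cluster_level
        bucket[key] = value
    return runtime, cluster_level

runtime, cluster_level = split_conf({
    "spark.sql.shuffle.partitions": "64",  # honored by spark.conf.set
    "spark.executor.memory": "8g",         # must be set on the cluster itself
})
```

So a spark.conf.set("spark.sql.shuffle.partitions", "64") call in a notebook cell should be honored, but executor counts will not be: those belong in the cluster configuration that Ayan describes above.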