You may have to recreate your cluster with the configuration below at EMR creation:
    "Configurations": [
            {
                "Properties": {
                    "maximizeResourceAllocation": "false"
                },
                "Classification": "spark"
            }
        ]
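
If it helps, here is a minimal sketch of passing that configuration at cluster creation with the AWS CLI. The release label, instance type, and instance count are placeholder values, not taken from this thread; the --configurations JSON is the part that matters:

    # Placeholder cluster parameters -- adjust for your setup
    aws emr create-cluster \
        --name "spark-cluster" \
        --release-label emr-5.11.0 \
        --applications Name=Spark \
        --instance-type m4.xlarge \
        --instance-count 3 \
        --use-default-roles \
        --configurations '[{"Classification":"spark","Properties":{"maximizeResourceAllocation":"false"}}]'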

On Fri, Dec 29, 2017 at 11:57 PM, Jeroen Miller <bluedasya...@gmail.com>
wrote:

> On 28 Dec 2017, at 19:25, Patrick Alwell <palw...@hortonworks.com> wrote:
> > Dynamic allocation is great; but sometimes I’ve found explicitly setting
> the num executors, cores per executor, and memory per executor to be a
> better alternative.
>
> No difference with spark.dynamicAllocation.enabled set to false.
>
> JM
>
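
For completeness, the explicit settings Patrick mentions map to the standard spark-submit options below. This is only a sketch; the numbers and the app.py name are placeholders to illustrate the flags, not recommendations:

    # Illustrative values only -- size these to your instance types
    spark-submit \
        --master yarn \
        --conf spark.dynamicAllocation.enabled=false \
        --num-executors 10 \
        --executor-cores 4 \
        --executor-memory 8g \
        app.py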
