Re: Spark dynamic allocation with special executor configuration

2019-02-25 Thread Abdeali Kothari
Yes, it will. In general, Spark should spawn as many executors as it can to use up all the resources on a node.
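
A rough worked example of what "use up all the resources on a node" means in practice; the node size and executor settings below are illustrative assumptions, not figures from the thread:

    // Hypothetical worker node with 16 cores and 64 GB of RAM.
    // If each executor is configured with spark.executor.cores=4 and
    // spark.executor.memory=14g (plus overhead), the cluster manager can
    // place roughly 4 executors on that node, and dynamic allocation keeps
    // requesting executors while tasks are pending and free resources remain.
    val coresPerNode = 16
    val coresPerExecutor = 4
    val executorsPerNode = coresPerNode / coresPerExecutor  // 4 executors fill the node's cores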

Spark dynamic allocation with special executor configuration

2019-02-25 Thread Anton Puzanov
Hello everyone, Spark has a dynamic resource allocation scheme where, when resources are available, the Spark manager will automatically add executors to the application's resources. Spark's default configuration is for executors to allocate the entire worker node they are running on, but this is configurable, my
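
For reference, executor sizing and dynamic allocation are configured independently; a minimal sketch in Scala, where the property keys are Spark's standard ones but the values are purely illustrative:

    import org.apache.spark.sql.SparkSession

    // Illustrative settings: several smaller executors can share one worker
    // node instead of a single executor claiming the whole node, while
    // dynamic allocation still decides how many executors to request.
    val spark = SparkSession.builder()
      .appName("dynamic-allocation-with-sized-executors")
      .config("spark.executor.cores", "4")                  // cores per executor, not per node
      .config("spark.executor.memory", "8g")                // memory per executor
      .config("spark.dynamicAllocation.enabled", "true")
      .config("spark.dynamicAllocation.minExecutors", "1")
      .config("spark.dynamicAllocation.maxExecutors", "20") // cap on how many executors may be added
      .config("spark.shuffle.service.enabled", "true")      // typically required for dynamic allocation
      .getOrCreate()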

Re: Spark 2.4 partitions and tasks

2019-02-25 Thread Pedro Tuero
Good question. From what I have read, Spark is not a magician and can't know how many tasks will be best for your input, so it can get it wrong. Spark sets the default parallelism to twice the number of cores in the cluster. In my jobs, it seemed that using the parallelism inherited from input
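
A minimal sketch of checking the derived parallelism and overriding the partitioning inherited from the input; the input path and the multiplier are hypothetical:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("parallelism-check").getOrCreate()
    val sc = spark.sparkContext

    // Inspect what Spark actually derived for default parallelism on this cluster.
    println(s"defaultParallelism = ${sc.defaultParallelism}")

    // If the partitioning inherited from the input is a poor fit, override it
    // explicitly before an expensive stage.
    val data = sc.textFile("hdfs:///path/to/input")
    val repartitioned = data.repartition(sc.defaultParallelism * 2)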