Yes; more specifically, you can't request executors via SparkConf like that
once the application has started. You set the executor count when you launch
the application against a Spark cluster, via spark-submit or otherwise.
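
For example, a minimal sketch of fixing the executor count at launch time
(the resource values and application file are illustrative, and
--num-executors applies when running against YARN):

  spark-submit \
    --master yarn \
    --num-executors 4 \
    --executor-cores 2 \
    --executor-memory 4g \
    my_app.py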

On Tue, Mar 21, 2023 at 4:23 AM Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Hi Emmanouil,
>
> This means that your job is running on the driver as a single JVM, hence
> active(1)
>
>
