Hello list!

I am trying to familiarize myself with Apache Spark, and I would like to ask 
something about partitioning and executors. 

Can I have, e.g., 500 partitions but launch only one executor that runs 
operations on just 1 of the 500 partitions, and then have the job terminate? 

Is there an easy way to do this, or do I have to modify my code to achieve it?
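To make the question concrete, here is a sketch of the kind of thing I am after (the RDD and its contents are just placeholders; `mapPartitionsWithIndex` is the closest API I have found so far):

```scala
import org.apache.spark.sql.SparkSession

object SinglePartitionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("one-partition").getOrCreate()
    val sc = spark.sparkContext

    // Placeholder RDD with 500 partitions.
    val rdd = sc.parallelize(1 to 100000, numSlices = 500)

    // Keep only partition 0; every other partition yields nothing.
    val onePartition = rdd.mapPartitionsWithIndex { (idx, iter) =>
      if (idx == 0) iter else Iterator.empty
    }
    println(onePartition.count())

    spark.stop() // then let the application exit
  }
}
```

submitted with something like `spark-submit --num-executors 1 --executor-cores 1 ...` to force a single executor. My understanding (please correct me) is that this still schedules a task for all 500 partitions, just with empty work in 499 of them, whereas `sc.runJob(rdd, func, Seq(0))` would schedule a task for partition 0 only. Is one of these the intended way, or is there something simpler?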

Thank you,
