Hi,

Does a Python Spark program (one that uses PySpark), submitted in cluster mode, need Python installed on the executor nodes? Isn't the Python program interpreted on the client node from which the job is submitted, with the executors then running in the JVM on each executor node?
Thank you, Ranjana