I have found a source that explains how Spark compiles code typed in the
REPL and dynamically loads it into the distributed executors:
https://ardoris.wordpress.com/2014/03/30/how-spark-does-class-loading/
If you run the Spark REPL, you can inspect the Spark configuration like this:
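A minimal sketch, assuming the SparkContext that spark-shell creates is
bound to sc (as it is by default):

    // Dump every Spark configuration entry currently in effect;
    // sc is the SparkContext that spark-shell provides.
    println(sc.getConf.toDebugString)

In my shell the output includes an entry like spark.repl.class.uri, which
(per the post above) is the endpoint executors use to fetch the classes the
REPL compiles on the driver; the exact property name may differ across
Spark versions.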
Hi,
I have a plan to program functions similar to those in spark-shell.
When spark-shell is run on YARN, it seems that the spark-shell application
is submitted to YARN in yarn-client mode.
I am curious: when Scala code is typed into spark-shell, how is that code
compiled and shipped to the distributed executors?
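For concreteness, here is a minimal spark-shell session sketch (the
function name and values are illustrative) showing that a function defined
interactively ends up running on remote executors, which is exactly where
the dynamic class loading described above happens:

    // Each line typed here is compiled by the REPL into a class on the driver.
    def addOne(x: Int): Int = x + 1

    // Executors fetch the REPL-generated class for addOne before running
    // the map tasks, then apply it to their partitions.
    val rdd = sc.parallelize(1 to 10)
    rdd.map(addOne).collect()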