Hi,

I am planning to program functions like those you can define interactively
in spark-shell. When spark-shell is run against YARN, it seems that the
spark-shell application is submitted to YARN in yarn client mode.
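
For context, what I have in mind is roughly the programmatic equivalent
of the session spark-shell builds. Here is a minimal sketch (the app and
object names are just placeholders):

    import org.apache.spark.sql.SparkSession

    object ShellLikeDriver {
      def main(args: Array[String]): Unit = {
        // spark-shell only supports client deploy mode on YARN, so the
        // driver (and the REPL) runs locally while the executors run in
        // YARN containers.
        val spark = SparkSession.builder()
          .appName("shell-like-driver")                 // placeholder
          .master("yarn")                               // submit to YARN
          .config("spark.submit.deployMode", "client")
          .getOrCreate()

        // ... dynamically compiled functions would be defined and
        // used here ...

        spark.stop()
      }
    }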

I am curious how the Scala code typed into spark-shell is compiled
dynamically, and how the compiled classes are loaded onto the classloaders
of the currently running YARN executors.
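
My current guess, sketched with the plain Scala REPL API rather than
Spark's actual internals (this is exactly the part I would like to have
confirmed), is that each typed line is compiled to class files in an
output directory, which the executors then load remotely, perhaps via
something like Spark's ExecutorClassLoader. Assuming Scala 2.12 and an
example output path:

    import scala.tools.nsc.Settings
    import scala.tools.nsc.interpreter.IMain

    val settings = new Settings
    settings.usejavacp.value = true
    // write REPL-generated class files to a real directory
    // ("/tmp/repl-classes" is just an example path)
    settings.Yreploutdir.value = "/tmp/repl-classes"

    val interp = new IMain(settings)
    interp.interpret("case class Point(x: Int, y: Int)")
    // /tmp/repl-classes now holds the compiled class files, wrapped
    // in REPL line objects such as $line3.$read$...

If that is roughly right, I assume the driver serves that directory to
the executors, whose classloaders fetch class bytes on demand, but I
don't know the details.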

Cheers,

- Kidong Lee
