Hi,

I'd like to create a custom version of the Spark shell that automatically
defines some additional variables / RDDs (in addition to 'sc') specific to
our application.  Is this possible?
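To make that concrete, here is the kind of thing I'd like to have defined
already when the shell starts (the names and paths below are only
illustrative):

    // Illustrative only -- application-specific values I'd like pre-defined
    // alongside 'sc' when the shell comes up:
    val users  = sc.textFile("hdfs:///wellmo/users")
    val events = sc.textFile("hdfs:///wellmo/events")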

I took a look at the code that spark-shell invokes, and it seems quite
complex.  Can it be reused from my own code?
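For reference, this is roughly the embedding I have in mind from the
standalone application.  It's only a sketch using the plain Scala REPL
(scala.tools.nsc.interpreter.ILoop) rather than Spark's own REPL classes,
and I haven't checked whether the classpath / class-loader setup actually
works with Spark:

    import scala.tools.nsc.Settings
    import scala.tools.nsc.interpreter.ILoop
    import org.apache.spark.{SparkConf, SparkContext}

    object CustomShell {
      def main(args: Array[String]): Unit = {
        // Build the application-specific context up front
        // ("local[2]" is just for local experimentation)
        val sc = new SparkContext(
          new SparkConf().setAppName("custom-shell").setMaster("local[2]"))

        val settings = new Settings
        settings.usejavacp.value = true   // expose the app's classpath to the REPL

        val repl = new ILoop {
          override def createInterpreter(): Unit = {
            super.createInterpreter()
            // Make our pre-built objects visible inside the shell
            intp.bind("sc", "org.apache.spark.SparkContext", sc)
          }
        }
        repl.process(settings)            // blocks until the user quits the shell
        sc.stop()
      }
    }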


I'm implementing a standalone application that uses the Spark libraries
(managed by SBT).  Ideally, I'd like to be able to launch the shell from
that application, instead of using the default Spark distribution.
Alternatively, could some utility code be injected into the standard
spark-shell?
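If embedding isn't practical, my fallback idea is to put definitions like
the ones sketched above into a plain script and preload it into the standard
shell.  A rough example, assuming spark-shell forwards the Scala REPL's -i
option (or that :load behaves the same way):

    // wellmo-init.scala (file name is made up)
    // Preload with:      ./bin/spark-shell -i wellmo-init.scala
    // or interactively:  scala> :load wellmo-init.scala
    val events = sc.textFile("hdfs:///wellmo/events")      // hypothetical path
    def eventsForUser(id: String) = events.filter(_.startsWith(id))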


Thanks.

Sampo Niskanen
Lead developer / Wellmo
