In Spark 0.9 and master, you can pass the -i argument to spark-shell to load a 
script containing commands before opening the prompt. This is also a feature of 
the Scala shell as a whole (try scala -help for details).
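
For instance, with a script like this (the file name, path, and variable names below are just placeholders; adapt them to your application):

    // init.scala -- evaluated before the prompt appears, as if typed into the shell
    // 'sc' is the SparkContext that spark-shell has already created for you
    val events = sc.textFile("/data/events.txt")   // an extra RDD alongside sc
    val eventCount = events.count()                // any other setup you want ready at the prompt

you can launch the shell with:

    ./bin/spark-shell -i init.scala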

Also, once you’re in the shell, you can use :load file.scala to execute the 
content of file.scala as if you’d typed it into the shell.
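
For example, from within a running shell (again, the file name is just illustrative):

    scala> :load init.scala

which is handy if you only decide what to load after the shell is already up.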

Matei

On Feb 25, 2014, at 11:44 PM, Sampo Niskanen <sampo.niska...@wellmo.com> wrote:

> Hi,
> 
> I'd like to create a custom version of the Spark shell that automatically 
> defines some other variables / RDDs (in addition to 'sc') specific to our 
> application.  Is this possible?
> 
> I took a look at the code that spark-shell invokes, and it seems quite 
> complex.  Can it be reused from my own code?
> 
> 
> I'm implementing a standalone application that uses the Spark libraries 
> (managed by SBT).  Ideally, I'd like to be able to launch the shell from that 
> application, instead of using the default Spark distribution.  Alternatively, 
> can some utility code be injected within the standard spark-shell?
> 
> 
> Thanks.
> 
>     Sampo Niskanen
>     Lead developer / Wellmo
> 
> 
