Hi Roger,

You should be able to use the --jars argument of spark-shell to add JARs onto the classpath and then work with those classes in the shell. (A recent patch, https://github.com/apache/spark/pull/542, made spark-shell use the same command-line arguments as spark-submit.) But this is a great question; we should test it out and see whether anything else would make development easier.
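To make that concrete, here is a rough sketch (the JAR path and the WordCounter class are made-up placeholders; the only real piece is the --jars flag itself):

  // Launch the shell with your project JAR on the classpath, e.g.:
  //   ./bin/spark-shell --jars target/scala-2.10/myapp_2.10-0.1.jar
  // Inside the shell, classes packaged in that JAR can then be used interactively:
  import com.example.myapp.WordCounter       // hypothetical class from the JAR
  val lines  = sc.textFile("README.md")       // sc is the SparkContext the shell creates
  val counts = WordCounter.countWords(lines)  // call into your packaged code
  counts.take(10).foreach(println)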
SBT also has an interactive shell where you can run classes in your project, but unfortunately Spark can't deal the right way with closures typed directly into that shell. However, if you write your Spark logic in a method and just call that method from the SBT shell, that should work (see the sketch after the quoted message below).

Matei

On Apr 27, 2014, at 3:14 PM, Roger Hoover <roger.hoo...@gmail.com> wrote:

> Hi,
>
> From the meetup talk about the 1.0 release, I saw that spark-submit will be
> the preferred way to launch apps going forward.
>
> How do you recommend launching such jobs in a development cycle? For
> example, how can I load an app that's expecting to be given to spark-submit
> into spark-shell?
>
> Also, can anyone recommend other tricks for rapid development? I'm new to
> Scala, sbt, etc. I think sbt can watch for changes in source files and
> compile them automatically.
>
> I want to be able to make code changes and quickly get into a spark-shell to
> play around with them.
>
> I appreciate any advice. Thanks,
>
> Roger
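Here is a rough sketch of the method approach I mean (the object and method names are placeholders, and it assumes a plain local SparkContext created from the sbt console rather than anything specific to your project):

  // src/main/scala/com/example/myapp/WordCounter.scala (hypothetical file)
  package com.example.myapp

  import org.apache.spark.SparkContext
  import org.apache.spark.rdd.RDD

  object WordCounter {
    // Keeping the closure inside a compiled method sidesteps the problems with
    // closures typed directly into the sbt shell.
    def countWords(lines: RDD[String]): RDD[(String, Int)] =
      lines.flatMap(_.split("\\s+"))
           .map(word => (word, 1))
           .reduceByKey(_ + _)
  }

  // Then from `sbt console` you can build a local context and call it:
  //   scala> val sc = new org.apache.spark.SparkContext("local[*]", "sbt-dev")
  //   scala> com.example.myapp.WordCounter.countWords(sc.textFile("README.md")).take(5)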