Hi,

What's a typical workflow for Spark application development in Scala?

One option is to write a Scala application with a main function and
re-execute the app after every development change. Given the overhead of
loading even a moderately sized development dataset on each run, this can
mean slow iterations.
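To make the first option concrete, this is roughly what I mean by the
"standalone app" workflow. A minimal sketch, assuming the SparkSession API
(Spark 2.x+); the object name, input path, and column are illustrative:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical app: every code change means re-running the whole pipeline,
// including re-reading the development data from scratch.
object DevApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dev-iteration")
      .getOrCreate()

    // Illustrative path and transformation.
    val df = spark.read.parquet("dev-data.parquet")
    df.groupBy("key").count().show()

    spark.stop()
  }
}
```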

Another option is to initialize the data once in the REPL and keep the
development inside the REPL. This would mean faster development iterations;
however, it's not clear to me how to keep the code in sync with the REPL. Do
you just copy/paste the code into the REPL, or is it possible to compile the
code into a jar and keep reloading the jar in the REPL?
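For the jar variant, something like the following is what I have in mind
(paths and artifact names are illustrative; `--jars` is a spark-shell flag,
and `:load` / `:require` are standard Scala REPL commands, as I understand
them):

```shell
# Start the REPL with the compiled code on the classpath.
spark-shell --jars target/scala-2.12/myapp_2.12-0.1.jar

# Inside the REPL session:
#   :load src/main/scala/Transforms.scala            # re-evaluate a source file
#   :require target/scala-2.12/myapp_2.12-0.1.jar    # add a rebuilt jar to the classpath
```

My understanding is that `:require` only adds a jar and cannot replace
classes already loaded in the session, so picking up a rebuilt jar may still
require restarting the shell, whereas `:load` re-evaluates source and is
closer to true reloading. Is that right?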

Any other ways of doing this?
