Hi,

I've set up a standalone Spark cluster, and I want several users to be able
to interact with our data at once using Spark.  I make heavy use of the
spark-shell REPL, but if a second user starts spark-shell, their session
won't run any RDD actions until the first user exits spark-shell (which is a
problem if, for example, they forget to exit when they're done).  Is there a
way around this?  I've read something about SharkServer, but I'm not sure it
solves my problem, because I don't know whether one Shark shell instance
will block all the others.
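For context, my guess is that each shell grabs every core on the cluster by
default, so nothing is left for a second application.  One thing I've
considered trying (assuming a recent release where spark-shell accepts
--conf, and with master-host standing in for our actual master address) is
capping the cores each shell takes:

```shell
# Cap this shell at 4 cores via spark.cores.max so the standalone
# scheduler has cores left over for other applications.
# "master-host" and the value 4 are placeholders, not our real setup.
spark-shell --master spark://master-host:7077 --conf spark.cores.max=4
```

I'm not sure whether that's the intended way to share a standalone cluster,
though, or whether it just trades blocking for starvation.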

Thanks,

Ryan Prenger
