Fellow Sparkers,
I'm rather puzzled by the submitJob API. I can't quite figure out how it is
supposed to be used. Is there any more documentation about it?
Also, is there any simpler way to multiplex jobs on the cluster, such as
starting multiple computations in as many threads in the driver and ...
Hi Alex,
SparkContext.submitJob() is marked as experimental -- most client programs
shouldn't be using it. What are you looking to do?
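For what it's worth, here is a rough sketch of how submitJob gets invoked (the app name, local master, and RDD are illustrative; this mirrors what Spark's own async actions do internally, and as noted above, most applications shouldn't need it):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.Await
import scala.concurrent.duration.Duration

val sc = new SparkContext(
  new SparkConf().setAppName("submitJob-sketch").setMaster("local[2]"))

val rdd = sc.parallelize(1 to 100, 4)
val partialSums = new Array[Int](rdd.partitions.length)

// submitJob returns a FutureAction immediately; the job runs
// asynchronously and resultHandler is called once per finished partition.
val future = sc.submitJob(
  rdd,
  (it: Iterator[Int]) => it.sum,                  // run on each partition
  0 until rdd.partitions.length,                  // partitions to compute
  (index: Int, s: Int) => partialSums(index) = s, // collect partial results
  partialSums.sum                                 // by-name overall result
)

val total = Await.result(future, Duration.Inf)    // 5050 for 1..100
sc.stop()
```

The resultFunc argument is by-name, so `partialSums.sum` is only evaluated once every partition's handler has run.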
For multiplexing jobs, one thing you can do is have multiple threads in
your client JVM each submit jobs on your shared SparkContext. This is
described in the Spark docs under "Job Scheduling" (the "Scheduling Within
an Application" section).
Andrew,
Thanks, yes, this is what I wanted: basically just to start multiple jobs
concurrently in threads.
Alex
On Mon, Dec 22, 2014 at 4:04 PM, Andrew Ash and...@andrewash.com wrote:
Hi Alex,
SparkContext.submitJob() is marked as experimental -- most client programs
shouldn't be using it.
A SparkContext is thread safe, so you can just have different threads
that create their own RDDs and run actions, etc.
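Concretely, that pattern looks like this with Scala futures (a sketch; the app name, local master, and the trivial RDDs are placeholders), where each future triggers an independent Spark job on the shared context:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

val sc = new SparkContext(
  new SparkConf().setAppName("concurrent-jobs").setMaster("local[*]"))

// Each Future runs on its own thread; SparkContext is thread safe,
// so the two actions below run as concurrent Spark jobs.
val jobA = Future { sc.parallelize(1 to 1000).map(_ * 2).sum() }
val jobB = Future { sc.parallelize(1 to 1000).filter(_ % 2 == 0).count() }

val sumA   = Await.result(jobA, Duration.Inf)  // 1001000.0
val countB = Await.result(jobB, Duration.Inf)  // 500
sc.stop()
```

By default the jobs share the cluster FIFO-style; the fair scheduler (spark.scheduler.mode=FAIR) can be enabled if you want them to share resources more evenly.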
- Patrick
On Mon, Dec 22, 2014 at 4:15 PM, Alessandro Baretta
alexbare...@gmail.com wrote: