You can look at spark.streaming.concurrentJobs. By default it runs a
single job; if you set it to 2, it can run 2 jobs in parallel. It's an
experimental flag, but go ahead and give it a try.
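
For reference, a minimal sketch of setting the flag (assuming Spark
Streaming 1.x; the app name and the 10-second batch interval are made up
for illustration):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  val conf = new SparkConf()
    .setAppName("ConcurrentJobsExample") // hypothetical app name
    // Experimental: allow up to 2 streaming jobs to run concurrently
    // (the default is 1).
    .set("spark.streaming.concurrentJobs", "2")

  val ssc = new StreamingContext(conf, Seconds(10)) // assumed batch interval
  // ... define your DStreams as usual, then:
  // ssc.start()
  // ssc.awaitTermination()

Note that running jobs concurrently can cause batches to be processed
out of order, so test whether that matters for your workload.
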
On Aug 21, 2015 3:36 AM, "Sateesh Kavuri" <sateesh.kav...@gmail.com> wrote:

> Hi,
>
> My scenario goes like this:
> I have an algorithm running in Spark Streaming mode on a 4-core virtual
> machine. The majority of the time, the algorithm does disk I/O and
> database I/O. The question is: during the I/O, when the CPU is not
> heavily loaded, is it possible to run another task/thread so as to
> utilize the CPU efficiently?
>
> Note that one DStream of the algorithm runs completely on a single CPU.
>
> Thank you,
> Sateesh
>
