Hi
I am currently using Akka HTTP to send requests to multiple Spark actors
that share a preloaded SparkContext and the fair scheduler. It's only a
prototype and I haven't load-tested the concurrency yet, but it seems like
one of the right ways to do it. Complete processing time is around 600 ms.
The other option would be to use a Spark job server, but I'd rather not
split my REST API in two (a business part in Akka HTTP and a technical
part in the job server).
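For reference, here is a minimal sketch of the kind of setup I described,
assuming Spark 2.x, Akka HTTP 10.0 and a local master; the /count route and
the pool name are made up for illustration, and the actor layer is left out
for brevity:

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer
import org.apache.spark.sql.SparkSession

import scala.concurrent.Future

object SparkRestPrototype extends App {
  implicit val system = ActorSystem("spark-rest")
  implicit val materializer = ActorMaterializer()
  implicit val ec = system.dispatcher

  // One SparkSession created at startup and reused by every request,
  // so there is no per-request context startup cost.
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("akka-http-spark-prototype")
    .config("spark.scheduler.mode", "FAIR") // fair scheduler instead of FIFO
    .getOrCreate()

  val route =
    path("count") {
      get {
        complete {
          // Run the Spark job off the HTTP dispatcher; concurrent requests
          // become concurrent jobs that the fair scheduler interleaves.
          Future {
            spark.sparkContext.setLocalProperty("spark.scheduler.pool", "default")
            val n = spark.range(0L, 1000000L).count()
            s"count = $n\n"
          }
        }
      }
    }

  Http().bindAndHandle(route, "0.0.0.0", 8080)
}

The point is that the SparkSession is built once at startup, so each request
only pays for its own job, and spark.scheduler.mode=FAIR lets concurrent
jobs share the executors instead of queueing up FIFO.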

On Nov 2, 2016 8:34 AM, "Fanjin Zeng" <fj.z...@yahoo.com.invalid> wrote:

>  Hi,
>
>  I am working on a project that takes requests from an HTTP server and
> computes accordingly on Spark. The problem is that when I receive many
> requests at the same time, users waste a lot of time on the unnecessary
> startup that occurs on each request. Does Spark have a built-in job
> scheduling function to solve this problem, or is there any trick that can
> be used to avoid these unnecessary startups?
>
>  Thanks,
>  Fanjin
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
