[ https://issues.apache.org/jira/browse/SPARK-10923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14943342#comment-14943342 ]

Tarek Abouzeid commented on SPARK-10923:
----------------------------------------

Oh, sorry. Thanks!

> Spark handling parallel requests 
> ---------------------------------
>
>                 Key: SPARK-10923
>                 URL: https://issues.apache.org/jira/browse/SPARK-10923
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 1.3.0
>         Environment: Cent os 6 
>            Reporter: Tarek Abouzeid
>            Priority: Trivial
>              Labels: parallel, scala, spark-submit
>
> Hi,
> I am using Scala to write a socket program that catches multiple requests at 
> the same time and then calls a function that uses Spark to handle each one. I 
> have a multi-threaded server that accepts the requests and passes each to 
> Spark, but there is a bottleneck: Spark does not launch a separate sub-task 
> for each new request. Is it even possible to do parallel processing within a 
> single Spark job?
> Best Regards, 
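
For what it's worth, a single SparkContext is safe to share across threads, so each server thread can submit its own Spark job concurrently (optionally with spark.scheduler.mode=FAIR so the jobs share cluster resources instead of running FIFO). Below is a minimal stand-alone Scala sketch of that dispatch pattern; handleRequest is a hypothetical placeholder for the Spark action (e.g. a collect() on a shared context) that each request would trigger:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ConcurrentJobs {
  // Hypothetical stand-in for the per-request Spark action, e.g.
  // sc.parallelize(data).map(f).collect() against one shared SparkContext.
  def handleRequest(id: Int): Int = id * 2

  def main(args: Array[String]): Unit = {
    // One shared context, many concurrent submissions: each server thread
    // wraps its Spark action in a Future, so jobs overlap rather than queue
    // behind one another.
    val jobs = (1 to 4).map(id => Future(handleRequest(id)))
    val results = Await.result(Future.sequence(jobs), 10.seconds)
    println(results.sum)
  }
}
```

The same pattern applies with a real SparkContext: submit from several threads and the scheduler interleaves the resulting jobs' tasks across the executors.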



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
