Yes, there are Kafka consumers/producers for almost all languages; you can
read more here:
https://cwiki.apache.org/confluence/display/KAFKA/Clients#Clients-PHP
Here's a repo for the PHP client: https://github.com/EVODelavega/phpkafka
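
Producing to Kafka from your web tier is just a client call. As a rough
sketch of the producer side, here's the Java client used from Scala (the
phpkafka library linked above exposes the same idea from PHP); the broker
address, topic name, and payload below are placeholders:

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

    object RequestProducer {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        // Placeholder broker address -- point this at your Kafka cluster
        props.put("bootstrap.servers", "localhost:9092")
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

        val producer = new KafkaProducer[String, String](props)
        // "requests" is a placeholder topic name; the web service would send
        // one record per incoming request instead of opening a raw socket
        producer.send(new ProducerRecord[String, String]("requests", "request-payload"))
        producer.close()
      }
    }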

Thanks
Best Regards

On Sun, Oct 18, 2015 at 12:58 PM, <tarek.abouzei...@yahoo.com> wrote:

> Hi Akhil,
>
> It's a must to push data to a socket, as I am using PHP as a web service to
> push data to the socket; Spark then catches the data on that socket and
> processes it. Is there a way to push data from PHP to Kafka directly?
>
> --  Best Regards, -- Tarek Abouzeid
>
>
>
> On Sunday, October 18, 2015 10:26 AM, "tarek.abouzei...@yahoo.com" <
> tarek.abouzei...@yahoo.com> wrote:
>
>
> Hi Xiao,
> 1- The requests are not similar at all, but they use Solr and sometimes do
> a commit.
> 2- No caching is required.
> 3- The throughput must be very high, yes; the requests are tiny, but the
> system may receive 100 requests/sec.
> Does Kafka support listening to a socket?
>
> --  Best Regards, -- Tarek Abouzeid
>
>
>
> On Monday, October 12, 2015 10:50 AM, Xiao Li <gatorsm...@gmail.com>
> wrote:
>
>
> Hi, Tarek,
>
> It is hard to answer your question. Are these requests similar? Are you
> caching your results or intermediate results in your application? Or is
> your throughput requirement very high? Are you throttling the number of
> concurrent requests? ...
>
> As Akhil said, Kafka might help in your case. Otherwise, you will need to
> read the design documents, or even the source code, of Kafka and Spark
> Streaming.
>
>  Best wishes,
>
> Xiao Li
>
>
> 2015-10-11 23:19 GMT-07:00 Akhil Das <ak...@sigmoidanalytics.com>:
>
> Instead of pushing your requests to the socket, why don't you push them to
> Kafka or any other message queue and use Spark Streaming to process them?
>
> Thanks
> Best Regards
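
For reference, here's a rough sketch of the Spark side of what Akhil is
suggesting, assuming the spark-streaming-kafka artifact and its direct
stream API (available since Spark 1.3); the broker list, topic name, and
handleRequest logic are placeholders:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object RequestStreamProcessor {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("RequestStreamProcessor"), Seconds(1))

        // Placeholder broker list and topic name
        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
        val topics = Set("requests")

        // Each Kafka partition maps to an RDD partition, so messages are
        // consumed and processed in parallel within this one streaming job
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        stream.map(_._2).foreachRDD { rdd =>
          rdd.foreach(handleRequest) // runs on the executors in parallel
        }

        ssc.start()
        ssc.awaitTermination()
      }

      // Placeholder for the existing per-request processing
      def handleRequest(request: String): Unit = println(request)
    }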
>
> On Mon, Oct 5, 2015 at 6:46 PM, <tarek.abouzei...@yahoo.com.invalid>
> wrote:
>
> Hi,
> I am using Scala, and I wrote a socket program to catch multiple requests
> at the same time, which then calls a function that uses Spark to handle
> each one. I have a multi-threaded server to handle the multiple requests
> and pass each to Spark, but there's a bottleneck: Spark doesn't start a
> sub-task for each new request. Is it even possible to do parallel
> processing using a single Spark job?
> Best Regards,
>
> --  Best Regards, -- Tarek Abouzeid
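
On the original question: yes, a single Spark Streaming job processes
incoming data in parallel. As a minimal sketch, assuming the requests arrive
as lines on a socket (host, port, and the per-request logic are
placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object SocketRequestProcessor {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("SocketRequestProcessor"), Seconds(1))

        // Placeholder host/port: Spark connects here and reads
        // newline-delimited requests
        val requests = ssc.socketTextStream("localhost", 9999)

        // Each micro-batch is an RDD whose partitions are processed in
        // parallel across the executors, so one streaming job handles many
        // concurrent requests without a hand-rolled multi-threaded server
        requests.foreachRDD(rdd => rdd.foreach(request => println(request)))

        ssc.start()
        ssc.awaitTermination()
      }
    }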