Hi,
I am running a Kafka POC with the below details:
* 3-node cluster (4 cores, 16 GB RAM each) running Kafka 0.8.2.1.
* Each node runs a Kafka broker and a ZooKeeper instance (so a total of 3
Kafka brokers and 3 ZooKeeper servers).
When I tried to create a topic using kafka-topics.sh, I observed the
ties ?
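(For reference, topic creation against a 3-node 0.8.2 cluster is usually invoked
roughly as below; the ZooKeeper connect string, topic name, and partition count
are placeholders, not values from your setup.)

  bin/kafka-topics.sh --create \
    --zookeeper zk1:2181,zk2:2181,zk3:2181 \
    --replication-factor 3 \
    --partitions 3 \
    --topic test-topic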
Regards,
Prabhjot
On Aug 6, 2015 1:43 PM, "Hemanth Abbina" wrote:
> Hi,
>
> I am running a Kafka POC with the below details:
>
> * 3-node cluster (4 cores, 16 GB RAM each) running Kafka 0.8.2.1.
>
> * Each node runs a Kafka broker and a ZooKeeper instance
>
> You attribute the server id to each machine by creating a file named
> myid, one for each server, which resides in that server's data
> directory, as specified by the configuration file parameter *dataDir*.
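>
> As an illustration (host names and paths are placeholders, not taken from
> this cluster), a 3-server ensemble keeps the same server list in every
> zoo.cfg and a distinct myid on each machine:
>
>   # zoo.cfg (identical on all three nodes)
>   dataDir=/var/lib/zookeeper
>   clientPort=2181
>   server.1=node1:2888:3888
>   server.2=node2:2888:3888
>   server.3=node3:2888:3888
>
>   # on node1 only (write 2 on node2, 3 on node3)
>   echo 1 > /var/lib/zookeeper/myid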
>
> On Fri, Aug 7, 2015 at 5:59 PM, Hemanth Abbina wrote:
>
> > Yes. I have set
Hi,
Our application receives events through an HAProxy server over HTTPS; these
events should be forwarded to and stored in a Kafka cluster.
What would be the best option for this?
This layer should receive events from HAProxy and produce them to the Kafka
cluster in a reliable and efficient way (and it should scale).
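One common shape for such a layer is a thin, stateless HTTP service behind
HAProxy that hands each request body to a shared Kafka producer. Below is a
minimal sketch of the producer side only, assuming the new Java producer that
ships with 0.8.2; the broker list, topic name, and the produceEvent helper are
placeholders rather than a recommended design.

  import java.util.Properties;
  import org.apache.kafka.clients.producer.Callback;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerRecord;
  import org.apache.kafka.clients.producer.RecordMetadata;

  public class EventForwarder {
      // One thread-safe producer instance shared by all request handler threads.
      private static final KafkaProducer<String, String> producer = createProducer();

      private static KafkaProducer<String, String> createProducer() {
          Properties props = new Properties();
          // Placeholder broker list for the 3-node cluster.
          props.put("bootstrap.servers", "node1:9092,node2:9092,node3:9092");
          // Wait for the in-sync replicas to acknowledge each write.
          props.put("acks", "all");
          props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
          props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
          return new KafkaProducer<String, String>(props);
      }

      // Called once per HTTP request body received from HAProxy.
      public static void produceEvent(String jsonBody) {
          // send() is asynchronous; the callback surfaces failures instead of
          // dropping them silently.
          producer.send(new ProducerRecord<String, String>("events", jsonBody),
                  new Callback() {
                      public void onCompletion(RecordMetadata metadata, Exception e) {
                          if (e != null) {
                              System.err.println("Failed to produce event: " + e);
                          }
                      }
                  });
      }
  }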
Although now that I'm looking at it more, it looks like they're working on a
MySQL storage engine?
Anyway yeah, I'd love some discussion on this, or war stories of migration to
Kafka from other event systems (F/OSS or...bespoke).
On Wed, Aug 26, 2015 at 3:45 PM, Hemanth Abbina wrote:
at solution you end up with if each of
the requests to HAProxy only contains one message).
-Ewen
On Wed, Aug 26, 2015 at 5:05 PM, Hemanth Abbina wrote:
> Marc,
>
> Thanks for your response. Let me add some more details on the problem.
>
> As I already mentioned in the
with a lower request
rate, the messages will still be sent to the broker immediately.
-Ewen
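(For context: in the new Java producer, batching is controlled mainly by
batch.size and linger.ms; with the default linger.ms=0 the sender ships
records as soon as it can, which is consistent with the behaviour described
above. Illustrative values only, in the same Properties style as the earlier
sketch:)

  // linger.ms defaults to 0, i.e. send as soon as the sender thread is free.
  props.put("batch.size", "16384"); // max bytes buffered per partition per batch
  props.put("linger.ms", "5");      // wait up to 5 ms for more records per batch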
On Wed, Aug 26, 2015 at 9:31 PM, Hemanth Abbina wrote:
> Ewen,
>
> Thanks for the explanation.
>
> We have control over the log format coming to HAProxy. Right now,
> these are plain
I'm new to Flume and am thinking of using it in the below scenario.
Our system receives events as HTTP POSTs, and we need to store them in Kafka
(for processing) as well as in HDFS (as a permanent store).
Can we configure Flume as below?
* Source: HTTP (expecting a JSON event as the HTTP body, with a
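A rough sketch of an agent along those lines, assuming the built-in HTTP source
with its JSON handler fanning out through two channels to a Kafka sink and an
HDFS sink (agent name, port, broker list, topic, and HDFS path below are all
placeholders):

  a1.sources  = http-src
  a1.channels = kafka-ch hdfs-ch
  a1.sinks    = kafka-sink hdfs-sink

  # HTTP source; the default JSONHandler expects a JSON array of events in the body
  a1.sources.http-src.type = http
  a1.sources.http-src.port = 8080
  a1.sources.http-src.handler = org.apache.flume.source.http.JSONHandler
  # the default replicating selector copies every event into both channels
  a1.sources.http-src.channels = kafka-ch hdfs-ch

  a1.channels.kafka-ch.type = memory
  a1.channels.hdfs-ch.type = memory

  # Kafka sink (Flume 1.6 property names)
  a1.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
  a1.sinks.kafka-sink.brokerList = node1:9092,node2:9092,node3:9092
  a1.sinks.kafka-sink.topic = events
  a1.sinks.kafka-sink.channel = kafka-ch

  # HDFS sink writing the raw event bodies
  a1.sinks.hdfs-sink.type = hdfs
  a1.sinks.hdfs-sink.hdfs.path = hdfs://namenode/flume/events/%Y-%m-%d
  a1.sinks.hdfs-sink.hdfs.fileType = DataStream
  a1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
  a1.sinks.hdfs-sink.channel = hdfs-ch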