Thanks Hans - this makes sense, except that the debug messages give me
exactly what I need without having to instrument any clients. It should be
noted that for now I am running a single server, so perhaps the messages
change when I cluster?
I may have caused confusion by mentioning that I want to
I can tell from the terminology you use that you are familiar with traditional
message queue products. Kafka is very different. That's what makes it so
interesting and revolutionary, in my opinion.
Clients do not connect to topics, because Kafka is a distributed and clustered
system where topics
Hi All - just started to use Kafka. Just one thing driving me nuts. I want
to get logs of each time a publisher or subscriber connects. I am trying to
just get the IP that they connected from and the topic to which they
connected. I have managed to do this through enabling debug in the
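For reference, connection-level detail like this is usually surfaced by turning up the broker-side loggers in log4j.properties. The logger names below follow the stock config shipped with Kafka, but verify them against your version; a sketch:

```
# Request-level logging (includes the client address per request)
log4j.logger.kafka.request.logger=DEBUG, requestAppender
log4j.additivity.kafka.request.logger=false

# Network/request-channel events
log4j.logger.kafka.network.RequestChannel$=DEBUG, requestAppender
log4j.additivity.kafka.network.RequestChannel$=false
```

DEBUG here is verbose; it is best kept to troubleshooting sessions rather than left on in production.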
I think the literature from Confluent and the ASF, along with the community
support here, is the best way to learn about streaming.
On Mon, Jan 13, 2020 at 6:47 PM M. Manna wrote:
> Hey Sachin,
>
> On Mon, 13 Jan 2020 at 05:12, Sachin Mittal wrote:
>
> > Hi,
> > The way I have used streams processing in past; use case
Hey Sachin,
On Mon, 13 Jan 2020 at 05:12, Sachin Mittal wrote:
> Hi,
> The way I have used streams processing in past; use case to process streams
> is when you have a continuous stream of data which needs to be processed
> and used by certain applications.
> Since in kafka streams can be a
Hi,
The way I have used stream processing in the past: the use case for
processing streams is when you have a continuous stream of data which needs
to be processed and used by certain applications.
Since a Kafka Streams application can be a simple Java application, it
can run in its own JVM, which is
Hello,
Even though I have been using Kafka for a while, it has been primarily for
publish/subscribe event messaging (which I understand reasonably well).
But I would like to do more with streams.
To get started, I have been going through the code in the "examples"
folder. I would like
We are using both and leaning towards a web service fronting Kafka, because it
gives us the ability to centralize other logic. That said, I don't think the
web service will be much more "stable", and you'll need to consider what to do
with your audit records if the web service call fails.
-Dave
I don’t feel it would be a big hit to performance, because Kafka is very
fast; I think the speed difference would be negligible. Why are you worried
about stability? I’m just curious, because it doesn’t seem like it would be
unstable, but maybe it would be a bit of overkill for one app and some
Hi All,
We have a Spring based web app.
We are planning to build an 'Audit Tracking' feature and plan to use Kafka
- as a sink for storing Audit messages (which will then be consumed and
persisted to a common DB).
We are planning to build a simple, ‘pass-through’ REST service which will
take a
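For an audit sink like this, the producer settings matter more than the REST layer: the goal is not to lose records. The sketch below builds a durability-oriented producer configuration; the broker address is a placeholder, and the exact settings should be tuned to your cluster.

```java
import java.util.Properties;

public class AuditProducerConfig {
    // Producer settings geared toward not losing audit records.
    // "localhost:9092" is a placeholder broker address.
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");                 // wait for the full ISR to acknowledge
        props.put("enable.idempotence", "true");  // avoid duplicates on retry
        props.put("retries", Integer.toString(Integer.MAX_VALUE));
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(AuditProducerConfig.build().getProperty("acks"));
    }
}
```

These properties would be passed to a `KafkaProducer` inside the pass-through service; if the send still fails after retries, the service needs a fallback (local spool, dead-letter store) for the audit record.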
Hi
It seems that your keytab doesn't have the principal you configured your
"client" section to use. Post your JAAS config here if you want further
help, but basically you should be able to do
kinit -V -k -t
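For comparison, a client-side JAAS file for Kerberos typically looks like the sketch below; the principal and keytab path are placeholders and must match what `klist -k <your keytab>` actually shows:

```
// Illustrative kafka_client_jaas.conf -- principal and keyTab are placeholders.
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/kafka_client.keytab"
  principal="kafka-client@EXAMPLE.COM";
};
```

If `kinit -V -k -t` with that keytab and principal fails, the problem is in the keytab/KDC, not in Kafka.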
On 18 Feb. 2017 3:56 am, "Raghav" wrote:
Hi
I
Hi
I am trying to set up a simple configuration with one Kafka broker and
ZooKeeper on the same VM, and one producer and one consumer on each VM. I
have set up a KDC on a CentOS VM.
I am trying to follow this guide:
http://docs.confluent.io/2.0.0/kafka/sasl.html#kerberos
When I start Kafka, it errors out
I tried both the approaches stated above, with no luck :(.
Let me give concrete examples of what I am trying to achieve here:
1) A Kafka producer adds multiple JSON messages to a particular topic on the
message broker (done, this part works).
2) I want to have multiple consumers identified under a
Hi Pradeep,
I think I misinterpreted your question. Are you trying to consume a topic
multiple times with each consumer instance, or consume one topic with
multiple consumer instances?
If you want to consume a topic multiple times with one consumer
instance, `seekToBeginning` will reset
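A sketch of that pattern (not standalone: it assumes kafka-clients on the classpath, a reachable broker, and a recent client version; the topic and group id are placeholders). One subtlety is that with `subscribe()` the consumer has no partition assignment until the first `poll()`, so seek only after that:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReplayFromStart {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder
        props.put("group.id", "replay-demo");               // placeholder
        props.put("enable.auto.commit", "false");           // don't commit while replaying
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
            consumer.poll(Duration.ofMillis(0));      // triggers partition assignment
            consumer.seekToBeginning(consumer.assignment());
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> r : records) {
                System.out.printf("%s@%d: %s%n", r.topic(), r.offset(), r.value());
            }
        }
    }
}
```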
Pradeep,
How about
https://kafka.apache.org/090/javadoc/org/apache/kafka/clients/consumer/KafkaConsumer.html#seekToBeginning%28org.apache.kafka.common.TopicPartition...%29
-Harsha
On Sat, Apr 9, 2016, at 09:48 PM, Pradeep Bhattiprolu wrote:
> Liquan , thanks for the
Liquan, thanks for the response.
By setting auto commit to false, do I have to manage the offsets
manually?
I am running multiple threads, with each thread being a consumer; it would
be complicated to manage offsets across threads if I don't use Kafka's
automatic consumer group abstraction.
Hi Pradeep,
Can you try setting enable.auto.commit = false if you want to read from the
earliest offset? According to the documentation, auto.offset.reset controls
what to do when there is no initial offset in Kafka, or if the current
offset no longer exists on the server (e.g. because that
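To make the interaction concrete: `auto.offset.reset=earliest` only takes effect when the group has no committed offset, so either disable auto commit (so nothing gets committed) or use a fresh group id. A minimal, stdlib-only sketch of the relevant settings (broker address and group id are placeholders):

```java
import java.util.Properties;

public class EarliestOffsetConfig {
    // Consumer settings for re-reading a topic from the earliest offset.
    // auto.offset.reset only applies when the group has no committed
    // offset, hence the fresh group.id and disabled auto commit.
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder
        props.put("group.id", "fresh-group-" + System.currentTimeMillis());
        props.put("enable.auto.commit", "false");
        props.put("auto.offset.reset", "earliest");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(EarliestOffsetConfig.build().getProperty("auto.offset.reset"));
    }
}
```

With these properties passed to a `KafkaConsumer`, each run starts from the beginning of the topic rather than from a previously committed position.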
Hi All
I am a newbie to Kafka. I am using the new consumer API in a thread acting
as a consumer for a topic in Kafka.
For testing and other purposes, I have read the queue multiple times
using Kafka's console-consumer.sh script.
To start reading messages from the beginning in my Java code