Sorry my bad, didn't notice replies go back to the group.
Hi Justin,
I am interested in the course.
Regards,
David
On Mon, Mar 7, 2022, 7:11 AM Justin Mclean wrote:
>
> Hi,
>
> I’m from OpenSI, a not for profit open source foundation. We have a
> limited number of free spots for a 6 week online Apache Kafka course
> starting on the 22nd of March. It
Can someone clarify the usage and requirements of consumer ID when
used/specified?
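For context on the question above: the Kafka consumer config has two identity-like settings that "consumer ID" could refer to. A minimal sketch (the property names are the real Kafka consumer config keys; the values and the helper function are made up for illustration):

```python
# Sketch of the two "consumer ID" settings in Kafka's consumer config.
# "group.id" is required for group management and committed offsets:
# consumers sharing a group.id split a topic's partitions between them.
# "client.id" is an optional logical label used in broker logs and metrics.
consumer_config = {
    "bootstrap.servers": "localhost:9092",  # assumption: local broker
    "group.id": "my-consumer-group",        # determines group membership
    "client.id": "reporting-consumer-1",    # purely a label; safe to omit
}

def describe(config):
    """Return a short human-readable summary of the identity settings."""
    return f"group={config['group.id']} client={config.get('client.id', '<default>')}"

print(describe(consumer_config))
```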
I came across this topic and brought it up in a comment in the thread;
someone else suggested it was worthy of its own post.
Is there a GitHub source repo for the Dockerfile/image setup, or is only
the final Docker image available in the Docker repo?
On Sun, Nov 26, 2017 at 11:53 AM, Christian F. Gonzalez Di Antonio <
christian...@gmail.com> wrote:
> I would like to share my @apachekafka
Thank you in advance for any information.
> With Regards
> Tomasz Rojek
> Java Engineer
--
David Luu
Member of Technical Staff
Mist Systems, Inc.
1601 S. De Anza Blvd. #248
Cupertino, CA 95014
What happens when you run it as a Windows service? And if you save/dump the
consumed data somewhere other than the console (for a Windows service)?
On Thu, Nov 3, 2016 at 1:11 AM, Birendra Kumar Singh
wrote:
> Hi
>
> I need help in creating a windows service to consume from kafka topic.
Hello,
This is my first time trying Kafka in Java. I was following this tutorial:
https://cwiki.apache.org/confluence/display/KAFKA/0.8.0+Producer+Example
for the sample code & Maven dependencies. I tweaked the code a little for
my needs and ran it in a Thread-based class with the main
What's the best way to consume messages from a starting offset to an ending
offset? It seems the low-level consumer API,
kafka-simple-consumer-shell.sh, and kafkacat offer an option to consume
from a starting offset, but there's no option to tell them to stop consuming
after a given offset. One
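Not from the thread, but the missing "stop after offset" check is straightforward to add in client code. A minimal sketch, assuming a consumer already positioned at the start offset (stubbed here as a plain callable rather than a real Kafka client, which would use seek()):

```python
def consume_range(poll, start_offset, end_offset):
    """Collect messages whose offsets fall in [start_offset, end_offset].

    `poll` is any callable returning one (offset, message) pair per call,
    or None when the log is exhausted -- a stand-in for a real consumer
    that has already been positioned at start_offset via seek().
    """
    out = []
    while True:
        rec = poll()
        if rec is None:
            break
        offset, msg = rec
        if offset < start_offset:
            continue  # defensive; a seeked consumer starts at start_offset
        if offset > end_offset:
            break     # the "stop after offset" check the tools lack
        out.append(msg)
    return out

# Simulated log: offsets 0..9 carrying payloads "m0".."m9"
log = iter([(i, f"m{i}") for i in range(10)])
print(consume_range(lambda: next(log, None), start_offset=3, end_offset=6))
# -> ['m3', 'm4', 'm5', 'm6']
```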
Scenario:
A Kafka + ZooKeeper setup with topics that span partitions. Obviously the
high-level consumer is desirable here.
But let's say we need to consume by offset, and we don't want the
earliest offset, as that has extra messages we'd have to filter out, and
from the latest offset doesn't have
For microservices, there seem to be suggestions to test by contract, e.g.
with Pact:
https://github.com/realestate-com-au/pact
but these generally seem targeted at REST, and the implementations are
limited to a few languages so far.
I was wondering if anyone has adapted those
If one wanted to check the response time for a consumer fetching a topic
message (on the client side), similar to checking an HTTP request's
response time on the web, what's the best approach to take?
I notice the Kafka shell scripts have some startup
overhead if used to assess
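One way to sidestep that startup overhead is to time a single fetch in-process. A minimal sketch, with a stubbed fetch function standing in for one real consumer poll:

```python
import time

def timed_fetch(fetch):
    """Run one fetch call and return (result, elapsed_seconds).

    `fetch` stands in for a single consumer poll; measuring in-process
    avoids the JVM/script startup cost of timing the shell tools.
    """
    start = time.perf_counter()
    result = fetch()
    return result, time.perf_counter() - start

def fake_fetch():
    """Pretend broker round trip; replace with a real poll() call."""
    time.sleep(0.05)
    return ["message-1"]

msgs, elapsed = timed_fetch(fake_fetch)
print(msgs, f"{elapsed:.3f}s")
```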
I was wondering: do the Kafka consumer shell scripts (high- and low-level
ones) and kafkacat do any pre-processing of topic messages before
outputting to stdout, or do they output them "as is", in the format the
message originally came through Kafka from the producer?
Meaning, pretty-printed
Hi,
Similar to consuming a max of N messages (or a count of N messages in
kafkacat), is there any way/option to consume starting from offset A up to
an ending offset B (we know only the offsets, not the total # of messages
between them)?
It looks like that's not available natively? And I guess one
I'd like to generate load against a system we have that uses Kafka as the
message bus. We have a custom JSON message format, and to properly load
test the system, each set of messages for a particular scenario (i.e. user)
needs to have a unique identifier, which it normally does.
I'm thinking of using
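A sketch of one common approach: stamp each scenario's messages with a fresh UUID before producing them. The JSON field names below are placeholders, not the poster's actual custom format:

```python
import json
import uuid

def make_scenario_messages(scenario, payloads):
    """Build one scenario's messages, all sharing a fresh unique id.

    The field names ("scenario", "scenario_id", "event") are hypothetical
    placeholders for whatever the real custom JSON schema uses.
    """
    scenario_id = str(uuid.uuid4())  # unique per simulated user/scenario
    return [
        json.dumps({"scenario": scenario, "scenario_id": scenario_id, "event": p})
        for p in payloads
    ]

# Two runs of the same scenario get distinct ids
batch_a = make_scenario_messages("login-flow", ["start", "auth", "done"])
batch_b = make_scenario_messages("login-flow", ["start", "auth", "done"])
print(len(batch_a),
      json.loads(batch_a[0])["scenario_id"] != json.loads(batch_b[0])["scenario_id"])
```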
The toy project idea is good. Another option, I think, could be to look at
the various Kafka client language bindings and/or utilities (like
kafkacat). From there, another option is to build a client language
binding for a language that's lacking Kafka support; some have
better support
I got your email from the list?
On Fri, Aug 21, 2015 at 1:56 PM, Rajiv Kurian ra...@signalfuse.com wrote:
Wondering why my emails to the mailing list don't go through.
Hi,
I notice the kafka-console-consumer.sh script has an option to fetch a max
# of messages, which can be 1 or more, and then exit, which is nice. But as
a high-level consumer it's missing an option to fetch from a given offset
other than the earliest/latest offsets.
Is there any off the shelf tool (CLI,
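For what it's worth, later Kafka versions added exactly this to the console consumer (worth verifying against your installed version): `--offset` accepts a numeric starting offset (or earliest/latest) when combined with `--partition`, alongside `--max-messages`. The broker address and topic name below are assumptions:

```shell
# Assumes a broker on localhost:9092 and a topic named "my-topic".
# --offset takes a number, "earliest", or "latest", and requires --partition.
bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic my-topic \
  --partition 0 \
  --offset 42 \
  --max-messages 10
```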