Matthias,
The instances are transient (Mesos) so when we roll them we get a brand new
instance.
-russ
On Wed, Feb 7, 2018 at 4:53 PM, Matthias J. Sax wrote:
Russell,
> Yes. We used the reset tool before deploying with 1.1.
Did you also clean up local state? The tool only takes care of "broker-side"
cleanup. You would need to delete local state by calling
KafkaStreams#cleanUp() before restart, or by deleting the corresponding
local state directory.
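If you go the directory route, a minimal sketch (assuming the default
state.dir of /tmp/kafka-streams and a hypothetical application.id of
"my-streams-app"; adjust both if you override them in your StreamsConfig):

```shell
# A sketch, assuming the default state.dir and a hypothetical app id;
# adjust both if your StreamsConfig overrides them.
STATE_DIR=/tmp/kafka-streams
APP_ID=my-streams-app

# Simulate leftover local task state so the wipe has something to remove.
mkdir -p "$STATE_DIR/$APP_ID/0_0"

# Delete the application's local state before restarting the instances.
rm -rf "${STATE_DIR:?}/$APP_ID"
```

Run this on each instance while it is stopped; calling cleanUp() from
inside the application (before start() or after close()) achieves the same
thing.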
Russell,
INFO level is fine and it could be just the portion of the logs right after
streams has finished rebalancing.
You can tar them up and attach to this mailing list unless you'd prefer not
to do so, in which case I can send you my email address directly.
Thanks,
Bill
On Wed, Feb 7, 2018
Matthias,
Yes. We used the reset tool before deploying with 1.1.
-russ
On Wed, Feb 7, 2018 at 4:23 PM, Matthias J. Sax wrote:
Did you start the app from scratch, i.e., wipe out all state, before you
restarted with 1.1? If not, reusing existing stores would overrule a
more balanced deployment.
You can set a new application.id or, better, use the reset tool to reset
the application completely (maybe just calling
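For reference, a sketch of invoking the reset tool that ships with Kafka
(the application id, topic names, and broker address below are placeholders
for your own values):

```shell
# Placeholders throughout; stop all application instances before running.
bin/kafka-streams-application-reset.sh \
  --application-id my-streams-app \
  --bootstrap-servers broker1:9092 \
  --input-topics inbound-topic \
  --intermediate-topics intermediate-topic
```

Remember this only resets broker-side state; local state still needs the
cleanup described above.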
Matthias,
Disregard the exception I mentioned. I think that was a transient error
caused by our broker cluster re-spinning.
-russ
On Wed, Feb 7, 2018 at 3:45 PM, Russell Teabeault wrote:
Bill,
I may be able to.
- What logging level?
- Do you need logs from all the instances?
- Where should I send them?
-russ
On Wed, Feb 7, 2018 at 4:12 PM, Bill Bejeck wrote:
Russell,
Can you share any log files?
Thanks,
Bill
On Wed, Feb 7, 2018 at 5:45 PM, Russell Teabeault <rteabea...@twitter.com.invalid> wrote:
Hi Matthias,
Thanks for the prompt reply. We have built the kafka-streams jar from the
1.1 branch and deployed our instances. We are only able to upgrade Kafka
Streams to 1.1 and cannot upgrade the brokers to 1.1. I don't think that
should matter, though. Yes?
It does not seem to have
+1.
While validating the web docs I realized that there are a bunch of broken
javadoc links. I have filed https://github.com/apache/kafka/pull/4543 and
plan to cherry-pick to 1.0/1.1. But I think this does not need to block the
voting: we just need to copy-paste the web docs again after it is
Is it possible for a consumer to automatically begin reading the latest
messages (rather than from the previously read offset) after a rebalance
has occurred with the brokers?
This could be something similar to the "auto.offset.reset" setting or if
there's some way to catch the
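There is no single consumer setting for this, but one way (a sketch only,
not tested against a live cluster; the topic name, group id, and broker
address are placeholders) is to register a ConsumerRebalanceListener and
seek to the end of each newly assigned partition:

```java
import java.util.Collection;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class LatestAfterRebalance {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // placeholder
        props.put("group.id", "my-group");              // placeholder
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("my-topic"),
            new ConsumerRebalanceListener() {
                @Override
                public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
                    // Commit or flush here if needed before losing the partitions.
                }

                @Override
                public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
                    // Skip past the committed offsets: start from the latest
                    // message on every partition this instance just received.
                    consumer.seekToEnd(partitions);
                }
            });
        // poll() loop as usual...
    }
}
```

Note that auto.offset.reset only applies when there is no committed offset
at all, which is why a listener like this is needed after a rebalance.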
Kafka users and developers,
The next *Bay Area Apache Kafka Meetup* is on the *morning of Feb 20* and
is hosted by *Index Developer Conference* at Moscone West in San Francisco.
Meetup Info: https://www.meetup.com/KafkaBayArea/events/247433783/
Registration Link: https://ibm.co/2n742Jn
Hi guys,
Has anyone run into the following issue, or can you give me a suggestion
for addressing it? Thanks.
2018-02-07 18:59:59,783 [myid:9] - INFO [NIOServerCxn.Factory:0.0.0.0/
0.0.0.0:2181:NIOServerCnxn@1040] - Closed socket connection for client /
10.92.74.216:27897 (no session established for
It's a known issue and we addressed it already via
https://issues.apache.org/jira/browse/KAFKA-4969
The fix will be part of the upcoming 1.1 release, but you could try it out
immediately by running from trunk or the 1.0 branch. (If you do, feedback
would be very welcome :))
Your proposed workarounds should
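If you want to try the fix before 1.1 ships, a sketch of building from
source (assuming a local Gradle install, since the 1.0-era build needs a
plain `gradle` run to bootstrap the wrapper):

```shell
git clone https://github.com/apache/kafka.git
cd kafka
git checkout 1.0   # or stay on trunk
gradle             # bootstrap the Gradle wrapper
./gradlew jar      # builds the jars, including kafka-streams
```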
We are using Kafka Streams for a project and had some questions about how
stream tasks are assigned.
streamBuilder
.stream("inbound-topic", Consumed.`with`(keySerde, valueSerde))
... // Do some stuff here
.through("intermediate-topic")
... // Do some other stuff here
In this example we
Hi Ewen,
+1
Building from source and running the quickstart were successful on Ubuntu
and Windows 10.
Thanks for running the release.
--Vahid
From: Ewen Cheslack-Postava
To: d...@kafka.apache.org, users@kafka.apache.org,
kafka-clie...@googlegroups.com
Date:
Thanks a lot, Guozhang. I was able to nail it down by looking at the log
you suggested. The log revealed that it was trying to connect to
localhost, and it was a problem with one of my subcomponents. It was trying
to read the broker configuration from a different property file which
didn't