Re: Flink read avro from Kafka Connect Issue

2016-11-02 Thread Tauzell, Dave
Is Kafka Connect adding some bytes to the beginning of the Avro with the schema registry id? Dave > On Nov 2, 2016, at 18:43, Will Du wrote: > > By using the kafka-avro-console-consumer I am able to get rich messages from > Kafka Connect with AvroConverter, but it got no
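
A minimal sketch of unwrapping that framing, assuming the Confluent wire format (a magic byte of 0 followed by a 4-byte schema id before the Avro payload); the decode helper and its writerSchema parameter are illustrative, standing in for the schema you would normally fetch from the registry by that id:

    import java.io.IOException;
    import java.nio.ByteBuffer;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryDecoder;
    import org.apache.avro.io.DecoderFactory;

    public class ConfluentWireFormat {

        // Unwrap a value written by Kafka Connect's AvroConverter:
        // byte 0 is a magic byte (0), bytes 1-4 are the schema registry id,
        // and the Avro-encoded record starts at offset 5.
        public static GenericRecord decode(byte[] message, Schema writerSchema) throws IOException {
            ByteBuffer buffer = ByteBuffer.wrap(message);
            byte magic = buffer.get();
            if (magic != 0) {
                throw new IllegalArgumentException("Unexpected magic byte: " + magic);
            }
            int schemaId = buffer.getInt(); // normally used to fetch the writer schema from the registry

            BinaryDecoder decoder = DecoderFactory.get()
                    .binaryDecoder(message, 5, message.length - 5, null);
            return new GenericDatumReader<GenericRecord>(writerSchema).read(null, decoder);
        }
    }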

Re: consumer_offsets partition skew and possibly ignored retention

2016-11-02 Thread James Brown
Jeff: This was with 0.9.0.1. It has not recurred since upgrading to 0.10.1.0. On Fri, Oct 28, 2016 at 9:28 PM, Jeff Widman wrote: > James, > What version did you experience the problem with? > > On Oct 28, 2016 6:26 PM, "James Brown" wrote: > > > I was

Re: Flink read avro from Kafka Connect Issue

2016-11-02 Thread Will Du
By using the kafka-avro-console-consumer I am able to get rich messages from Kafka Connect with AvroConverter, but it got no output except the schema from Flink. By using the producer with defaultEncoding, the kafka-avro-console-consumer throws exceptions, but the Flink consumer works. But my

Flink read avro from Kafka Connect Issue

2016-11-02 Thread Will Du
On Nov 2, 2016, at 7:31 PM, Will Du wrote: Hi folks, I am trying to consume Avro data from Kafka in Flink. The data is produced by Kafka Connect using AvroConverter. I have created an AvroDeserializationSchema.java

Re: consumer_offsets partition skew and possibly ignored retention

2016-11-02 Thread Chi Hoang
I tried running reassignment on the topic, but that didn't help. I had to restart the broker for it to release the file handles, then delete them manually. On Fri, Oct 28, 2016 at 6:25 PM, James Brown wrote: > I was having this problem with one of my __consumer_offsets

Re: Kafka Streams fails permanently when used with an unstable network

2016-11-02 Thread Eno Thereska
Hi Sai, For your second note on rebalancing taking a long time, we have just improved the situation in trunk after fixing this JIRA: https://issues.apache.org/jira/browse/KAFKA-3559 . Feel free to give it a go if rebalancing time continues to

Re: Modify in-flight messages

2016-11-02 Thread Eno Thereska
Hi Dominik, Not sure if this is 100% relevant, but since I noticed you saying that you are benchmarking stream processing engines, one way to modify a message would be to use the Kafka Streams library, where you consume a message from a topic, modify it as needed/do some processing, and then
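
A minimal sketch of that consume-modify-produce pattern with the Streams DSL (0.10.x-era API); the topic names, application id, and the uppercasing transformation are illustrative:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;

    public class ModifyInFlight {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "modify-in-flight");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
            props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

            KStreamBuilder builder = new KStreamBuilder();
            // Consume from one topic, transform each value, and produce to another topic.
            KStream<String, String> input = builder.stream("input-topic");
            input.mapValues(value -> value.toUpperCase())
                 .to("output-topic");

            new KafkaStreams(builder, props).start();
        }
    }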

Re: windowing with the processor api

2016-11-02 Thread Hamza HACHANI
Thanks a lot. This was very helpful. Hamza - Reply message - From: "Eno Thereska" To: "users@kafka.apache.org" Subject: windowing with the processor api Date: Wed., Nov 2, 2016 19:18 Thanks Matthias, yes, to get window

Kafka Streams Error

2016-11-02 Thread Furkan KAMACI
I use Kafka 0.10.0.1. I count the messages of a topic as follows: ... streamsConfiguration.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"); ... KStream longs = builder.stream(Serdes.String(), Serdes.String(), "qps-input"); ... KTable longCounts =
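
A hedged sketch of how the truncated snippet might fit together on 0.10.0.x. Only "qps-input" comes from the original; the other names are illustrative, and the explicit Long serde on the output side is the detail that most often trips up count examples:

    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;
    import org.apache.kafka.streams.kstream.KTable;

    public class QpsCounter {
        public static void main(String[] args) {
            Properties streamsConfiguration = new Properties();
            streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "qps-counter");
            streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            streamsConfiguration.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

            KStreamBuilder builder = new KStreamBuilder();
            KStream<String, String> longs = builder.stream(Serdes.String(), Serdes.String(), "qps-input");

            // Count records per key; 0.10.1+ replaces countByKey with groupByKey().count(...).
            KTable<String, Long> longCounts = longs.countByKey(Serdes.String(), "qps-counts");

            // Counts are Longs, so the sink topic needs the Long serde explicitly.
            longCounts.to(Serdes.String(), Serdes.Long(), "qps-output");

            new KafkaStreams(builder, streamsConfiguration).start();
        }
    }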

Added to Wiki please

2016-11-02 Thread Kenny Gorman
Per the wiki, I am emailing the list for this. Can you please add us to https://cwiki.apache.org/confluence/display/KAFKA/Powered+By? Eventador.io (https://www.eventador.io/) is a whole-stack Kafka-as-a-service company. We enable developers to quickly create and painlessly manage real-time

HDFS Connector Compression?

2016-11-02 Thread Henry Kim
Is it possible to add compression to the HDFS Connector out of the box, or does it require a code change? Thanks Henry Kim

Re: windowing with the processor api

2016-11-02 Thread Eno Thereska
Thanks Matthias, yes, to get window operations, or things like hopping or sliding windows, you need to use the DSL (e.g., the TimeWindows class). The Processor API is very basic (and thus flexible), but you'll end up re-implementing TimeWindows. Eno > On 2 Nov 2016, at 17:45, Matthias J. Sax
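
As a concrete illustration of the DSL route (0.10.1-era API), a hopping windowed count; the topic name, store name, and the 60s/30s window values are illustrative:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.kstream.Windowed;

    public class WindowedCount {
        public static void main(String[] args) {
            KStreamBuilder builder = new KStreamBuilder();
            KStream<String, String> input = builder.stream(Serdes.String(), Serdes.String(), "input-topic");

            // 60-second windows advancing every 30 seconds; each count is keyed by (key, window).
            KTable<Windowed<String>, Long> windowedCounts = input
                    .groupByKey(Serdes.String(), Serdes.String())
                    .count(TimeWindows.of(60 * 1000L).advanceBy(30 * 1000L), "windowed-counts");

            // ... hand the builder to a KafkaStreams instance and start() it as usual.
        }
    }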

Re: windowing with the processor api

2016-11-02 Thread Matthias J. Sax
A windowed store does not work the way you expect it to. The parameter "windowSize" is not a store parameter itself, but a caching parameter for the store (only used if caching gets enabled). For window support, Streams provides window semantics on

Re: Question regarding dynamic subscriber environment

2016-11-02 Thread Gerrit Jansen van Vuuren
Yes. You open a connection, and the fetch threads will look at a shared variable for the topics to fetch; it's this shared variable that is updated when you add and remove topics. The connection itself is not closed. There is no relation between a connection and the topics being consumed, other

Re: Question regarding dynamic subscriber environment

2016-11-02 Thread Janagan Sivagnanasundaram
Does this really address the respective problem? The ultimate requirement is that the connection between broker and subscriber should not be terminated; the subscriber is free to change its topic interests without closing the connection. On Wed, Nov 2, 2016 at 12:43 PM, Gerrit Jansen van Vuuren <

Re: connection closed by kafka

2016-11-02 Thread Rajini Sivaram
The broker closes client connections that are idle for a configurable period of time (broker property connections.max.idle.ms). The default idle time is 10 minutes, which matches the close time in the logs. On Wed, Nov 2, 2016 at 2:43 PM, Jaikiran Pai wrote: > Which exact
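
A small sketch of the client-side counterpart, assuming the Java consumer; the group id and the 30-minute value are illustrative, and the broker-side default quoted above is configured in server.properties:

    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class IdleTimeoutConfig {
        public static KafkaConsumer<String, String> newConsumer() {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                    "org.apache.kafka.common.serialization.StringDeserializer");
            // The client also closes its own idle connections (connections.max.idle.ms);
            // raising it here does not change the broker's 10-minute default.
            props.put(ConsumerConfig.CONNECTIONS_MAX_IDLE_MS_CONFIG, 30 * 60 * 1000L);
            return new KafkaConsumer<>(props);
        }
    }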

Re: connection closed by kafka

2016-11-02 Thread Jaikiran Pai
Which exact version of Kafka installation and Kafka client is this? And which language/library of Kafka client? Also, are you describing this situation in the context of producing messages? Can you post your relevant code from the application where you deal with this? Connection management is

log compaction

2016-11-02 Thread Francesco laTorre
Hi, We want to enable log compaction on an existing topic (in production). Is it a safe operation, or are there things to take into consideration? Kafka version 0.8 Cheers, Francesco

Re: 0.8.2.1 client not able to connect to a Kafka 0.10.0.1 cluster even though the cluster has message format version 0.8.2.

2016-11-02 Thread Madhukar Bharti
Hi, After checking we found that there was an issue with the version id passed in ConsumerMetadataRequest. After setting it to 0 (ConsumerMetadataRequest.currentVersion()), it started working! Thanks! Madhukar On Tue, Nov 1, 2016 at 10:29 PM, Madhukar Bharti wrote: > Hi

RE: windowing with the processor api

2016-11-02 Thread Hamza HACHANI
Hi Eno, What I want to say is that I can't find a place to define the size of the window or to specify the advance interval. Hamza Thanks From: Eno Thereska Sent: Tuesday, 1 November 2016 22:44:47 To:

Re: windowing with the processor api

2016-11-02 Thread Eno Thereska
Hi Hamza, Are you getting a particular error? Here is an example: Stores.create("window-store").withStringKeys().withStringValues().persistent().windowed(10, 10, 2, false).build(),
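
The same snippet laid out as it would appear in code (0.10.1-era Processor API); as Matthias explains in his reply above, the window-size argument only feeds the caching layer, so windowing semantics still come from the DSL:

    import org.apache.kafka.streams.processor.StateStoreSupplier;
    import org.apache.kafka.streams.state.Stores;

    public class WindowStoreSupplierExample {
        // A persistent windowed store supplier; attach it to a topology with
        // TopologyBuilder#addStateStore(...). The numeric arguments are Eno's
        // placeholders; check the PersistentKeyValueFactory javadoc of your
        // Streams version for their exact meaning and order.
        static final StateStoreSupplier WINDOW_STORE = Stores.create("window-store")
                .withStringKeys()
                .withStringValues()
                .persistent()
                .windowed(10, 10, 2, false)
                .build();
    }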

windowing with the processor api

2016-11-02 Thread Hamza HACHANI
Hi, I would like to know if somebody has an idea how to define the size of the window in the Processor API. I've been blocked for 6 days looking for a solution. Using Stores.create(...).withStringKeys().withStringValues().persistent().windowed(...).build() I was able to define the

Re: Question regarding dynamic subscriber environment

2016-11-02 Thread Gerrit Jansen van Vuuren
Hi, Have a look at the kafka client lib https://github.com/gerritjvv/kafka-fast#java-1; it already provides this functionality. On Wed, Nov 2, 2016 at 2:34 AM, Janagan Sivagnanasundaram < janagan1...@gmail.com> wrote: > Kafka's current nature does not support dynamic subscriber >