Thanks a lot, Michael.
I used WallclockTimestampExtractor for now.
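In case it helps anyone else, the wiring looks roughly like this (a minimal
sketch against the 0.10.0.0 StreamsConfig API; the application id and broker
address are placeholders, not our actual values):

    import java.util.Properties;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.processor.WallclockTimestampExtractor;

    Properties props = new Properties();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");    // placeholder
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
    // Use wall-clock (processing) time, so records that carry no usable
    // timestamp, e.g. from a 0.9.0.1 producer, can still be processed.
    props.put(StreamsConfig.TIMESTAMP_EXTRACTOR_CLASS_CONFIG,
        WallclockTimestampExtractor.class.getName());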
Thanks,
Vivek
> On Jul 8, 2016, at 1:25 AM, Michael Noll wrote:
>
> Vivek,
>
> in this case you should manually embed a timestamp within the payload of
> the produced messages (e.g. as a Long field in an Avro-encoded message
> value). This would need to be done by the producer.
@Ismael: thanks for the clarification. I did not understand the question correctly...
@Vivek: You can also go with processing-time (or ingestion-time)
semantics if you cannot embed a timestamp in the data itself.
See http://docs.confluent.io/3.0.0/streams/concepts.html#time for more
details.
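If you go the ingestion-time route, the timestamp is assigned broker-side
rather than in your Streams app. If I remember correctly, the topic-level
setting (introduced with KIP-32 in 0.10) is:

    message.timestamp.type=LogAppendTime

With this, the 0.10 broker stamps each message when it is appended to the
log, so even messages from a 0.9.0.1 producer get a valid timestamp.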
-Matthias
Vivek,
in this case you should manually embed a timestamp within the payload of
the produced messages (e.g. as a Long field in an Avro-encoded message
value). This would need to be done by the producer.
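For illustration, the producer side could look roughly like this (a sketch
only: the schema, topic name, and "eventTime" field are made-up examples,
and it assumes Confluent's Avro serializer plus a schema registry):

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class TimestampedEventProducer {
        public static void main(String[] args) {
            // Avro schema with an explicit event-time field (names made up).
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
                + "{\"name\":\"eventTime\",\"type\":\"long\"},"
                + "{\"name\":\"payload\",\"type\":\"string\"}]}");

            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // placeholder
            props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "http://localhost:8081"); // placeholder

            Producer<String, GenericRecord> producer = new KafkaProducer<>(props);

            GenericRecord event = new GenericData.Record(schema);
            event.put("eventTime", System.currentTimeMillis()); // embedded timestamp
            event.put("payload", "some data");

            producer.send(new ProducerRecord<String, GenericRecord>("events", event));
            producer.close();
        }
    }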
Then, in Kafka Streams, you'd need to implement a custom
TimestampExtractor that can retrieve this embedded timestamp from the
message payload.
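Roughly like this, against the 0.10.0.0 TimestampExtractor interface (the
"eventTime" field name is just an example, matching the producer sketch
above):

    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.streams.processor.TimestampExtractor;

    public class PayloadTimestampExtractor implements TimestampExtractor {
        @Override
        public long extract(ConsumerRecord<Object, Object> record) {
            Object value = record.value();
            if (value instanceof GenericRecord) {
                // Read the embedded event time from the Avro value.
                Object eventTime = ((GenericRecord) value).get("eventTime");
                if (eventTime instanceof Long) {
                    return (Long) eventTime;
                }
            }
            // No embedded timestamp found; fall back to wall-clock time.
            return System.currentTimeMillis();
        }
    }

You would then register it in your Streams configuration via
StreamsConfig.TIMESTAMP_EXTRACTOR_CLASS_CONFIG.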
That's right, Ismael. I am looking for workarounds, either on the 0.9.0.1
producer side or on the Kafka Streams side, so that I can process messages
produced by a 0.9.0.1 producer using the Kafka Streams library.
Thanks,
Vivek
On Thu, Jul 7, 2016 at 9:05 AM, Ismael Juma wrote:
> Hi,
>
> Matthias, I think Vivek's question is not whether Kafka Streams can be
> used with a Kafka 0.9 broker (which it cannot).
Hi,
Matthias, I think Vivek's question is not whether Kafka Streams can be used
with a Kafka 0.9 broker (which it cannot). The question is whether Kafka
Streams can process messages produced with a 0.9.0.1 producer into a
0.10.0.0 broker. Is that right? If so, would a custom TimestampExtractor
work?
Hi Vivek,
Kafka Streams works only with Kafka 0.10 (but not with 0.9).
I am not aware of any workaround to allow for 0.9 usage.
-Matthias
On 07/07/2016 05:37 AM, vivek thakre wrote:
> Can the Kafka Streams library work with messages produced by a 0.9.0.1
> producer?
> I guess not, since the old producer would not add a timestamp. (I am
> getting an invalid timestamp exception.)
Can the Kafka Streams library work with messages produced by a 0.9.0.1
producer?
I guess not, since the old producer would not add a timestamp. (I am
getting an invalid timestamp exception.)
As I cannot change our producer application setup, I have to use the
0.9.0.1 producer.
Is there a workaround that I can use?