Re: Question about Kafka Streams error message when a message is larger than the maximum size the server will accept

2018-03-28 Thread Guozhang Wang
Yes, that is correlated, thanks for the reminder. I've updated the JIRA to reflect your observations as well.

Guozhang

Re: Question about Kafka Streams error message when a message is larger than the maximum size the server will accept

2018-03-28 Thread Mihaela Stoycheva
Hello Guozhang,

Thank you for the answer, that could explain what is happening. Is it possible that this is related in some way to https://issues.apache.org/jira/browse/KAFKA-6538?

Mihaela

Re: Question about Kafka Streams error message when a message is larger than the maximum size the server will accept

2018-03-27 Thread Guozhang Wang
Hello Mihaela,

It is possible that when you have caching enabled, the value of the record has already been serialized before being sent to the changelogger, while the key has not. Admittedly, that is not very friendly for troubleshooting the related log4j entries.

Guozhang
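As added context: a minimal sketch, assuming the 1.x StreamsConfig API, of how the record cache can be taken out of the picture while troubleshooting. Only cache.max.bytes.buffering is the setting being illustrated; the application id and bootstrap servers are placeholders. Disabling the cache also removes its batching and deduplication, so this is a debugging aid rather than a fix.

import java.util.Properties;

import org.apache.kafka.streams.StreamsConfig;

public class DisableCacheSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");     // illustrative
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // illustrative
        // A cache size of 0 disables record caching, so aggregation results
        // are forwarded downstream and written to the changelog on every
        // update instead of passing through the cache first.
        props.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
    }
}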

Question about Kafka Streams error message when a message is larger than the maximum size the server will accept

2018-03-27 Thread Mihaela Stoycheva
Hello,

I have a Kafka Streams application that is consuming from two topics and internally aggregating, transforming, and joining data. One of the aggregation steps is adding an id to an ArrayList of ids. Naturally, since there was a lot of data, the changelog message became too big and was not sent.
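For context, a minimal sketch of the aggregation pattern described above, assuming the 1.x Kafka Streams DSL. The topic name, application id, store name, and size value are illustrative, not taken from the thread, and a comma-separated String stands in for the ArrayList of ids so the built-in serdes suffice. It also shows one hedged way to raise the changelog producer's request size limit via the producer prefix; the broker or topic side (message.max.bytes) must allow larger records as well.

import java.util.Properties;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

public class IdAggregationSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "id-aggregation-sketch");  // illustrative
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");      // illustrative
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        // If the serialized aggregate outgrows the producer's max.request.size,
        // the changelog send fails; the limit can be raised on Streams' internal
        // producer (the broker/topic message.max.bytes must allow it as well).
        props.put(StreamsConfig.producerPrefix(ProducerConfig.MAX_REQUEST_SIZE_CONFIG), 5242880);

        StreamsBuilder builder = new StreamsBuilder();

        // Single illustrative input topic; the real application consumes two
        // topics and joins them, which is omitted here for brevity.
        KStream<String, String> events = builder.stream("events");

        // Each record appends its id to the per-key aggregate, so the value
        // written to the changelog grows with the number of records per key.
        // (The application in the thread collects ids in an ArrayList; a
        // comma-separated String is used here so the built-in serdes suffice.)
        events.groupByKey()
              .aggregate(
                  () -> "",
                  (key, id, agg) -> agg.isEmpty() ? id : agg + "," + id,
                  Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("ids-store")
                      .withKeySerde(Serdes.String())
                      .withValueSerde(Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}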