Setting max.request.size worked when I did it in producer code; I don't
understand what producer.properties is for, as it seems it is not used.
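A likely explanation, offered here as an assumption about the setup in this thread: the config/producer.properties file that ships with Kafka is not loaded automatically by your own producer application, and the console producer only reads it when pointed at it explicitly. A sketch (paths and sizes are examples):

```shell
# producer.properties is only read when passed explicitly:
kafka-console-producer.sh --broker-list 10.10.105.24:6667 --topic test \
  --producer.config config/producer.properties < ./large-file

# or set the property directly on the command line:
kafka-console-producer.sh --broker-list 10.10.105.24:6667 --topic test \
  --producer-property max.request.size=16777300 < ./large-file
```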
On Thu, Aug 15, 2019 at 2:25 PM Jonathan Santilli <jonathansanti...@gmail.com> wrote:
> I have asked because I did not see that in your previous email when you
> tried the console producer.
I asked because I did not see it in your previous email when you tried the
console producer.
Jonathan.
On Thu, Aug 15, 2019, 3:07 PM l vic wrote:
> yes, in producer.properties
yes, in producer.properties
On Thu, Aug 15, 2019 at 9:59 AM Jonathan Santilli <jonathansanti...@gmail.com> wrote:
> Just to be sure, please confirm the configuration parameter is well
> set/configured at the producer level:
>
> max.request.size = 12390 (for instance)
>
> Cheers!
> --
> Jonathan
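One quick way to sanity-check this, sketched in Python; the file contents and the 16M payload size are taken from this thread, and the simple line-by-line parser is an illustration, not a full Java-properties parser:

```python
# Sketch: verify that max.request.size in a Java-style .properties file
# exceeds the intended payload size. The example contents are hypothetical.

def parse_properties(text):
    """Parse simple key=value lines, skipping comments and blank lines."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

example = """
# producer-level settings
bootstrap.servers=10.10.105.24:6667
max.request.size=16777300
"""

props = parse_properties(example)
payload_bytes = 16 * 1024 * 1024  # the ~16M file from this thread
assert int(props["max.request.size"]) > payload_bytes
```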
I had a similar problem before. I could find no way for the producer to
determine the smallest maximum message size allowed across the brokers.
We had an issue where we had very large items that we wanted to send through
Kafka, basically to tunnel through a firewall. We utilized an open source
project
I tested it with kafka-console-consumer and kafka-console-producer, reading
from a 16M text file (no newlines):

kafka-console-producer.sh --broker-list 10.10.105.24:6667 --topic test <
./large-file

The error comes out on the producer side:

org.apache.kafka.common.errors.RecordTooLargeException: The message is
16777239 bytes when serialized which is larger than the maximum request size
you have configured with the max.request.size configuration.
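For context: the producer's max.request.size is only the first gate a 16M record has to clear. The broker enforces message.max.bytes (overridable per topic as max.message.bytes), replication is bounded by replica.fetch.max.bytes, and consumers by max.partition.fetch.bytes; the defaults are all around 1 MB. A hedged sketch of raising the topic and consumer limits (host/port and topic name taken from this thread, the size is an example, and the --zookeeper form matches the Kafka 1.0 tooling):

```shell
# Per-topic override of the broker-side size limit:
kafka-configs.sh --zookeeper localhost:2181 --alter \
  --entity-type topics --entity-name test \
  --add-config max.message.bytes=16777300

# Consumer side, e.g. for the console consumer:
kafka-console-consumer.sh --bootstrap-server 10.10.105.24:6667 --topic test \
  --consumer-property max.partition.fetch.bytes=16777300 --from-beginning
```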
Yes, it's still there
On Thu, Aug 15, 2019 at 4:49 AM Jonathan Santilli <jonathansanti...@gmail.com> wrote:
> Hello, try to send and flush just one message of 16777239 bytes, to verify
> the error still shows up.
>
> Cheers!
> --
> Jonathan
Please forgive this question if it turns out to be trivial but I am new to
Kafka and am evaluating it for use as the backbone of our new payments
application. I envisage a 'payment' topic being fed by a producer for each
payment channel, each payment being represented in JSON with an attribute
Hello, try to send and flush just one message of 16777239 bytes, to verify
the error still shows up.
Cheers!
--
Jonathan
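The specific figure Jonathan quotes is presumably the serialized record size from the producer error. Assuming the test file is exactly 16 MiB, the arithmetic works out to 23 bytes of record overhead on top of the payload, and the total is far above the producer's default max.request.size of 1048576 (1 MiB):

```python
# The 16M payload plus record overhead, vs. the default producer limit.
payload = 16 * 1024 * 1024          # 16 MiB file from the thread
serialized = 16777239               # size reported in the thread
default_max_request_size = 1048576  # Kafka default for max.request.size (1 MiB)

overhead = serialized - payload
print(overhead)                      # 23 bytes of record/header overhead
assert serialized > default_max_request_size
```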
On Thu, Aug 15, 2019 at 2:23 AM l vic wrote:
> My Kafka (1.0.0) producer errors out on large (16M) messages.
> ERROR Error when sending message to topic test with key: null, value:
> 16777239 bytes with error:
> (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
Thanks for the reply. Actually my Lambda consumers consume batched messages
from a Kinesis stream in AWS, process them, and send the results to Kafka.
Even with 'reserve concurrency' AWS will frequently stop and
re-initiate/-invoke the function for different batches - resulting in
Even if it is not a memory leak, it is not good practice. You can put the
messages on SQS and have a Lambda function with reserved concurrency listening
to the SQS queue to put them on Kafka.
> On 15.08.2019 at 08:52, Tianning Zhang wrote:
> > Dear all,
> > I am using Amazon AWS Lambda
Dear all,
I am using Amazon AWS Lambda functions to produce messages to a Kafka cluster.
As I cannot control how frequently a Lambda function is initiated/invoked, and
I cannot share objects between invocations, I have to create a new Kafka
producer for each invocation and clean it up afterwards.
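One mitigation worth noting: Lambda freezes and reuses the execution environment between invocations ("warm starts"), so an object created at module scope survives across calls. A minimal sketch of reusing one producer across warm invocations; make_producer/factory is a hypothetical stand-in for creating the real Kafka client, and the factory parameter on the handler exists only to make the sketch testable without a broker:

```python
# Sketch: create the producer once at module scope and reuse it on
# warm Lambda invocations; flush (not close) at the end of each call.

_producer = None  # module scope: survives warm starts


def get_producer(factory):
    """Create the producer on the first (cold) call, reuse it afterwards."""
    global _producer
    if _producer is None:
        _producer = factory()
    return _producer


def handler(event, context, factory):
    producer = get_producer(factory)
    for record in event.get("records", []):
        producer.send("results-topic", record)  # topic name is an example
    producer.flush()  # flush per invocation; do not close the producer
    return {"sent": len(event.get("records", []))}
```

With a real client the factory would be something like `lambda: KafkaProducer(bootstrap_servers=...)`; the point of the pattern is that the expensive construction happens only on cold starts.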