Hello

Can you share a bit more detail about the ConsumeKafka
processor and its configuration?  What settings do you have?  Can
you also describe the input data and how you're
determining that there is loss?

Thanks
Joe

On Wed, Jan 25, 2017 at 2:47 PM, Samra Kasim
<[email protected]> wrote:
> Hi,
>
> I am new to NiFi and I am reading off a Kafka topic that has 3 partitions.
> In my NiFi flow, I have 3 ConsumeKafka processors with the same Group ID and
> topic. However, when I push large datasets (e.g., 200,000+ records), 300-400
> records don't make it to the next processor. This only happens when I have
> Concurrent Tasks in the Scheduling tab set to more than 1 (e.g., 2 or 3). If
> I have Concurrent Tasks set to 1, then all the records make it through to
> the next processor just fine.
>
> I may need to define kafka.partitions so that each NiFi processor points to a
> specific Kafka partition, but I am not sure where/how to do that. I tried
> adding it to the processor's properties, but that doesn't work. Has anyone
> else worked through this issue?
>
> I am using NiFi 1.1.1 and Kafka 0.9.
>
> --
> Sam
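For reference, Kafka's default range assignor gives each partition to exactly one consumer in a group, so 3 processors with 2 concurrent tasks each means 6 consumers competing for 3 partitions and 3 of them sitting idle. Below is a minimal, self-contained sketch of that assignment logic (the assignor is simplified and the processor/task names are hypothetical, purely for illustration):

```python
# Simplified model of Kafka's range partition assignment:
# sorted consumers each take a contiguous slice of the sorted partitions.
def range_assign(partitions, consumers):
    consumers = sorted(consumers)
    base, extra = divmod(len(partitions), len(consumers))
    assignment, start = {}, 0
    for i, consumer in enumerate(consumers):
        # The first `extra` consumers get one additional partition.
        count = base + (1 if i < extra else 0)
        assignment[consumer] = partitions[start:start + count]
        start += count
    return assignment

# Hypothetical names: 3 ConsumeKafka processors x 2 concurrent tasks
# = 6 consumers in the same group, but only 3 partitions to hand out.
consumers = [f"proc{p}-task{t}" for p in range(3) for t in range(2)]
assignment = range_assign(list(range(3)), consumers)
for consumer, parts in assignment.items():
    print(consumer, parts)
```

Each partition lands on exactly one consumer, so no records should be lost just by having extra idle consumers; this sketch only illustrates why raising Concurrent Tasks beyond the partition count adds consumers that receive nothing, and why frequent group rebalances (a known rough edge with the 0.9 consumer) are worth investigating.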
