Are you reusing objects in the input operator? If so, try creating a new
object for each tuple.
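
A quick illustration of what goes wrong (plain Java, no Apex APIs; the class and field names here are made up for the demo). When a single mutable object is emitted repeatedly, anything that buffers tuples by reference ends up with many references to the same object, so every buffered "tuple" shows whatever value was written last:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the object-reuse pitfall: a downstream buffer
// that holds tuples by reference sees only the last value written
// when the upstream reuses one object for every emit.
public class TupleReuseDemo {

    // A mutable "tuple" carrying a single record id.
    static class Record {
        int id;
    }

    public static void main(String[] args) {
        List<Record> buffer = new ArrayList<>();

        // Anti-pattern: one shared Record, mutated before each emit.
        Record shared = new Record();
        for (int i = 1; i <= 3; i++) {
            shared.id = i;
            buffer.add(shared);   // every element is the SAME object
        }
        // All three entries now show the last value written.
        System.out.println(buffer.get(0).id + " "
                + buffer.get(1).id + " " + buffer.get(2).id); // 3 3 3

        // Fix: allocate a fresh object for each tuple.
        buffer.clear();
        for (int i = 1; i <= 3; i++) {
            Record fresh = new Record();
            fresh.id = i;
            buffer.add(fresh);    // each element is distinct
        }
        System.out.println(buffer.get(0).id + " "
                + buffer.get(1).id + " " + buffer.get(2).id); // 1 2 3
    }
}
```

This also explains both symptoms reported below: the buffered window holds N references to one object, so the receiver sees one record value repeated N times while the other values are lost.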

Ram

On Tue, Dec 20, 2016 at 5:38 AM, Doyle, Austin O. <
[email protected]> wrote:

> Also, as a follow-up: it's not just repetition of some of the data points,
> it's also dropping some of them.  When I try a 100-record file, for
> example (1–100), it only sends record 100 a hundred times instead of
> records 1 through 100.
>
>
>
> *From:* Doyle, Austin O. [mailto:[email protected]]
> *Sent:* Tuesday, December 20, 2016 8:14 AM
> *To:* [email protected]
> *Subject:* RE: Data duplication between operators
>
>
>
> The downstream operator doesn’t seem to be failing and through some local
> tests I can confirm that each operator works separately.  Could it be
> something else?
>
>
>
> -Austin
>
>
>
> *From:* Vlad Rozov [mailto:[email protected]]
> *Sent:* Monday, December 19, 2016 6:40 PM
> *To:* [email protected]
> *Subject:* Re: Data duplication between operators
>
>
>
> This would be a bug unless the downstream operator constantly fails and is
> restored to a checkpoint, in which case it is expected that it may receive
> the same tuple multiple times.
>
> Thank you,
>
> Vlad
>
> On 12/19/16 11:33, Doyle, Austin O. wrote:
>
> I’m trying to send some sequential data between an S3 Input Operator and a
> CSV Parser operator.  I added logging to see what the outputPort is
> emitting, and it looks correct (data points 1 to 1000).  I then added
> logging on the input of the CSV Parser, which receives 1000 data points,
> but not the correct ones.  It actually receives random data points
> multiple times (e.g., point 57 twenty or so times).  Has anyone seen
> anything like this?
>
>
>
> Thanks,
>
> Austin
>
>
>