Hello,

No, I am not. Also, I am using direct grouping to send tuples between the
spout and the bolts.
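
In case it helps, here is a rough sketch of what that wiring looks like with
the low-level API. The class names (SpoutA, BoltB, BoltC) are placeholders
rather than my actual code, and I am assuming a 0.9.x/0.10.x release where the
classes live under backtype.storm:

    import backtype.storm.topology.TopologyBuilder;

    TopologyBuilder builder = new TopologyBuilder();
    builder.setSpout("spout-a", new SpoutA(), 1);
    // Direct grouping: the emitting component chooses the exact task
    // that receives each tuple.
    builder.setBolt("bolt-b", new BoltB(), 1).directGrouping("spout-a");
    builder.setBolt("bolt-c", new BoltC(), 1).directGrouping("bolt-b");

    // Inside SpoutA the output stream is declared as a direct stream and
    // each tuple is emitted to an explicit task id:
    //   declarer.declare(true, new Fields("type", "payload"));
    //   collector.emitDirect(targetTaskId, new Values("DATA", payload));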

Nikos

2015-07-19 14:40 GMT-04:00 Niels Basjes <[email protected]>:

> Do you use Trident or the lower-level API?
>
> Niels
>
> On Sun, Jul 19, 2015 at 7:40 PM, Nick R. Katsipoulakis
> <[email protected]> wrote:
> > Hello all,
> >
> > I have a topology in which a Spout (A) emits tuples to a Bolt (B) and in
> > turn, B emits tuples to a Bolt (C).
> >
> > In order to perform some measurements in my topology, I have Spout A send
> > two types of tuples: normal data tuples and latency-measure tuples.
> >
> > After sending a user-defined number of data tuples, A starts a sequence of
> > latency-measure tuples, spaced 1 second apart. That is, after sending the
> > first latency-measure tuple, it keeps sending data tuples until 1 second
> > has passed, and then sends the next latency-measure tuple. So, the input
> > stream of B would look something like the following:
> >
> > DDDDD(L1)DDD--for 1 second--DDD(L2)DDDD....
> >
> > The strange thing I see in Bolt B is that the time difference between the
> > arrival times of L1 and L2 is not >= 1 second, which is the time gap that
> > I expect to see.
> >
> > Why is the above happening? Does Storm do some kind of micro-batching, so
> > that the two tuples L1 and L2 appear in B with a time difference of less
> > than 1 second?
> >
> > Thanks,
> > Nikos
> >
>
>
>
> --
> Best regards / Met vriendelijke groeten,
>
> Niels Basjes
>
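
For reference, the spout-side emission scheme described above boils down to
something like the sketch below. The field and method names (targetTaskId,
nextPayload, etc.) are illustrative stand-ins, not the actual code:

    private SpoutOutputCollector collector;  // set in open()
    private int targetTaskId;                // task id of Bolt B, resolved in open()
    private long lastLatencyEmit = 0L;

    @Override
    public void nextTuple() {
        long now = System.currentTimeMillis();
        if (now - lastLatencyEmit >= 1000) {
            // Latency-measure tuple carries its emit timestamp.
            collector.emitDirect(targetTaskId, new Values("LATENCY", now));
            lastLatencyEmit = now;
        } else {
            // nextPayload() is a stand-in for however the data tuples are built.
            collector.emitDirect(targetTaskId, new Values("DATA", nextPayload()));
        }
    }

Bolt B records System.currentTimeMillis() when each LATENCY tuple arrives and
compares consecutive arrivals; the puzzle is why those arrival gaps come out
smaller than the 1-second emit gap.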



-- 
Nikolaos Romanos Katsipoulakis,
University of Pittsburgh, PhD candidate
