Bumped up the number of ackers to 100, which made a *huge* difference--from 4.3-4.4 million to 6.6 million tuples acked/minute! The capacity of my acker executors was down around 0.15, so I didn't think I needed to increase from 10 to 100, but wowsers, that one change made a major impact.
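For anyone following along, a minimal sketch of the acker change, assuming the 0.9.x-era backtype.storm packages; the topology name and builder contents are placeholders for your own spouts and bolts:

    import backtype.storm.Config;
    import backtype.storm.StormSubmitter;
    import backtype.storm.topology.TopologyBuilder;

    public class AckerTuning {
        public static void main(String[] args) throws Exception {
            TopologyBuilder builder = new TopologyBuilder();
            // ... declare your spouts and bolts here ...

            Config conf = new Config();
            conf.setNumWorkers(100);
            // Raise acker parallelism from 10 to 100 so every worker
            // has a local acker executor (TOPOLOGY_ACKER_EXECUTORS).
            conf.setNumAckers(100);

            StormSubmitter.submitTopology("my-topology", conf, builder.createTopology());
        }
    }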
Thanks again to Kobi!

--John

On Mon, Aug 17, 2015 at 12:59 PM, John Yost <[email protected]> wrote:
> Hi Kobi,
>
> Cool, thanks for getting back to me so quickly! I did confirm that
> there's one instance of Bolt A (sender, 400 executors) and Bolt B
> (receiver, 100 executors) on each worker (100 workers in the topology), so we
> should be good with local shuffling working.
>
> I only have 10 ackers, so I'll bump that up to 100 and see how that works.
>
> Thanks
>
> --John
>
> On Mon, Aug 17, 2015 at 12:26 PM, Kobi Salant <[email protected]> wrote:
>
>> Hi John,
>>
>> You should make sure you have at least one instance of each bolt on each
>> worker so local shuffling will work. Also, the number of ackers should
>> match the number of workers.
>>
>> Did you check the capacity of the bolts and ackers?
>>
>> Kobi
>>
>> On Mon, Aug 17, 2015 at 7:22 PM, John Yost <[email protected]> wrote:
>>
>>> Hi Everyone,
>>>
>>> I updated my topology to use localOrShuffleGrouping for a Bolt that
>>> generates and emits 15-20 tuples for each incoming tuple. My
>>> throughput went from 1 million tuples acked/minute to 4.5 million, which is
>>> great, but I need to get to 7-8 million tuples acked/minute.
>>>
>>> Question--are there any config parameters to use specifically with
>>> localOrShuffleGrouping? Please confirm, thanks!
>>>
>>> --John
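For reference, the wiring discussed above looks roughly like the sketch below. The spout and the BoltA/BoltB classes are placeholders standing in for the actual components, and the package names again assume a 0.9.x-era Storm:

    import backtype.storm.Config;
    import backtype.storm.StormSubmitter;
    import backtype.storm.topology.TopologyBuilder;

    public class LocalShuffleTopology {
        public static void main(String[] args) throws Exception {
            TopologyBuilder builder = new TopologyBuilder();

            builder.setSpout("spout", new MySpout(), 100);

            // Bolt A (sender): 400 executors, 4 per worker across 100 workers
            builder.setBolt("boltA", new BoltA(), 400)
                   .shuffleGrouping("spout");

            // Bolt B (receiver): 100 executors, one per worker, so
            // localOrShuffleGrouping can keep the A -> B transfer
            // inside the same worker JVM whenever possible
            builder.setBolt("boltB", new BoltB(), 100)
                   .localOrShuffleGrouping("boltA");

            Config conf = new Config();
            conf.setNumWorkers(100);
            conf.setNumAckers(100);

            StormSubmitter.submitTopology("local-shuffle-example", conf, builder.createTopology());
        }
    }

With one Bolt B executor per worker, every Bolt A executor always has a local downstream target, which is what makes localOrShuffleGrouping pay off here.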
