On Tue, Feb 23, 2010 at 11:30 AM, andy e <[email protected]> wrote:
> Thanks for the feedback. Interesting point about combining pub/hub
> into a single entity, hadn't thought about that. From the subscriber's
> POV, it's basically all the same and that might increase throughput
> (if we wanted to go that route). Thanks!

Yep! This is what MySpace does, for example.

> On Mon, Feb 22, 2010 at 10:50 PM, Brett Slatkin <[email protected]> wrote:
>> I've tested a feed through the reference hub at around 1 ping per
>> second (with N updates in the feed per ping) and it works well. I
>> think anything beyond 1/sec per feed is more difficult to achieve
>> because you're limited by the spooling nature of a hub that tries to
>> guarantee delivery (by writing things to disk). But the good news is
>> that each feed fetch can contain multiple entries.
>>
>> This changes if the publisher and hub are combined into a single
>> entity. Then you can push updates basically as fast as you can
>> generate them in your system, instead of having to rely on additional
>> feed fetching and deduping.
>>
>> On Mon, Feb 22, 2010 at 9:41 PM, andy e <[email protected]> wrote:
>>> What's the most volume anyone has seen on a PuSH-enabled topic? Any
>>> examples?
>>>
>>> i.e. a Gawker may only see 10-20 new posts a day (just a guess) and
>>> probably lots of volume from hub->subscribers in a case like that.
>>> What about strictly from producer to hub? Is there any "upper limit"
>>> that should be avoided?
>>>
>>> I ask in the case of a small feed that we want to create, but one
>>> where updates happen constantly. Each item in the topic would be
>>> updated every 5-10 secs, but the topic may only contain 20 or so
>>> entries/items.
>>>
>>> Just kind of curious how this is handled and/or avoided.
>>>
>>> andy
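For anyone following along: the "1 ping per second with N updates per ping" pattern above can be sketched in a few lines. The publish ping itself is just the spec's form-encoded POST of hub.mode=publish and hub.url to the hub; the batching class is my own illustration (and the hub/topic URLs are placeholders, not real endpoints) showing how many rapid entry updates collapse into one ping, and therefore one feed fetch by the hub:

```python
# Sketch of a PuSH publisher that batches entry updates and pings the
# hub at most once per second. HUB_URL and TOPIC_URL are hypothetical.
import time
import urllib.parse
import urllib.request

HUB_URL = "https://example.com/hub"          # assumed hub endpoint
TOPIC_URL = "https://example.com/feed.atom"  # assumed topic (feed) URL


def ping_hub(hub_url, topic_url):
    """Send a publish ping: a form-encoded POST with hub.mode=publish
    and hub.url=<topic>. The hub then fetches the feed, which may
    contain many new entries for this single ping."""
    data = urllib.parse.urlencode(
        {"hub.mode": "publish", "hub.url": topic_url}
    ).encode()
    req = urllib.request.Request(hub_url, data=data)
    with urllib.request.urlopen(req) as resp:
        return resp.status


class BatchingPublisher:
    """Collect entry updates and allow at most one ping per interval,
    so constant updates don't exceed ~1 ping/sec to the hub."""

    def __init__(self, interval=1.0):
        self.interval = interval
        self.pending = 0       # updates accumulated since last ping
        self.last_ping = 0.0

    def update(self, now=None):
        """Record one entry update; return True when it's time to ping
        (caller would then invoke ping_hub). Batched updates simply
        ride along in the next feed fetch."""
        now = time.monotonic() if now is None else now
        self.pending += 1
        if now - self.last_ping >= self.interval:
            self.last_ping = now
            self.pending = 0
            return True
        return False
```

With updates arriving every 5-10 seconds per item, this batching rarely even kicks in; its point is the upper bound, since beyond ~1 ping/sec the spooling hub becomes the limit anyway.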
