To me it sounds like a setup that you could handle with a centralised data service.

I would be tempted to build some sort of Twitter proxy service which
all your requests go through. You could mimic the Twitter API, but
your version could include a caching mechanism that would serve all of
your other sites. If you mimic the API correctly you won't have to
change any of your existing codebase except the hostname that you are
making the call to.
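A minimal sketch of that caching layer in Python (hypothetical names throughout; `fetch` stands in for whatever function actually hits api.twitter.com, and the 60-second TTL is just an example):

```python
import time

class CachingTwitterProxy:
    """Sketch of a proxy that mirrors Twitter API request paths but
    caches responses, so repeated requests for the same resource
    don't count against the hourly rate limit."""

    def __init__(self, fetch, ttl=60):
        self.fetch = fetch    # the real upstream call, e.g. via urllib
        self.ttl = ttl        # seconds a cached response stays fresh
        self._cache = {}      # request path -> (expires_at, response)

    def get(self, path):
        now = time.time()
        cached = self._cache.get(path)
        if cached and cached[0] > now:
            return cached[1]  # still fresh: no upstream API call made
        response = self.fetch(path)
        self._cache[path] = (now + self.ttl, response)
        return response
```

Because the proxy mimics the upstream paths, the existing sites would only need to point at its hostname, as described above.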

However, if you think the widget might work then maybe run a few tests
that way; it is by far the simplest option and I have found it to work
great for smaller client projects. Twitter support are always good at
ironing out any vagueness in the API restrictions, so you might be
best filing a support ticket.


On Jan 13, 4:45 pm, Jason King <> wrote:
> Hi Josh,
> Thank you for getting back to me.
> I am currently at the stage of evaluating the best approach to use.
> My level of transactions per second (tps) and data staleness
> requirements may well push me over the whitelisted call rate at times
> (20,000 per hour, roughly 5.5 tps), even with caching. Some pages are
> much more popular than others, so caching will help me to some extent.
> I could build a server side service / caching layer and limit the
> number of calls to 20000 per hour.
> I have separate instances of my website in Europe and North America
> as well, so the single-connection-per-account limit rules out the
> streaming API, and it also makes it hard to know the global number of
> API calls my website is making at a given time.
> I like the idea of deploying the Twitter widget, as the calls then
> come from the users' browsers and no limit appears to apply. But with
> this solution there would be no caching at all.
> I want to be a good citizen, and am happy to eliminate the Twitter
> widget if I have to, but it is quite an attractive approach for me.
> Thanks
> Jason
> On Jan 13, 2:11 pm, joshnesbitt <> wrote:
> > Hi Jason,
> > As far as I'm aware the widget does not actually get rate limited;
> > I'm sure I read somewhere that it is not included in the normal rate
> > limit (but don't quote me on that).
> > The best way is to use an effective caching technique. If you cache
> > the content every x minutes you should be covered if a certain page
> > gets hammered.
> > What is the context of your application, and have you considered
> > whitelisting if you really need to use the API for every request?
> > Regards,
> > Josh
> > On Jan 13, 9:40 am, Jason King <> wrote:
> > > Hi,
> > > I am investigating options for showing a Twitter widget on a website.
> > > I am considering using the Twitter-provided Profile Widget (http://
> > >
> > > I wanted to clarify what rate limiting policy (if any) applies to this
> > > widget?
> > > From reading the documentation, the widget does not make authenticated
> > > calls, and therefore a client IP limit would be applied. As the call
> > > to twitter is made via javascript from the client browser, the limit
> > > would then be placed on the client IP (150 calls per hour).
> > > The widget will show for various Twitter users who register with my
> > > site (i.e. they say what their Twitter name is and, on their public
> > > page, the widget will show their last couple of tweets). The
> > > render rate will be much higher than 150 per hour in total, but for an
> > > individual browser it should be much less.
> > > Finally, what happens when a large number of clients (browsers) are
> > > behind a proxy such as at a university or large company?
> > > Thank you
> > > Jason
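Jason's idea of capping the service at 20,000 calls per hour could be sketched as a simple token bucket (hypothetical names; the clock is injected so the behaviour can be tested without waiting a real hour):

```python
class HourlyRateLimiter:
    """Sketch of a token bucket capping calls at `limit_per_hour`.

    Tokens refill continuously at limit_per_hour / 3600 per second;
    a call is allowed only if at least one whole token is available."""

    def __init__(self, limit_per_hour, clock):
        self.rate = limit_per_hour / 3600.0   # tokens gained per second
        self.capacity = float(limit_per_hour) # burst ceiling: one hour's worth
        self.tokens = self.capacity
        self.clock = clock                    # injected for testability
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Top up the bucket for the time elapsed since the last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                          # over the hourly budget
```

The caching layer would consult this before making an upstream call, and serve a (possibly stale) cached copy when `allow()` returns False.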
