It looks like I have everything working now. The Perl library looks
OK, but I don't think the Search API documentation on the
rate-limiting page is right:

http://apiwiki.twitter.com/Rate-limiting

"An application that exceeds the rate limitations of the Search API
will receive HTTP 503 response codes to requests. It is a best
practice to watch for this error condition and honor the Retry-After
header that instructs the application when it is safe to continue. The
Retry-After header's value is the number of seconds your application
should wait before submitting another query (for example: Retry-After:
67)."

I didn't see any "Retry-After" header, either on a successful search
or on a search that failed on rate limits. What I did see was data
like

x-ratelimit-limit = 150
x-ratelimit-remaining = 0
x-ratelimit-reset = 1263101958

which the Perl library is interpreting correctly.
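
So the only usable backoff signal seems to be the reset time, along
these lines (another LWP sketch; the endpoint and screen name are
placeholders):

use strict;
use warnings;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
my $res = $ua->get('http://api.twitter.com/1/friends/ids.json?screen_name=example&cursor=-1');

# These headers show up whether or not the call succeeded.
my $remaining = $res->header('x-ratelimit-remaining');  # e.g. 0
my $reset     = $res->header('x-ratelimit-reset');      # Unix epoch, e.g. 1263101958

if (defined $remaining && $remaining == 0) {
    my $wait = $reset - time;   # seconds until the window resets
    sleep $wait if $wait > 0;
}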

And I haven't found any documentation on the "x-ratelimit" headers
anywhere - I only discovered them by setting breakpoints in the Perl
library under Komodo. ;-)

On Jan 9, 10:27 pm, "M. Edward (Ed) Borasky" <zzn...@gmail.com> wrote:
> On Jan 9, 9:59 pm, Mark McBride <mmcbr...@twitter.com> wrote:
>
> > If you can post complete HTTP conversations of both successful and
> > failed calls (any sensitive info elided) that would be great.  If the
> > Perl library is trying to transparently get the entire social graph
> > you'll definitely get rate limited.
>
> It looks like nothing comes back until it fails. I'm running this with
> Komodo and breakpoints, and on a successful call, the Perl library
> only returns the requested array of IDs, the next cursor and the
> previous cursor. I'm going to run this by the author of the Perl
> library. It doesn't look like he's trying to get more than a page at a
> time when you specify a cursor, but I don't think he's ever tested
> something this big, so he wouldn't have run into it. I can post the
> returned HTTP for a failed one, though. ;-)
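
For completeness, the loop I'm effectively running is the obvious
one: page with the cursor, and when a call fails, sleep until
x-ratelimit-reset and retry the same page. Sketched here with raw LWP
and JSON rather than the Perl library's own calls, assuming the usual
cursor semantics (-1 asks for the first page, next_cursor == 0 marks
the last); the screen name and 60-second fallback are placeholders:

use strict;
use warnings;
use LWP::UserAgent;
use JSON;

my $ua     = LWP::UserAgent->new;
my $cursor = -1;    # -1 requests the first page
my @ids;

while ($cursor) {   # the last page reports next_cursor == 0
    my $res = $ua->get(
        "http://api.twitter.com/1/friends/ids.json?screen_name=example&cursor=$cursor");

    unless ($res->is_success) {
        # Sleep until the rate-limit window resets, then retry this page.
        my $reset = $res->header('x-ratelimit-reset');
        my $wait  = $reset ? $reset - time : 60;    # 60s fallback is a guess
        sleep($wait > 0 ? $wait : 1);
        next;
    }

    my $page = decode_json($res->decoded_content);
    push @ids, @{ $page->{ids} };
    $cursor = $page->{next_cursor};
}

printf "%d ids fetched\n", scalar @ids;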
