[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-10 Thread Waldron Faulkner
Hey developers, any hints or tips on how I can get the Twitter API team to focus on this issue? It's hard to build a business on the Twitter API when a crucial feature like this just stops working and we get radio silence for days. Any suggestions on how I can help get the team's attention? On Sep 9, 10:10

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-09 Thread alexc
This issue still pops up: http://twitter.com/friends/ids/downingstreet.xml?page=3

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-07 Thread freefall
Flat file generation and maintenance would be foolish at this stage. Separating out the individual data sets purely for the API, served by different clusters with server-side caching, may fit the bill - but to be honest, if this isn't happening already I'll be shocked. On Sep 7, 5:40 am, Jesse Stay

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-07 Thread John Kalucki
This describes what I'd call row-based pagination. Cursor-based pagination does not suffer from the same jitter issues. A cursor-based approach returns an opaque value that is unique within the total set, ordered, and indexed for constant-time access. Removals in pages before or after do not affect
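
A minimal sketch of the cursor walk John describes, assuming a social graph method that accepts a cursor parameter (-1 for the first block) and returns a next_cursor value (0 once the set is exhausted); the endpoint name and response fields follow that assumption rather than any confirmed spec:

    import json
    import urllib.request

    def fetch_all_follower_ids(screen_name):
        """Walk the full follower set with an opaque cursor instead of page numbers.

        Assumes the endpoint accepts ?cursor= (-1 for the first block) and returns
        next_cursor, which is 0 once the whole set has been returned.
        """
        ids, cursor = [], -1
        while cursor != 0:
            url = ("http://twitter.com/followers/ids.json"
                   f"?screen_name={screen_name}&cursor={cursor}")
            with urllib.request.urlopen(url) as resp:
                block = json.load(resp)
            ids.extend(block["ids"])
            cursor = block["next_cursor"]  # opaque position; removals elsewhere don't shift it
        return ids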

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-07 Thread Waldron Faulkner
I could really go for jittery right now... instead I'm getting totally broken! I'm getting two pages of results using ?page=x, then empty. To me, it looks like all my accounts are capped at 10K followers. I'd love some kind of official response from Twitter on the status of paging (John?). Example:
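
For reference, a page-based loop like the sketch below is roughly what breaks: with ?page=x the third request comes back empty, so every account appears to stop at two pages' worth of IDs. The endpoint and response shape here are assumptions for illustration:

    import json
    import urllib.request

    def fetch_follower_ids_paged(screen_name):
        """Row-based paging: request page 1, 2, 3, ... until a page comes back empty.

        This is the pattern that is currently stopping after two pages (2 x 5000 IDs),
        which is why every account looks as if it tops out around 10K followers.
        """
        ids, page = [], 1
        while True:
            url = ("http://twitter.com/followers/ids.json"
                   f"?screen_name={screen_name}&page={page}")
            with urllib.request.urlopen(url) as resp:
                block = json.load(resp)   # assumed to be a plain JSON array of IDs
            if not block:                 # empty page: treat as the end of the list
                return ids
            ids.extend(block)
            page += 1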

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-06 Thread Dewald Pretorius
I meant to type LIMIT 100, 5000.

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-06 Thread Jesse Stay
Agreed. Is there a chance Twitter can return the full results in a compressed (gzip or similar) format to reduce load, leaving the burden of decompressing on our end and reducing bandwidth? I'm sure there are other areas where this could apply as well. I think you'll find compressing the full social
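
A client-side sketch of what the compressed transfer could look like, assuming the API would honour a standard Accept-Encoding: gzip header on these methods (it may not today):

    import gzip
    import json
    import urllib.request

    def fetch_ids_gzipped(url):
        """Request a gzip-compressed body and decompress it locally.

        The server still assembles the full result, but the bytes on the wire shrink
        dramatically for a long list of numeric IDs; the CPU cost of decompression
        moves to the client.
        """
        req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(req) as resp:
            body = resp.read()
            if resp.headers.get("Content-Encoding") == "gzip":
                body = gzip.decompress(body)
        return json.loads(body)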

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-06 Thread Dewald Pretorius
If I worked for Twitter, here's what I would have done. I would have grabbed the follower ID lists of the large accounts (those that usually kicked back 502s) and written them to flat files once every 5 or so minutes. When an API request comes in for that list, I'd just grab it from the flat
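
Purely to illustrate the shape of that idea, here is a sketch of a periodic flat-file refresh plus a read path that never touches the database; the cache location, file format, and fetch_ids callable are all hypothetical:

    import json
    import os

    CACHE_DIR = "/var/cache/follower_ids"   # hypothetical location for the flat files

    def refresh_flat_file(screen_name, fetch_ids):
        """Periodic job (run from a scheduler every ~5 minutes): pull a big
        account's follower IDs and dump them to disk."""
        path = os.path.join(CACHE_DIR, screen_name + ".json")
        tmp = path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(fetch_ids(screen_name), f)
        os.replace(tmp, path)                # atomic swap so readers never see a partial file

    def serve_follower_ids(screen_name):
        """Request path: serve straight from the flat file, never from the database."""
        with open(os.path.join(CACHE_DIR, screen_name + ".json")) as f:
            return json.load(f)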

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-06 Thread Jesse Stay
The other solution would be to send it to us in batch results, attaching a timestamp to the response telling us this is what the user's social graph looked like at time x. I personally would start with the compressed format, though, as that makes it all possible to retrieve in a single request. On
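
The batch idea might look something like this on the consumer side - a full snapshot of IDs plus an as-of timestamp saying when the graph looked like that; the field names and freshness check are made up for illustration:

    import calendar
    import time

    # Hypothetical batch payload: the full ID list plus the time the snapshot was taken.
    batch = {
        "screen_name": "downingstreet",
        "as_of": "2009-09-06T12:00:00Z",   # "this is what the graph looked like at time x"
        "ids": [12, 34, 56],               # the whole follower list in one shot
    }

    def is_fresh(snapshot, max_age_seconds=15 * 60):
        """Decide whether a cached snapshot is recent enough to use without re-fetching."""
        as_of = calendar.timegm(time.strptime(snapshot["as_of"], "%Y-%m-%dT%H:%M:%SZ"))
        return (time.time() - as_of) < max_age_seconds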

[twitter-dev] Re: Paging (or cursoring) will always return unreliable (or jittery) results

2009-09-06 Thread Jesse Stay
As far as retrieving the large graphs from a DB goes, flat files are one way - another is to just store the full graph (of IDs) in a single column in the database and parse it on retrieval. This is what FriendFeed is doing currently, or so they've said. Dewald and I are both talking about this because
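
A sketch of that single-column approach, using SQLite and a comma-separated string of IDs just to show the parse-on-retrieval step; the schema and serialization format are assumptions, not what FriendFeed actually does:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE social_graph (user_id INTEGER PRIMARY KEY, follower_ids TEXT)"
    )

    def save_graph(user_id, follower_ids):
        """Serialize the whole ID list into one column instead of one row per edge."""
        packed = ",".join(str(i) for i in follower_ids)
        conn.execute("REPLACE INTO social_graph VALUES (?, ?)", (user_id, packed))

    def load_graph(user_id):
        """Read the blob back and parse it into ints on retrieval."""
        row = conn.execute(
            "SELECT follower_ids FROM social_graph WHERE user_id = ?", (user_id,)
        ).fetchone()
        return [int(i) for i in row[0].split(",")] if row and row[0] else []

    save_graph(1, [12, 34, 56])
    print(load_graph(1))   # -> [12, 34, 56]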