I'm completely on board with any strategy that will simplify access to
(or, especially, amplify the amount of) the graph data I can get. I had
a discussion recently with Ryan where he indicated an openness to ideas
of this sort, because there is (he says) no getting around the 20K rate
limits... an idea I find preposterous, but OK... whatever! So I'll be
very interested to see if we can gain any traction on this front. I
definitely can't get the amount of data I need to keep my app
reasonably fresh with the rate limits available.
On Oct 22, 2:52 pm, Harshad RJ <harshad...@gmail.com> wrote:
> I am collating the thoughts in this thread into a proposal to improve
> the efficiency of social-graphing applications.
> A common API access pattern for social-graphing applications seems to be:
> 1. Get the friend/follower ids of a user with [*friends/ids*] or [*followers/ids*]
> 2. Get user details one at a time with [*users/show*]
> (This approach saves on bandwidth by not using the [*statuses/friends*]
> method, as that would return redundant info when traversing a network)
> Now, since [*users/show*] is not a paginated API, it would be easy to
> save bandwidth and connection overhead by combining multiple requests into one
> call. For a social-graphing application, the amount of user information
> needed is minimal.
> For example, the following amount of information would be sufficient for my
> application :
> <?xml version="1.0" encoding="UTF-8"?>
> <user>
>   <created_at>Sun Mar 18 06:42:26 +0000 2007</created_at>
>   <status>
>     <created_at>Tue Apr 07 22:52:51 +0000 2009</created_at>
>   </status>
> </user>
> This is significantly smaller than the data returned by [*users/show*].
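A client for the proposed batched lookup might look like the sketch below. Everything here is hypothetical: the `/users/show_batch.json` endpoint, the `ids` parameter, and the 100-id batch size are illustrations of what the proposal asks for, not an existing API.

```python
# Hypothetical batched user lookup: many ids per request instead of one.
# Endpoint name, "ids" parameter, and batch size are all assumptions.

def batched_lookup(user_ids, fetch_json, batch_size=100):
    users = []
    for i in range(0, len(user_ids), batch_size):
        chunk = user_ids[i:i + batch_size]
        # One request hydrates up to batch_size users at once.
        users.extend(fetch_json("/users/show_batch.json",
                                ids=",".join(map(str, chunk))))
    return users
```

For a user with 250 friends this would take 3 requests instead of 250, independent of how the rate-limit accounting below is tuned.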
> To prevent misuse of the new API the following could be enforced:
> 1. A maximum limit on the number of users that can be queried in one request
> 2. Rate limiting based on the number of users requested. For example, if (N)
> users' details were requested in one call, count it as (N/2) requests. This
> would provide an incentive to use the new API while deterring misuse.
> Harshad RJ
> http://hrj.wikidot.com