On Sun, Jan 17, 2010 at 12:54 PM, Abraham Williams 4bra...@gmail.com wrote:
From the numbers I've seen in this thread more than 95% of accounts are
followed less than 25k times. It would not seem to make sense for Twitter to
support returning more than 25k ids per call. Especially since
Yet, those 775 accounts have the potential ability to reach up to 77,500,000+
(+, considering the number of retweets they each get) of Twitter's user
base. When they're dissatisfied, people hear. IMO those are the ones
Twitter should be going out of their way to satisfy. Add to that the fact
From the numbers I've seen in this thread more than 95% of accounts are
followed less than 25k times. It would not seem to make sense for Twitter to
support returning more than 25k ids per call. Especially since there are
only ~775 accounts with more than 100k followers:
We won't immediately remove the unbound search (requests with no cursor
will keep defaulting to the first cursor).
Details:
http://groups.google.com/group/twitter-development-talk/browse_frm/thread/a0ba66db0e86941d
On Jan 7, 9:03 pm, Zaudio si...@z-audio.co.uk wrote:
Yes - Please can we have that urgently - yes or no?
How much larger do you think makes it easier?
On Jan 7, 6:42 pm, st...@implu.com st...@implu.com wrote:
I would agree with several views expressed in various posts here.
1) A cursor-less call that returns all IDs makes for simpler code and
fewer API calls. i.e. less processing time.
2) If
100k, at the minimum.
On 1/8/10 3:35 PM, Wilhelm Bierbaum wrote:
How much larger do you think makes it easier?
On Jan 7, 6:42 pm, st...@implu.com st...@implu.com wrote:
I would agree with several views expressed in various posts here.
1) A cursor-less call that returns all IDs makes for
As large as possible. 100k would be a huge improvement.
For FriendOrFollow.com I need the user's entire social graph to
effectively calculate who's not following them back, who they're not
following back, and their mutual friendships. I can't really cache
this data because users make decisions
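The not-following-back / mutual-friendship calculation described here reduces to plain set algebra over the two id lists. A minimal sketch (the ids are made up; a real service would populate the sets from the friends/ids and followers/ids methods discussed in this thread):

```python
# Sketch of the relationship buckets a site like FriendOrFollow computes,
# using set operations over friend/follower id lists. Ids are illustrative.
friends = {1, 2, 3, 4}      # accounts the user follows
followers = {3, 4, 5, 6}    # accounts following the user

not_following_back = friends - followers   # they don't follow the user back
user_not_following = followers - friends   # the user doesn't follow them back
mutual = friends & followers               # follow each other

print(sorted(not_following_back))  # [1, 2]
print(sorted(user_not_following))  # [5, 6]
print(sorted(mutual))              # [3, 4]
```

The catch the poster raises is that both complete id lists are needed before any of these differences can be taken, which is why partial (cursored) results alone don't help.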
What proportion of your users have more than 5k followers? More than 25k
followers?
-John Kalucki
http://twitter.com/jkalucki
Services, Twitter Inc.
On Fri, Jan 8, 2010 at 2:57 PM, DustyReagan dustyrea...@gmail.com wrote:
As large as possible. 100k would be a huge improvement.
For
On 1/8/10 5:59 PM, John Kalucki wrote:
What proportion of your users have more than 5k followers? More than 25k
followers?
Good point ...
+---------------+---------+
| grouping      | percent |
+---------------+---------+
| 0-4,999       |    72.7 |
| 5,000-24,999  |    22.3 |
| 25,000+       |     5.0 |
+---------------+---------+
I think 27% of users
Here's some rough numbers... x is the number of Twitter users with a
follower count of...
x >= 100k:          714         0.007%
75k <= x < 100k:    151         0.001%
50k <= x < 75k:     411         0.004%
25k <= x < 50k:     2,044       0.020%
0 < x < 25k:        10,009,489  96.529%
Total: 10,369,396
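The percentages follow directly from the counts against the quoted total, e.g. 714 / 10,369,396 ≈ 0.007%. A quick check:

```python
# Recompute the distribution percentages from the raw counts quoted above.
total = 10_369_396
counts = {
    "x >= 100k": 714,
    "75k <= x < 100k": 151,
    "50k <= x < 75k": 411,
    "25k <= x < 50k": 2_044,
    "0 < x < 25k": 10_009_489,
}
for bucket, n in counts.items():
    print(f"{bucket}: {100 * n / total:.3f}%")
```

(The buckets sum to slightly under 100%, presumably because accounts with exactly zero or other excluded counts are not listed.)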
So I would
I would agree with several views expressed in various posts here.
1) A cursor-less call that returns all IDs makes for simpler code and
fewer API calls. i.e. less processing time.
2) If we must have a 'cursored' call then at least allow for cursor=-1
to return a larger number than 5k.
-Steve
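For reference, the cursored traversal being debated walks followers/ids starting at cursor=-1 and follows next_cursor until it comes back 0. A hedged sketch in modern Python (the endpoint URL and the ids / next_cursor field names are as the era's API docs described them; the fetch function is injectable so the loop itself can be exercised without the network):

```python
import json
import urllib.request


def fetch_all_follower_ids(screen_name, fetch=None):
    """Walk the cursored followers/ids method until next_cursor is 0.

    `fetch` maps a cursor value to the decoded JSON page; by default it
    calls the (historical, 2010-era) v1 endpoint discussed in this thread.
    """
    if fetch is None:
        def fetch(cursor):
            url = ("http://twitter.com/followers/ids.json"
                   f"?screen_name={screen_name}&cursor={cursor}")
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)

    ids, cursor = [], -1          # -1 requests the first block
    while cursor != 0:            # 0 signals the final block
        page = fetch(cursor)
        ids.extend(page["ids"])
        cursor = page["next_cursor"]
    return ids
```

At 5,000 ids per block, a user with n followers costs ceil(n / 5000) rate-limited calls, which is exactly the overhead the posters above are objecting to versus one cursor-less call.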
Yes - Please can we have that urgently - yes or no?
Thanks
Simon
On Jan 6, 8:15 pm, PJB pjbmancun...@gmail.com wrote:
Can we please get some confirmation that the cursor-less calls won't
be going away this coming Monday?
On Dec 22 2009, 4:13 pm, Wilhelm Bierbaum wilh...@twitter.com wrote:
This blog post by Anil Dash makes an excellent case for why Twitter
should cap the number of followers that a Twitter account can have. It
will make life easier for everyone.
http://bit.ly/6Al7TU
Not really sure how capping followers would be of much benefit.
A better solution might be better garbage collection of inactive or
spam accounts.
I believe twitter already does this, maybe not the best it could, but
there is something in place.
Capping the follower limit will hurt users who
That post is a follow up to his argument for why the SUL doesn't represent
as much value as some might perceive it to. It's an argument for getting rid
of the SUL as it's currently implemented. There are only 500 or so people on
the SUL. Non SUL users with as many followers, though rare, likely
Cache larger social graphs somewhere in API-ready format. Nobody will
know or probably care if a 500K social graph is outdated by an hour.
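One way to read this suggestion on the client side: fetch a large graph once, then serve the possibly-stale copy for up to an hour before refetching. A minimal TTL-cache sketch (class and parameter names are hypothetical; the one-hour figure is from the post above):

```python
import time


class GraphCache:
    """Serve possibly-stale follower-id lists, refreshing at most once per TTL."""

    def __init__(self, loader, ttl=3600):
        self.loader = loader      # function: screen_name -> list of ids
        self.ttl = ttl            # seconds a cached graph stays acceptable
        self._store = {}          # screen_name -> (fetched_at, ids)

    def get(self, screen_name, now=None):
        now = time.time() if now is None else now
        hit = self._store.get(screen_name)
        if hit and now - hit[0] < self.ttl:
            return hit[1]         # fresh enough: no API traffic at all
        ids = self.loader(screen_name)
        self._store[screen_name] = (now, ids)
        return ids
```

For a 500K-follower account this turns a hundred cursored calls per page view into a hundred per hour, at the cost of data that can lag by up to the TTL.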
On Jan 6, 2:31 pm, Marcel Molina mar...@twitter.com wrote:
That post is a follow up to his argument for why the SUL doesn't represent
as much value as some
Regardless, the cursor set * request throttle limit should be >= the
greatest number of followers on Twitter (whether that's a cap, or the
current king of Twitter).
Anil makes a case for how many of those users are really engaged
followers.
Fast Company had a piece Mr. Social: Ashton Kutcher
I am very happy with my small following. If I had even just 1 million
followers, I would feel an unbearable pressure to utter something
profound at least once every hour.
On Jan 6, 12:58 pm, Ian Irving ian.irv...@gmail.com wrote:
Regardless, the cursor set * request throttle limit should be >=
Can we please get some confirmation that the cursor-less calls won't
be going away this coming Monday?
On Dec 22 2009, 4:13 pm, Wilhelm Bierbaum wilh...@twitter.com wrote:
We noticed that some clients are still calling social graph methods
without cursor parameters. We wanted to take time to
If I can suggest you keep it backwards-compatible, that would make much more
sense. I think we're all aware that at over 200,000 or so followers it breaks.
So what if you kept the cursor-less nature, treated it like a cursor, but set
the returned cursor cap to be 200,000 per cursor? Or if it needs to
That sounds like a good overall technique. It's very best-effort. I'm
concerned about implementation details though. The webserver may
defensively time out the connection a lot, and tight coordination
between container and process is difficult to manage in our stack. And
by difficult, I mean
The Platform team can create a static parameter that governs the
maximum size of the initial cursor. The API software will then use
this parameter value to build the list of ids returned in the first
API call.
The team can either modify this static parameter manually, or you can
create a process
Ditto
On 1/4/10 7:58 PM, Jesse Stay jesses...@gmail.com wrote:
Ditto PJB :-)
On Mon, Jan 4, 2010 at 8:12 PM, PJB pjbmancun...@gmail.com wrote:
I think that's like asking someone: why do you eat food? But don't say
because it tastes good or nourishes you, because we already know
that!
I have a GreaseMonkey script ( http://userscripts.org/scripts/show/64286
) which displays the Follows in Common and the Follow Rank for a
given users profile page using the Friends and Follows api requests.
I've implement cursor style json request to the api, but have several
concerns around the
Dewald, it should be noted that, of course, not all 200 request responses
are created equal and just because pulling down a response body with
hundreds of thousands of ids succeeds, it doesn't mean it doesn't cause a
substantial strain on our system. We want to make developing against the API
as
I think that's like asking someone: why do you eat food? But don't say
because it tastes good or nourishes you, because we already know
that! ;)
You guys presumably set the 5000 ids per cursor limit by analyzing
your user base and noting that one could still obtain the social graph
for the vast
I'm just now noticing this (I agree - why was this being announced over the
holidays???) - this will make it nearly impossible to process large users.
This is a *huge* change that just about kills any of the larger services
processing very large amounts of social graph data. Please reconsider
Ditto PJB :-)
On Mon, Jan 4, 2010 at 8:12 PM, PJB pjbmancun...@gmail.com wrote:
I think that's like asking someone: why do you eat food? But don't say
because it tastes good or nourishes you, because we already know
that! ;)
You guys presumably set the 5000 ids per cursor limit by
Some quick benchmarks...
Grabbed entire social graph for ~250 users, where each user has a
number of friends/followers between 0 and 80,000. I randomly used
both the cursor and cursor-less API methods.
5000 ids
cursor: 0.72 avg seconds
cursorless: 0.51 avg seconds
5000 to 10,000 ids
cursor:
The backend datastore returns following blocks in constant time,
regardless of the cursor depth. When I test a user with 100k+
followers via twitter.com using a ruby script, I see each cursored
block return in between 1.3 and 2.0 seconds, n=46, avg 1.59 seconds,
median 1.47 sec, stddev of .377,
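Those block timings imply a per-id rate consistent with the DSL numbers quoted later in the thread: 5,000 ids per cursored block at 1.3 to 2.0 seconds works out to roughly 2,500 to 3,800 ids per second.

```python
# Back-of-envelope: ids/second implied by the cursored-block timings above.
block_size = 5000                  # ids returned per cursored block
for secs in (1.3, 1.59, 2.0):      # fastest, average, slowest observed
    print(f"{secs:>4} s/block -> {block_size / secs:,.0f} ids/sec")
```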
Also, how do we get a business relationship set up? I've been asking for
that for years now.
Jesse
On Mon, Jan 4, 2010 at 10:16 PM, Jesse Stay jesses...@gmail.com wrote:
John, how are things going on the real-time social graph APIs? That would
solve a lot of things for me surrounding this.
On Jan 4, 8:58 pm, John Kalucki j...@twitter.com wrote:
at the moment). So, it seems that we're returning the data over home
DSL at between 2,500 and 4,000 ids per second, which seems like a
perfectly reasonable rate and variance.
It's certainly not reasonable to expect it to take 10+
Ryan Sarver announced that we're going to provide an agreement
framework for Tweet data at Le Web last month. Until all that
licensing machinery is working well, we probably won't put any effort
into syndicating the social graph. At this point, social graph
syndication appears to be totally
The existing APIs stopped providing accurate data about a year ago
and degraded substantially over a period of just a few months. Now the
only data store for social graph data requires cursors to access
complete sets. Pagination is just not possible with the same latency
at this scale without an
As noted in this thread, the fact that cursor-less methods for friends/
followers ids will be deprecated was newly announced on December 22.
In fact, the API documentation still clearly indicates that cursors
are optional, and that their absence will return a complete social
graph. E.g.:
~300,000 ids in 15 seconds:
$ time curl http://twitter.com/followers/ids.xml?screen_name=dougw > /dev/null
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 5545k  100 5545k    0     0   346k
Again, ditto PJB - just making sure the Twitter devs don't think PJB is
alone in this. I'm sure Dewald and many other developers, including those
unaware of this (is it even on the status blog?) agree. I'm also seeing
similar results to PJB in my benchmarks. cursor-less is much, much faster.
At
Why is Twitter announcing a major API change over the holidays? Why
are they giving us just a few (mostly holiday!) days to account for
it?
Please either a) preserve the existing calls, or b) give us a 3 month
window before deprecating. This is a new API change.
Given that many have commented
I 2nd Dewald's sentiments.
On Dec 27, 8:29 pm, Dewald Pretorius dpr...@gmail.com wrote:
What is being deprecated here is the old pagination method with the
page parameter.
As noted earlier, it is going to cause great pain if the API is going
to assume a cursor of -1 if no cursor is
I agree 100%.
Calls without the starting cursor of -1 must still return all
followers as is currently the case.
As a test I've set my system to use cursors on all calls. It inflates
the processing time so much that things become completely unworkable.
We can programmatically use cursors if
I agree with the others to some extent. Although it's a good signal to stop
using something ASAP when it is deprecated, saying deprecated and
not giving a definite timeline on its removal isn't good either. (Source
params are deprecated but still work and don't have a solid deprecation date,
What is being deprecated here is the old pagination method with the
page parameter.
As noted earlier, it is going to cause great pain if the API is going
to assume a cursor of -1 if no cursor is specified, and hence enforce
the use of cursors regardless of the size of the social graph.
The API
+1 - I'm currently relying on retrieving a complete social graph when
no cursor is passed. Your announcing this change right around Xmas+new
years to take effect almost immediately thereafter...
On Dec 23, 2009, at 10:00 PM, PJB pjbmancun...@gmail.com wrote:
Why hasn't this been
I agree with PJB. The previous announcements only said that the
pagination will be deprecated.
1.
http://groups.google.com/group/twitter-api-announce/browse_thread/thread/41369cb133175d0f#
2.
http://groups.google.com/group/twitter-api-announce/browse_thread/thread/52d4e68040d4ca45#
However,
Willhelm:
Your announcement is apparently expanding the changeover from page to
cursor in new, unannounced ways??
The API documentation page says: If the cursor parameter is not
provided, all IDs are attempted to be returned, but large sets of IDs
will likely fail with timeout errors.
API documentation page says: If the cursor parameter is not
provided, all IDs are attempted to be returned, but large sets of IDs
will likely fail with timeout errors.
There was no reference in any of the previous announcements to
deprecating this valuable ability. Is it now also doomed for
yes - if you do not pass in cursors, then the API will behave as though you
requested the first cursor.
Willhelm:
Your announcement is apparently expanding the changeover from page to
cursor in new, unannounced ways??
The API documentation page says: If the cursor parameter is not
Why hasn't this been announced before? Why does the API suggest
something totally different? At the very least, can you please hold
off on deprecation of this until 2/11/2010? This is a new API change.
On Dec 23, 7:45 pm, Raffi Krikorian ra...@twitter.com wrote:
yes - if you do not pass in