Re: [twitter-dev] Re: Twitter backup script

2010-07-19 Thread Tomas Roggero
Hi Pascal

What I'm doing is requesting 150 per hour. I have 43 pages, so in about a week
I'll have my almost-full backup. :D

(I've written a PHP script to do that automatically, of course)
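
Something along these lines (a trimmed-down sketch of the approach, not my
actual script; the screen name, file names, and pacing are placeholders):

<?php
// Page through the public v1 user_timeline endpoint, pausing between
// requests to stay under 150 requests per hour.
$user  = 'your_screen_name'; // placeholder
$pages = 43;                 // pages reported for my account

for ($page = 1; $page <= $pages; $page++) {
    $url = 'http://api.twitter.com/1/statuses/user_timeline.xml'
         . "?screen_name={$user}&count=200&page={$page}";
    $xml = @file_get_contents($url);
    if ($xml === false) {
        sleep(600); // probably rate-limited: wait, then retry this page
        $page--;
        continue;
    }
    file_put_contents("timeline-page-{$page}.xml", $xml);
    sleep(25); // 3600s / 25s = 144 requests/hour, under the limit
}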

Does the Library of Congress have an API or something?

2010/7/18 Pascal Jürgens lists.pascal.juerg...@googlemail.com

 Tom,

 at least you know that the Library of Congress has a backup :)

 Pascal

 On Jul 18, 2010, at 7:07, Tom Roggero wrote:

  I've tried your script on Mac; it only works for the first 3 pages, which is
  weird (I'm running DarwinPorts for the XML functions)...
  Anyway, I tried to do it manually through Firefox, and the latest page is 16.
  That's the limit. But if you have the IDs of the previous tweets you
  could use statuses/show for each ID... the problem is:
 
  IF you are backing up your account because you have all your tweet
  IDs, you will need 1 request per tweet, and the maximum is 350 (with HUGE
  luck) per hour. In my case, I've got 10k tweets: 3k via REST and 7k left to
  do... 7 THOUSAND REQUESTS???
 
  Come on Twitter, help us developers, start thinking about backups, we
  know you are gonna explode!




-- 
Regards,

Tomás Roggero
http://www.flavors.me/tr


Re: [twitter-dev] Re: Twitter backup script

2010-07-19 Thread Pascal Jürgens
Tomás,

last time I heard from the project, they were busy sorting out the technical
details and were still not sure who would even get access. It'll probably be
open to a select group of researchers first.

Pascal

On Jul 18, 2010, at 8:16 PM, Tomas Roggero wrote:

 Hi Pascal
 
 What I'm doing is requesting 150 per hour. I have 43 pages, so in about a week
 I'll have my almost-full backup. :D
 
 (I've written a PHP script to do that automatically, of course)
 
 Does the Library of Congress have an API or something?



[twitter-dev] Re: Twitter backup script

2010-07-18 Thread Tom Roggero
I've tried your script on Mac; it only works for the first 3 pages, which is
weird (I'm running DarwinPorts for the XML functions)...
Anyway, I tried to do it manually through Firefox, and the latest page is 16.
That's the limit. But if you have the IDs of the previous tweets you
could use statuses/show for each ID... the problem is:

IF you are backing up your account because you have all your tweet
IDs, you will need 1 request per tweet, and the maximum is 350 (with HUGE
luck) per hour. In my case, I've got 10k tweets: 3k via REST and 7k left to
do... 7 THOUSAND REQUESTS???

Come on Twitter, help us developers, start thinking about backups, we
know you are gonna explode!
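
Something like this would do the per-ID fallback via statuses/show (just a
sketch, assuming the missing IDs sit one per line in a text file; the file
names and pacing are placeholders):

<?php
// Fetch each remaining tweet individually via statuses/show.
// At ~350 requests/hour, 7,000 tweets take roughly 20 hours (7000 / 350).
$ids = file('missing-ids.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($ids as $id) {
    $url = "http://api.twitter.com/1/statuses/show/{$id}.xml";
    $xml = @file_get_contents($url);
    if ($xml !== false) {
        file_put_contents("tweet-{$id}.xml", $xml);
    }
    sleep(11); // 3600s / 11s = ~327 requests/hour, under the 350 cap
}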



On 22 Jun, 18:51, Kai Hendry kai.hen...@gmail.com wrote:
 Hello everyone,

 I wrote a twitter backup script for shell geeks like myself. I would
 appreciate review and any constructive comments:
 http://twitter.natalian.org/hgweb.cgi/file/tip/fetch-tweets.sh

 Where can I find http://api.twitter.com/1/statuses/user_timeline.xml
 on http://dev.twitter.com/status btw?

 Many thanks from a warm summer's night in England,


Re: [twitter-dev] Re: Twitter backup script

2010-07-18 Thread Pascal Jürgens
Tom,

at least you know that the Library of Congress has a backup :)

Pascal

On Jul 18, 2010, at 7:07, Tom Roggero wrote:

 I've tried your script on Mac; it only works for the first 3 pages, which is
 weird (I'm running DarwinPorts for the XML functions)...
 Anyway, I tried to do it manually through Firefox, and the latest page is 16.
 That's the limit. But if you have the IDs of the previous tweets you
 could use statuses/show for each ID... the problem is:
 
 IF you are backing up your account because you have all your tweet
 IDs, you will need 1 request per tweet, and the maximum is 350 (with HUGE
 luck) per hour. In my case, I've got 10k tweets: 3k via REST and 7k left to
 do... 7 THOUSAND REQUESTS???
 
 Come on Twitter, help us developers, start thinking about backups, we
 know you are gonna explode!