Good to know. Did you mean to say "consume … streaming results"? I don't really 
see where you use the stream here.

Also, please note that it's no longer a good idea to treat "since_id" and 
"max_id" as sequential counters, because status IDs will soon be (already 
are?) NON-SEQUENTIAL. This means you will lose tweets if you rely on the IDs 
incrementing predictably over time. To quote the relevant email from Taylor 
Singletary:

> Please don't depend on the exact format of the ID. As our infrastructure 
> needs evolve, we might need to tweak the generation algorithm again.
> If you've been trying to divine meaning from status IDs aside from their role 
> as a primary key, you won't be able to anymore. Likewise for usage of IDs in 
> mathematical operations -- for instance, subtracting two status IDs to 
> determine the number of tweets in between will no longer be possible.
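To see why ID subtraction stops working: Twitter's announced replacement scheme ("Snowflake") packs a timestamp, a worker id, and a per-worker sequence number into each id. The bit layout below is an illustrative assumption, not something to depend on (which is exactly Singletary's point), but it shows that the difference between two ids is dominated by timestamp and worker bits, not by tweet count:

```python
# Illustrative sketch of a Snowflake-style id: high bits are a millisecond
# timestamp, middle bits a worker id, low bits a sequence number.
# The shift amounts here are assumptions for illustration only.

def snowflake(ms, worker, seq):
    return (ms << 22) | (worker << 12) | seq

a = snowflake(1000, 1, 0)  # a tweet created in millisecond 1000 on worker 1
b = snowflake(1001, 5, 0)  # the very next tweet, one millisecond later

# a and b are consecutive tweets, yet b - a is in the millions:
# subtracting ids tells you nothing about how many tweets lie between them.
gap = b - a
```

The ids still sort by creation time (roughly), which is why they remain usable as opaque paging cursors even though arithmetic on them is meaningless.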


On Jun 8, 2010, at 0:06 , sahmed10 wrote:

> yes, it works! The algorithm is something like this:
> Set the query to a string with appropriate To and From dates, then
> consume the 1500 streaming results and save the status id of the
> very last tweet you got. As they are in sequential order (with gaps),
> it won't be a problem. The very last tweet's status id should be
> assigned as the MaxId for the next set of results, and so on.
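The quoted algorithm — feed the last status id you saw back in as MaxId for the next page, treating the id purely as an opaque cursor — can be sketched like this. All names are mine, and `fake_fetch` is a stand-in for the real search call, not Twitter's API; note that no arithmetic is ever done on the ids, so the scheme survives non-sequential ids as long as they still sort by recency:

```python
# Hypothetical sketch: page backwards through search results using max_id
# as an opaque cursor. fetch(max_id) must return up to a page of tweets
# with id <= max_id, newest first; max_id=None means start at the newest.

def paginate(fetch):
    seen = set()
    tweets = []
    max_id = None
    while True:
        page = fetch(max_id)
        # The tweet whose id equals max_id comes back again; drop overlaps.
        new = [t for t in page if t["id"] not in seen]
        if not new:
            break
        tweets.extend(new)
        seen.update(t["id"] for t in new)
        max_id = new[-1]["id"]  # opaque cursor: oldest id seen so far

    return tweets

# Stand-in for the search endpoint: ids are non-sequential but descending.
ALL = [{"id": i} for i in (90, 73, 51, 42, 17, 9, 3)]

def fake_fetch(max_id, page_size=3):
    pool = ALL if max_id is None else [t for t in ALL if t["id"] <= max_id]
    return pool[:page_size]
```

Running `paginate(fake_fetch)` walks the whole list in order despite the gaps between ids, because each page is cut off by "id at most the cursor", never by "cursor minus one".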
