Repeated automated queries to Search aren't a complete strategy, as Search
does not cover the full corpus and its rate limits are fairly low.
Trending can usually be done on Spritzer or the Gardenhose. I'd grab
Spritzer and see what you can determine by examining the Location field and
the geotag. Trying to first filter by location doesn't seem to be a good
strategy, as it's going to be hard to get enough data for each location. The
Gardenhose gives you plenty of data when you are ready.
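
Something along these lines is roughly what I mean: a minimal Python
(urllib2-style) sketch against the statuses/sample endpoint over basic
auth. The endpoint URL, JSON field names, and credentials below are
placeholders from memory, so check them against the current Streaming
API docs rather than treating this as a reference implementation.

import json
import urllib2
from collections import defaultdict

USERNAME = "your_twitter_username"   # placeholder credentials
PASSWORD = "your_twitter_password"
STREAM_URL = "http://stream.twitter.com/1/statuses/sample.json"

# Basic auth against the sample (Spritzer) stream.
password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, STREAM_URL, USERNAME, PASSWORD)
opener = urllib2.build_opener(urllib2.HTTPBasicAuthHandler(password_mgr))

location_counts = defaultdict(int)   # raw user Location string -> tweet count
geotagged = 0                        # tweets carrying an explicit geotag

for line in opener.open(STREAM_URL):
    line = line.strip()
    if not line:
        continue                     # keep-alive newlines
    status = json.loads(line)
    if "text" not in status:
        continue                     # skip deletes and other control messages
    loc = (status.get("user") or {}).get("location") or ""
    if loc:
        location_counts[loc.strip().lower()] += 1
    if status.get("geo"):
        geotagged += 1

From there you can bucket the Location strings by city and do simple term
counting per bucket; once that works on Spritzer, the same loop applies to
the Gardenhose.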

-John Kalucki
http://twitter.com/jkalucki
Services, Twitter Inc.



On Mon, Jan 11, 2010 at 6:09 AM, GeorgeMedia <georgeme...@gmail.com> wrote:

> ecp and Mark, thanks.
>
> I understand what you're saying, but I'm having a hard time grasping
> how the streaming API would be better. Here's why: like I said
> earlier, I have over 200,000 locations (and growing) in my database.
> I'll soon be pulling them in dynamically from other geolocation APIs,
> so I'll in effect have the entire world! (insert evil world domination
> laugh). And users will be looking for locations randomly, so... how do
> you cache that? I'm stumped.
>
> The problem is, like you said, relying on the location field would be
> a bit of a hairball. So my two questions are: how do I send geo
> parameters to the firehose/streaming API? And will it return the same
> result set as if I did it via the Search API with the geocode=
> parameter? Also, can I specify a radius like in the Search API?
>
> Are you suggesting that I basically consume the entire firehose as it
> comes in and look for geo parameters/locations in real time? My
> application is simply a way to let users come to the site and look for
> local tweets and trends by city.
>
> Example: http://www.twocals.com/twt/il/belleville
>
