You should continuously consume Streaming API feeds and store the results. Then, periodically run your algorithms over the stored set as required. Trending will require examining more data than a 30-second snapshot provides, especially if you are slicing by geo.
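The store-then-analyze split above can be sketched roughly as follows. This is a minimal illustration, not the Streaming API client itself: the stream is assumed to be an iterable of line-delimited JSON statuses (as the API delivers them), and the table layout, function names, and hashtag-counting heuristic are my own.

```python
import json
import sqlite3
import time
from collections import Counter

def open_store(path=":memory:"):
    # One growing table of statuses; the analysis jobs only ever read it.
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS statuses
                  (id INTEGER PRIMARY KEY, received_at REAL, text TEXT)""")
    return db

def consume(stream, db):
    """Continuously store incoming statuses; run this in its own process."""
    for line in stream:
        line = line.strip()
        if not line:
            continue  # the stream sends blank keep-alive lines
        status = json.loads(line)
        db.execute("INSERT OR REPLACE INTO statuses VALUES (?, ?, ?)",
                   (status["id"], time.time(), status["text"]))
    db.commit()

def trending(db, window_secs=3600, top_n=10):
    """Periodically rank hashtags over a much larger window than 30 seconds."""
    cutoff = time.time() - window_secs
    counts = Counter()
    for (text,) in db.execute(
            "SELECT text FROM statuses WHERE received_at >= ?", (cutoff,)):
        counts.update(w.lower() for w in text.split() if w.startswith("#"))
    return counts.most_common(top_n)
```

The point of the split is that the consumer never stops reading, while the expensive trend pass runs on its own schedule (and over whatever window you like) against the stored set.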
On Mon, Jan 11, 2010 at 11:51 AM, GeorgeMedia <georgeme...@gmail.com> wrote:
> Thank you everyone. To quote an internet meme, "you're doing it wrong"
> seems to be the consensus here. Your comments have shed light on the
> dark areas for me.
>
> So if I may bounce this off you to see if you've successfully pointed
> me in the right direction, a new and better approach would be to:
>
> 1) Check out the spritzer.json stream to see what type of data I get
> back and formulate how I could make use of it. Basically do what ecp
> suggested and check the geo/location fields for viable data, then do my
> own geocoding with what is in my database, or on-the-fly geocoding
> with sites like hostip.info, GeoAPI, or the Yahoo API.
>
> 2) Once I have that methodology down, grab the gardenhose (or
> firehose, depending on how much data I can handle) every 30 seconds
> or so to pull a fresh data set, so my users are seeing data that is
> at most 30 seconds old.
>
> Right path or still not getting it?
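For what it's worth, the geo/location check in step 1 of the quoted plan could look like this sketch. The field names follow the Streaming API status payload, but the helper name and the cache-based fallback are hypothetical stand-ins for a real hostip.info/GeoAPI/Yahoo geocoding call:

```python
def extract_point(status, geocode_cache=None):
    """Return (lat, lon) for a status, or None if nothing usable is present."""
    # Prefer machine-supplied coordinates when the status carries them.
    geo = status.get("geo")
    if geo and geo.get("type") == "Point":
        lat, lon = geo["coordinates"]
        return (lat, lon)
    # Fall back to the user's free-text location field and your own geocoder.
    location = (status.get("user") or {}).get("location")
    if location and geocode_cache:
        # On a cache miss you'd call an external geocoding service and
        # store the result, so repeated locations cost one lookup.
        return geocode_cache.get(location.strip().lower())
    return None
```

Doing the geocoding once at ingest time (and caching it) keeps the periodic trend passes cheap, since they can then filter on stored coordinates instead of re-resolving location strings.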