Unless I'm missing something, it's usually the user's responsibility to dedup returned tweets on the client side. If you see duplicates between two feeds, just remove the duplicates; this is something the client application should do in any case.
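To illustrate the point above, here is a minimal client-side dedup sketch in Python. It assumes each parsed tweet is a dict with an "id" key (adjust the key name to whatever your feed parser produces); first occurrence wins, and order within each feed is preserved:

```python
def dedup_tweets(*feeds):
    """Merge any number of tweet feeds, dropping tweets whose ID
    has already been seen in an earlier feed or earlier position."""
    seen = set()
    merged = []
    for feed in feeds:
        for tweet in feed:
            if tweet["id"] not in seen:
                seen.add(tweet["id"])
                merged.append(tweet)
    return merged

feed_a = [{"id": 1, "text": "hello"}, {"id": 2, "text": "world"}]
feed_b = [{"id": 2, "text": "world"}, {"id": 3, "text": "again"}]
print(len(dedup_tweets(feed_a, feed_b)))  # 3 unique tweets
```

A set of seen IDs keeps the merge O(n) regardless of how many feeds you combine.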
If you see no fresh tweets but only old tweets, it's possible that Twitter returns only cached results because your API calls exceed the rate limit. I'm not sure, though. Does anyone know about the rate limit for the search feed http://search.twitter.com/search.atom (e.g. http://search.twitter.com/search.atom?geocode=19.017656%2C72.856178%2.)?

-aj

On Tue, Dec 1, 2009 at 8:49 PM, enygmatic <enygma...@gmail.com> wrote:
> Hi, Raffi
> Were you able to raise the cache issue with the search team?
> Seems the problem is worse than I thought. I have run my script
> (getting 25 results from search every 15 minutes, for Mumbai) for two
> days. The first day had 71% duplicate results due to the caching
> issue, while the second day fetched an amazing 90% duplicates. With
> those kinds of results, I think it's probably quite useless for me to
> even use the search API.
> So I would appreciate it if you could let me know whether there is a
> chance this issue may be resolved in the near future, or whether
> location-specific streams will be available via the streaming API
> anytime soon. I understand that the Twitter dev team has a lot on its
> hands, so it would be understandable if this isn't anywhere in the
> list of features they intend to ship in the near future. However, I
> would definitely appreciate it if you could let me know whether
> anything can be done or not.
>
> Thanks and Regards,
> Elroy Serrao
>
> On Nov 28, 7:45 pm, Raffi Krikorian <ra...@twitter.com> wrote:
> > Unfortunately, there is no (current) way to subscribe to the
> > streaming API for a particular location. As for the caching issue on
> > the search, that's unfortunate, and I'll try to raise the issue with
> > the search team next week.
> >
> > > @Abraham
> > > I actually use the geocode with the search API for my script, so
> > > using the search API isn't my problem. My problem is that I get
> > > "stale" results from the search cache, even when querying after a
> > > sufficient interval.
> > > Also, the "stale" results seem hours old at times; in fact,
> > > yesterday at 23:00 hours I got a few results that were from
> > > 22:00-22:30 hours. (I didn't have the problem when using Twitter
> > > search from the browser.) To overcome this, Raffi Krikorian
> > > suggested using the streaming API instead of the search API. My
> > > question was: how do I get a location-specific stream using the
> > > streaming API? From the streaming API docs, there doesn't seem to
> > > be a way to do this at the moment, which kind of defeats my purpose
> > > as I need to deploy the script in the next week or so. Guess I'll
> > > have to live with the stale results...
> > >
> > > Anyway, thanks for the help.
> > >
> > > On Nov 28, 12:40 am, Abraham Williams <4bra...@gmail.com> wrote:
> > > > On Fri, Nov 27, 2009 at 12:38, enygmatic <enygma...@gmail.com> wrote:
> > > > > From what I have gone through so far, there doesn't seem to be
> > > > > a way to query for status updates from a certain geographical
> > > > > location, say limited to a city. I may be mistaken here, so do
> > > > > correct me if I am wrong.
> > > >
> > > > Check out the search operators: http://search.twitter.com/operators
> > > >
> > > > For example: http://search.twitter.com/search?q=near:NYC+within:15mi
> > > >
> > > > Abraham
> > > > --
> > > > Abraham Williams | Community Evangelist | http://web608.org
> > > > Hacker | http://abrah.am | http://twitter.com/abraham
> > > > Project | Awesome Lists | http://twitterli.st
> > > > This email is: [ ] blogable [x] ask first [ ] private.
> > > > Sent from Madison, WI, United States
>
> --
> Raffi Krikorian
> Twitter Platform Team
> ra...@twitter.com | @raffi

--
AJ Chen, PhD
Chair, Semantic Web SIG, sdforum.org
http://web2express.org
@web2express on twitter
Palo Alto, CA, USA
650-283-4091
*Monitor realtime web and follow trending topics with semantic intelligence*
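One more thought for anyone polling the search feed on a schedule like the script discussed above: the Search API documents a since_id parameter, which asks for only tweets newer than a given ID. Tracking the highest ID seen and passing it on each poll should cut down the duplicate/stale results considerably. A rough sketch of the URL construction (the geocode value and the rpp page-size parameter mirror the examples in this thread; the HTTP fetch and Atom parsing are left to whatever stack you already use):

```python
import urllib.parse

SEARCH_URL = "http://search.twitter.com/search.atom"

def build_search_url(geocode, since_id=None, rpp=25):
    """Build a search feed URL. Passing since_id (the highest tweet
    ID already seen) asks the API for only newer tweets, so repeated
    polls don't keep returning the same cached results."""
    params = {"geocode": geocode, "rpp": rpp}
    if since_id is not None:
        params["since_id"] = since_id
    return SEARCH_URL + "?" + urllib.parse.urlencode(params)

# First poll: no since_id yet; later polls pass the max ID seen so far.
url_first = build_search_url("19.017656,72.856178,15mi")
url_later = build_search_url("19.017656,72.856178,15mi", since_id=6123456789)
```

The since_id value used here is of course a made-up placeholder; in practice it would come from the id of the newest entry in the previous response.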