Just before 8pm PST last night, both my primary and secondary listening
servers stopped receiving updates and skip notices, and are only
getting keepalive messages. I tried restarting the process, but I'm
still only getting keepalives.
Was the stream turned off? Is it just me?
Thx,
Zac
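One way to notice this condition automatically, rather than discovering it after the fact, is a watchdog around the stream reader. A minimal sketch, assuming a line-delimited stream where keepalives arrive as blank lines (the `stream_lines` iterable, the 90-second threshold, and the injectable clock are all illustrative, not anything from the API docs):

```python
# Hypothetical sketch: detect a stream that has gone keepalive-only.
# Blank lines are treated as keepalives; anything else is a payload line.
import time

def watch_stream(stream_lines, stall_after=90.0, now=time.monotonic):
    """Yield data lines; raise if only keepalives arrive for `stall_after` seconds."""
    last_data = now()
    for line in stream_lines:
        if line.strip():                     # real payload, not a keepalive
            last_data = now()
            yield line
        elif now() - last_data > stall_after:
            raise TimeoutError("no updates, keepalives only")
```

The injectable `now` clock is just there so the stall logic can be exercised without waiting out the timeout in real time.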
[...] you're using, the URL you're
using to connect, and the time in UTC at which the error occurred?
---Mark
On Tue, Dec 1, 2009 at 11:44 AM, Zac Witte zacwi...@gmail.com wrote:
Just before 8pm PST last night both my primary and secondary listening
servers stopped receiving updates
I'm not sure the filter is actually catching everything that I'm
supposedly tracking. There are ~20,000 tweets per minute right now
according to tweespeed. I'm getting about 1,000 tweets/m and skipping
on average 1,500 tweets/m according to the limit notifications. That
means my filter is matching only about 2,500 tweets/m, roughly an
eighth of the full stream.
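The arithmetic behind those figures: tweets matched by the filter is delivered tweets plus tweets reported skipped in limit notices, and the match rate is that sum over the full-stream rate. A quick check (all numbers are the poster's own estimates):

```python
# Rough arithmetic from the figures above (poster's estimates, not measured).
FULL_STREAM_TPM = 20_000   # tweets/minute on the full stream, per tweespeed
DELIVERED_TPM = 1_000      # tweets/minute actually received
SKIPPED_TPM = 1_500        # tweets/minute reported skipped by limit notices

matched_tpm = DELIVERED_TPM + SKIPPED_TPM   # everything the filter matched
match_rate = matched_tpm / FULL_STREAM_TPM  # fraction of the full stream

print(matched_tpm)               # 2500
print(round(match_rate * 100))   # 12 (percent)
```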
For most of this week I have been seeing duplicate tweets appear when
I quickly paginate through a set of results using the JSON search API.
This only happens when making requests in quick succession. I have
verified it in my own Java application using two different JSON
parsers, as well as this
I'm paginating through a search query that was initially created with
a since_id parameter. I'm using the query suggested by next_page,
which includes a max_id and not a since_id, which I believe is the
correct usage. I'm still getting duplicate tweets. Is anyone else
experiencing this?
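Until the duplicates are explained, a client-side workaround is to deduplicate by tweet id while walking pages by `max_id`. A minimal sketch, where `fetch_page` is a stand-in for a call to the JSON search API and is assumed to return a list of dicts with an `"id"` key (the helper name and page shape are assumptions, not the actual API):

```python
# Hypothetical sketch: drop duplicate tweets while paginating search results.
# `fetch_page(max_id)` stands in for a search API call returning a list of
# {"id": ...} dicts, newest first; the dedup logic is the point here.

def paginate_unique(fetch_page, max_id, pages=5):
    """Walk pages downward by max_id, skipping tweets already seen."""
    seen = set()
    for _ in range(pages):
        results = fetch_page(max_id)
        if not results:
            break
        for tweet in results:
            if tweet["id"] not in seen:   # duplicates can recur across pages
                seen.add(tweet["id"])
                yield tweet
        # next page: everything strictly older than the oldest id on this page
        max_id = min(t["id"] for t in results) - 1
```

Keeping the `seen` set bounded (e.g. only ids from the last couple of pages) would matter for long runs, but is omitted here for brevity.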
[...] paginating when you
reach it.
Thanks;
– Matt Sanford / @mzsanford
Twitter Dev
On Jul 16, 2009, at 11:23 AM, Zac Witte wrote:
I'm paginating through a search query that was initially created with
a since_id parameter. I'm using the query suggested by next_page,
which