The feature to update streaming api filter predicates has been put on
hold for now. It is still desired, but not in the active development
plan.

You can backfill with the REST API, as you suggest. Or, you could
reconnect every few minutes on the Streaming API. It might be best to
have two Streaming API accounts -- a /shadow stream that is long-lived
and a /follow stream that you reconnect every few minutes when a new
user is added. You can migrate accounts from the /follow stream to the
/shadow stream once an hour or so. If the /follow stream runs afoul
of a rate limit, at least the /shadow stream is still connected.
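The rotation described above is mostly set bookkeeping. The sketch below is illustrative only -- the class name and merge interval are assumptions, and the actual stream connect/reconnect calls are left to a real Streaming API client:

```python
import time

class TwoStreamManager:
    """Hypothetical bookkeeping for the /shadow + /follow rotation.

    New user IDs go into a small, volatile /follow set that is cheap to
    reconnect every few minutes. Roughly once an hour, merge the /follow
    set into the long-lived /shadow set, so a /follow hiccup or rate
    limit never interrupts the stable stream.
    """

    def __init__(self, shadow_ids):
        self.shadow = set(shadow_ids)   # long-lived /shadow stream
        self.follow = set()             # short-lived /follow stream
        self.last_merge = time.time()

    def add_user(self, user_id):
        # New subscribers join the volatile /follow set; the caller
        # would then reconnect the /follow stream with this set.
        if user_id not in self.shadow:
            self.follow.add(user_id)

    def maybe_merge(self, interval=3600):
        # About once an hour, migrate /follow users into /shadow and
        # reconnect both streams. If the /follow reconnect fails, the
        # /shadow stream is still up.
        if self.follow and time.time() - self.last_merge >= interval:
            self.shadow |= self.follow
            self.follow.clear()
            self.last_merge = time.time()
```

The caller would reconnect the /follow stream after each `add_user` and reconnect both streams after a merge; only the set logic is shown here.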

-John Kalucki
http://twitter.com/jkalucki
Services, Twitter Inc.



On Jul 27, 7:18 pm, nickdella <nick.dellamaggi...@gmail.com> wrote:
> Hi,
>
> I'm working on a similar system in which members on my site
> dynamically "subscribe" to Twitter. Thus, my following list is
> constantly changing. To provide a reasonable user experience, I'd have
> to disconnect/reconnect every couple minutes to ensure their
> subscription is recognized in a timely manner :/
>
> I understand that the constant disconnect/reconnect cycle is
> suboptimal for you guys. So, I have a fallback plan of concurrently
> using both the streaming API for existing subscribers while using the
> REST API to temporarily poll for status updates for all new
> subscribers. Every hour or so, I'd merge these new subscribers into
> the main followers list and reconnect the stream. This is possible,
> but would definitely be some extra work to build.  Alex described the
> possible addition of a REST API to dynamically update the followers
> list. Is there any chance this is coming in the near future, i.e. the
> next 3 weeks? :)  Or do you guys have any other ideas on how I'd go about
> solving this given the current set of APIs?  Thanks!
>
> -nick
>
> On Jul 8, 4:28 pm, John Kalucki <jkalu...@gmail.com> wrote:
>
> > Alexy,
>
> > First, curl isn't the best approach for developing against the Streaming API.
> > It's fine for prototyping, but it only goes so far.
>
> > Yes, the comma separated list should be all on one line if you are
> > using curl.
>
> > If you want to change the user set, you should connect with the new
> > set and then disconnect the old set immediately once the data starts
> > to flow. This will be hard to coordinate using curl. In some cases,
> > Twitter will throw the first user off once the second user connects.
> > In other cases it will be more lenient. But, beware: if you want to
> > avoid running into various abuse limits, you'd best be sure that your
> > coordination between the first and second streams is quite solid and
> > that the first stream is always terminated in a timely manner.
>
> > You can also avoid data loss by using the count parameter, available
> > on some, but not all, methods.
>
> > Please email me with your use case and I'll forward it on to the
> > Platform PM to help prioritize the better solution, as outlined by
> > Alex.
>
> > -John Kalucki
> > twitter.com/jkalucki
> > Services, Twitter Inc.
>
> > On Jul 8, 12:17 pm, braver <delivera...@gmail.com> wrote:
>
> > > If you have thousands of users, do you really have to cook up a
> > > following file with comma-separated say 100,000 user IDs?  Should it
> > > all be on one line?  Now what happens if we want to drop some and add
> > > some IDs -- do we have to restart and re-upload all that list again?
> > > I see when the curl -d @following ... starts up, it does that.
> > > Restarting with huge lists sounds like data loss...
>
> > > Cheers,
> > > Alexy
