Hi,
sounds like you really want to look into the streaming API instead.
http://dev.twitter.com/pages/streaming_api_methods#statuses-filter
cheers
-m
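For reference, the filter method linked above takes a comma-separated `track` parameter. A minimal sketch of building such a request body (no network call is made; the endpoint and its limits are per the page above, and the keyword list is made up):

```python
from urllib.parse import urlencode

def build_track_body(keywords):
    """Build the POST body for a statuses/filter request.
    The track parameter is a comma-separated keyword list."""
    return urlencode({"track": ",".join(keywords)})

# Hypothetical keywords for an app tracking this thread's topics.
print(build_track_body(["appengine", "rate limit"]))
# track=appengine%2Crate+limit
```

The streaming connection itself needs a long-lived HTTP connection, which is exactly what App Engine's request timeouts (mentioned further down this thread) make awkward.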
On Thu, Jun 2, 2011 at 1:00 PM, HRyba wrote:
I'm developing an application that uses the Twitter Search API. The
app searches Twitter for many (at least a couple thousand) specific
keywords in real time.
A server would be set up to fetch the results for the many keywords in
tweets and store them in a database that the application would access.
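The server-plus-database design described above could be sketched like this; the schema and names are hypothetical, and an in-memory SQLite database stands in for whatever store the server would actually use:

```python
import sqlite3

# Hypothetical schema: one row per (keyword, tweet) pair, so the app
# can look up results for any tracked keyword.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE results (
    keyword  TEXT NOT NULL,
    tweet_id INTEGER NOT NULL,
    text     TEXT NOT NULL,
    PRIMARY KEY (keyword, tweet_id))""")

def store(keyword, tweet_id, text):
    # INSERT OR IGNORE makes repeated polls for a keyword idempotent.
    conn.execute("INSERT OR IGNORE INTO results VALUES (?, ?, ?)",
                 (keyword, tweet_id, text))

def results_for(keyword):
    cur = conn.execute(
        "SELECT tweet_id, text FROM results WHERE keyword = ?", (keyword,))
    return cur.fetchall()

store("appengine", 101, "GAE + search API")
store("appengine", 101, "GAE + search API")  # duplicate poll, ignored
print(results_for("appengine"))  # [(101, 'GAE + search API')]
```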
Chad,
Sorry for not being clear. I was thinking about Abraham Williams's
suggestion above, where the Twitter Search API works with authenticated
sessions plus rate limiting instead of IP-based rate filtering. Just so
you know, AppEngine has a 30-second timeout on requests to all AppEngine
URLs, and a 10-second
On Sun, Oct 18, 2009 at 8:09 AM, vivekpuri wrote:
>
> Will someone from Twitter please respond if there is an ETA to resolve
> this issue. Work arounds can never be really as effective as the real
> deal.
Sorry, I thought it was clear from the previous email. There is no ETA
because it's not goi
Will someone from Twitter please respond if there is an ETA to resolve
this issue. Work arounds can never be really as effective as the real
deal.
I would recommend just using a physical server and uploading a simple
PHP proxy script. If you have existing webspace, it will save you the
trouble of setting up a complete EC2 build just to run a proxy
script.
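A proxy script of the kind suggested here mostly just maps incoming request paths onto the Search API host. A sketch of that rewrite in Python (the upstream host is the Search API host of that era; no actual request forwarding is shown):

```python
from urllib.parse import urlsplit, urlunsplit

# Search API host at the time this thread was written.
SEARCH_HOST = "search.twitter.com"

def rewrite(proxy_url):
    """Map a URL requested from the proxy onto the upstream search URL,
    preserving the path and query string."""
    parts = urlsplit(proxy_url)
    return urlunsplit(("http", SEARCH_HOST, parts.path, parts.query, ""))

print(rewrite("http://myproxy.example/search.json?q=appengine"))
# http://search.twitter.com/search.json?q=appengine
```

Because all requests then leave from the proxy's single static IP, that IP is what Twitter can whitelist.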
On Oct 9, 7:11 pm, Akshar wrote:
Thanks Abraham.
Any pointers on how to set up a proxy on Amazon EC2 for GAE?
On Oct 8, 6:07 pm, Abraham Williams <4bra...@gmail.com> wrote:
Pretty much. You have limited options:
1) Run your Search API requests through a proxy where you will have
exclusive access to the IP.
2) Wait for V2 of the Twitter API where the REST and Search APIs get
combined so you can have authenticated search queries.
3) Hope Twitter slaps some duct tape on
http://apiwiki.twitter.com/Rate-limiting states that "for cloud
platforms like Google App Engine, applications without static IP
addresses cannot receive Search whitelisting."
Does that mean there is no way to avoid getting HTTP 503 response
codes to search requests from app engine?
Any other solutions available for app engine folks stuck out here?
Please help!
I'm noticing this exact problem as well. I'm making only a few
requests per hour. I have tried setting the user-agent but it did not
help.
Akshar
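For completeness, setting a custom User-Agent (which, as reported above, does not help with IP-based limiting) looks like this with urllib; the agent string is made up and the URL is the Search API endpoint of the era:

```python
import urllib.request

# Hypothetical app name in the User-Agent header.
req = urllib.request.Request(
    "http://search.twitter.com/search.json?q=appengine",
    headers={"User-Agent": "MyGaeSearchApp/1.0"},
)
# The Request object now carries the header; nothing is sent here.
print(req.get_header("User-agent"))  # MyGaeSearchApp/1.0
```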
On Oct 6, 9:50 am, Chad Etzel wrote:
I have solved a problem like that:
while I receive a 503 error, my application just keeps retrying the
query against Twitter.
Everything works ;)
Twitter should really in this case either whitelist all GAE IPs (I'm
sure an email to Google could get all the IPs they use) or allow charging
API requests to an authenticated account rather than to an IP (much like
the REST API does). This way each GAE application would just set up a
twitter account an
I am also facing this issue. I'm only making a couple of requests
from GAE (about 3-4) and none of them are getting through. I keep
getting the following from Twitter4J:
Twitter Exception while retrieving status
twitter4j.TwitterException: 400:The request was invalid. An
accompanying erro
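Whatever the client library, the practical question is which of these status codes are worth retrying. A sketch of that decision, using the codes mentioned in this thread (503 for search rate limiting, 400/401/403 as hard client errors):

```python
# Codes seen in this thread: 503 (search rate limit), 400 (invalid /
# REST rate limit), 401 (auth), 403 (refused).
RETRYABLE = {503}
CLIENT_ERRORS = {400, 401, 403}

def should_retry(status):
    """Decide whether a failed search request is worth retrying."""
    if status in RETRYABLE:
        return True
    if status in CLIENT_ERRORS:
        return False  # retrying won't help; inspect the request instead
    return 500 <= status < 600  # other server errors: retry cautiously

print(should_retry(503), should_retry(400))  # True False
```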
Hi Chad,
I am sorry, but that doesn't help in the slightest.
You are essentially saying that we shouldn't develop on App
Engine, since we would now also have to buy a proxy, which is completely
infeasible and defeats the purpose of why people are using App
Engine.
I understand that th
Hi All,
GAE sites are problematic for the Twitter/Search API because the IPs
making outgoing requests are fluid and so cannot easily be
allowed for access. Also, since most IPs are shared, other
applications making requests from the same IPs mean that fewer requests
per app get through.
One w
Same here; my app runs on Google App Engine and 40% of the requests to
the Twitter Search API get the 503 error message indicating rate
limiting.
Is there anything we as app authors can do on our side to alleviate
the problem?
/Martin
On Oct 5, 1:53 pm, Paul Kinlan wrote:
I am pretty sure there are custom headers on the App Engine that indicate
the application that is sending the request.
2009/10/5 elkelk
I'm noticing this problem as well. I'm making only a couple of requests
per hour. I have tried setting the user-agent and the HTTP_REFERER
headers to a custom name, but Twitter doesn't seem to care.
On Oct 5, 2:59 am, steel wrote:
Hi all,
I am having the same issue. I have tried setting a custom user-agent,
but this doesn't seem to affect the fact that Twitter is limiting
based on IP address. I'm only making about 5 searches an hour and
80% of them are failing on App Engine due to a 503 rate limit.
Twitter needs to det
Hi. I have this problem too.
My application makes two requests per hour and it gets "rate limit"
errors. What is wrong? I think it is Twitter's problem.
On 1 Oct, 01:45, Paul Kinlan wrote:
Hi Guys,
I have an app on App Engine using the search API and it is getting
heavily rate limited again this past couple of days.
I know that we are on a shared set of IP addresses and someone else could be
hammering the system, but it seems to run for weeks without seeing the rate
limit being
Dewald,
I'm not on the search team, but there are a lot of discussions over
there this morning about search api rate limits and related issues.
Search rate limiting issues (vs. www.twitter.com or api.twitter.com)
probably boil down to one of three categories:
1) Search service interruptions - We
Various APIs have their own rate limiting mechanisms. The www, search
and streaming rate limits are all customized to their usage patterns
and share little to no code and/or state.
-John
On Sep 4, 9:49 am, Reivax wrote:
John, the original message of this thread is about the rate limit being
totally erratic, as several users have noticed. Here are the details of
what I'm seeing:
http://groups.google.com/group/twitter-development-talk/browse_thread/thread/40c82b4dbc0536bd
Here is another user reporting the problem:
ht
The Search team is working on indexing latency and throughput, along
with many other things. There have been big improvements recently
and more are on the way.
In the mean time, if you need closer to real-time results, consider
the track parameter on the Streaming API.
-John Kalucki
http://twi
The Search API would rock if only it were reliable.
What we see looks to be some sort of a funky cache: a query (atom)
can be missing some of the latest tweets, and then after a while they show
up; if you tweak the query you can see 'em.
Have you ever seen this problem?
Also, what did you do special with user
I have exchanged emails with Twitter on this and I believe they are
working on it.
We use search extensively at www.Twaller.com. The errors in search
that we are seeing are as follows:
(1) HTTP status code: 403
Message:The request is understood, but it has been refused. An
accompanying erro
Twitter team, can you please do something about the performance and
rate limiting of the Search API?
It is becoming completely unworkable. I have jumped through all the
hoops, with unique User-Agents, sleeping my scripts in-between API
calls, and yet the rate limiting is just becoming more severe
Hi all,
It turns out that after all of this the 503 was not the root
cause of this issue. I found a way around the proxy errors and will
keep the response code as is.
Thanks;
— Matt Sanford
On Dec 8, 2008, at 11:29 AM, Kazuho Okui wrote:
I think using 400 would be much easier to handle than 401,
because I can use the same HTTP client code and the same error-handling
code for both the Search API and the REST API. In my case, I wrote an
error handler which pops up a dialog whenever it gets a 401, because the
Search API wouldn't return 401.
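The point about sharing one error handler across both APIs amounts to dispatching on the status code rather than on which API produced it. A sketch, using only the codes discussed in this thread:

```python
def classify(status):
    """One classifier shared by REST and Search responses; the codes
    are the ones under discussion in this thread."""
    if status == 401:
        return "auth"          # credentials problem
    if status in (400, 503):
        return "rate-limited"  # REST used 400, search used 503
    if 200 <= status < 300:
        return "ok"
    return "other"

print([classify(s) for s in (200, 400, 401, 503)])
# ['ok', 'rate-limited', 'auth', 'rate-limited']
```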
You could compromise and do a 400.5 O_o
On Mon, Dec 8, 2008 at 11:51, Matt Sanford <[EMAIL PROTECTED]> wrote:
Of course right after sending a lengthy public email I see something
that could let us keep 503 and fix the proxy errors. I'm working with
operations on that, and if it does not pan out I'll confer with Alex
on 400 versus 401. Stay tuned.
— Matt
On Dec 8, 2008, at 09:46 AM, Alex Payne wrote:
We use 400 for rate limiting on the REST API. Matt and I are
discussing whether or not this might be the correct response.
Thoughts?
On Mon, Dec 8, 2008 at 09:17, Cameron Kaiser <[EMAIL PROTECTED]> wrote:
> The error code for search rate limiting will be changing from HTTP
> 503 to HTTP 401 in the very near future (today or tomorrow). For
> details, continue reading.
Are you sure you want to use 401 for this? 401 would indicate authorization
required. If you're asking for credentials, that wou
Matt is the Search API guru, indeed.
On Mon, Dec 8, 2008 at 08:16, Chad Etzel <[EMAIL PROTECTED]> wrote:
Hi all,
The error code for search rate limiting will be changing from HTTP
503 to HTTP 401 in the very near future (today or tomorrow). For
details, continue reading.
Why the 401 change?
The search API rate limit is something that nobody should be
hitting in an ideal world. Last week we
> The Terms say: "We do not rate limit the search API under ordinary
> circumstances, however we have put measures in place to limit the
> abuse of our API."
...yes, which is exactly why I am asking the question in the first place.
My code already handles the error case so no browser warnings are
Hi Chad,
I'll check the logs for TweetGrid and see what's going on. I'll
send you an email once I have some information and we can work it out.
Thanks;
— Matt Sanford
On Dec 6, 2008, at 03:23 PM, Chad Etzel wrote:
Ah, gotcha! You can, it will just display a browser warning. Which is
not what you want :P
The Terms say: "We do not rate limit the search API under ordinary
circumstances, however we have put measures in place to limit the
abuse of our API."
Try emailing Alex Payne, or someone at Twitter, about
No, you can't do an ajax authenticated GET or POST to a 3rd-party site. I
am dynamically loading the json in the clients' browser. I would rather
know the rate limits so I can abide by them.
-Chad
On Sun, Dec 7, 2008 at 10:42 AM, fastest963 <[EMAIL PROTECTED]> wrote:
Since you're doing this via AJAX and such, this may not be a good idea,
but you could try passing a login to Twitter and having that login
whitelisted?
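Short of knowing the exact limits, a client can at least space out its own calls. A minimal self-throttle sketch (the interval value is arbitrary):

```python
import time

class Throttle:
    """Enforce a minimum interval between calls."""
    def __init__(self, min_interval):
        self.min_interval = min_interval
        self.last = 0.0

    def wait(self):
        now = time.monotonic()
        delay = self.min_interval - (now - self.last)
        if delay > 0:
            time.sleep(delay)
        self.last = time.monotonic()

t = Throttle(0.05)
start = time.monotonic()
for _ in range(3):
    t.wait()        # first call passes immediately, next two are spaced
elapsed = time.monotonic() - start
print(elapsed >= 0.1)  # True: at least two enforced gaps
```

Calling `t.wait()` before every search request keeps the client under a self-imposed rate regardless of what the server enforces.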
Hi Matt,
I am noticing that I am getting rate-limited by the Search API more and
more frequently. I just got limited with a "Retry-After" value of 800 (or
about 13 minutes). I'm not sure how much more my calm can be enhanced in a
13-minute period, but this does not bode well for my search apps such
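A Retry-After of 800 can at least be honored programmatically. A sketch that prefers the server's header value and falls back to exponential backoff when the header is absent (the cap is an arbitrary choice):

```python
def backoff_seconds(retry_after, attempt, cap=900):
    """Prefer the server's Retry-After header; otherwise back off
    exponentially with the attempt number."""
    if retry_after is not None:
        return min(int(retry_after), cap)
    return min(2 ** attempt, cap)

print(backoff_seconds("800", attempt=0))  # 800
print(backoff_seconds(None, attempt=4))   # 16
```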