[twitter-dev] Re: Twitter Apps
Your social app should probably consume the /follow resource in the Streaming API. Once you provide a list of all the user ids on your service, you'll receive all public posts from those users with very low latency. This is the basic integration strategy for peering Twitter with social networks. -John Kalucki, Services, Twitter Inc.
On Jun 7, 12:02 am, umaydin devnetw...@gmail.com wrote: Hello, I have a social network and I want my members to share their Twitter updates on their profile pages, the same as on Facebook, FriendFeed, etc. My updates show up within 5 seconds on FriendFeed. Is there any trigger for applications, or does FriendFeed ping the server for each user? I use the Sign in with Twitter method and I have an application to do that. Thank you for your assistance.
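For readers landing on this thread later, a minimal Python sketch of the pattern John describes follows. The comma-separated `follow` form parameter and the line-delimited JSON framing are assumptions based on the Streaming API of that era; treat this as illustrative, not canonical.

```python
import json

def follow_body(user_ids):
    # The /follow resource takes a comma-separated list of numeric
    # user ids as a form parameter, POSTed once when connecting.
    return "follow=" + ",".join(str(uid) for uid in user_ids)

def parse_stream_line(line):
    # Each non-blank line of the stream is one JSON-encoded status;
    # blank lines are keep-alives and yield None.
    line = line.strip()
    if not line:
        return None
    return json.loads(line)
```

The client POSTs `follow_body(...)` to the stream endpoint with basic auth, then reads statuses for those users line by line as they are posted.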
[twitter-dev] Re: How does the trends method in search API work?
What do you want to know? The JSON response will give you the latest 10 trends and the timing format; you can parse it with jQuery. The API doc explains it well. Which API are you working on?
On Jun 8, 10:18 am, zvn zvn750...@gmail.com wrote: Hi, I'm a master's student in Taiwan. I want to do some work to improve the trends API at http://apiwiki.twitter.com/Twitter-Search-API-Method%3A-trends. But I don't know how this API works, so it's hard for me to make a comparison between my method and the API. Would anybody please tell me how to solve this problem?
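To make the reply concrete: the trends method returns a JSON object with a `trends` array (a name and search URL per trend) and an `as_of` timestamp. A small Python sketch of pulling the names out; the sample payload below is made up to match that shape, not a real API response:

```python
import json

def trend_names(payload, n=10):
    # The trends response is a JSON object with a "trends" list and
    # an "as_of" timestamp; return the names of the top n trends.
    data = json.loads(payload)
    return [t["name"] for t in data["trends"][:n]]

# Hypothetical sample response, shaped like the real one.
sample = ('{"trends": [{"name": "#wwdc", '
          '"url": "http://search.twitter.com/search?q=%23wwdc"}], '
          '"as_of": "Mon, 08 Jun 2009 17:00:00 +0000"}')
```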
[twitter-dev] Re: JSON parsing code not working (urgent)
$.ajax('http://twitter.com/friendships/exists.json?user_a='+query1+'user_b='+query2, function(data, textdata) { var public_tweets = JSON.parse(data); if(public_tweets.text == 'true') { var newDiv = '<p>true</p>'; } $('#content').append(newDiv); }, 'text'); even this doesn't seem to work...
On Jun 6, 12:33 pm, grand_unifier jijodasgu...@gmail.com wrote: <!-- this is the javascript json parser function --> <script type="text/javascript" src="../jquery-1.2.6.min.js"></script> <script type="text/javascript"> $(document).ready(function(){ $('form#search').bind('submit', function(e){ e.preventDefault(); $('#content').html(''); var query1 = urlencode($('input[name=user_a]').val()); //userA var query2 = urlencode($('input[name=user_b]').val()); //userB var rpp = 20; //number of tweets to retrieve $.getJSON('http://twitter.com/friendships/exists.json?user_a='+query1+'user_b='+query2, function(data){ if(data == 'true'){ var newDiv = '<p>true</p>'; } $('#content').append(newDiv); }); }); }); function urlencode(str) { return escape(str).replace(/\+/g,'%2B').replace(/%20/g,'+').replace(/\*/g,'%2A').replace(/\//g,'%2F').replace(/@/g,'%40'); } </script> <!-- javascript ends here --> This is very basic code to check if two Twitter users are friends or not; the JSON will return true or false, but it doesn't seem to work... it's frustrating. Anyone got any idea why???
[twitter-dev] JSON parsing problem (urgent)
<!-- this is the javascript json parser function --> <script type="text/javascript" src="../jquery-1.2.6.min.js"></script> <script type="text/javascript"> $(document).ready(function() { $('form#search').bind('submit', function(e) { e.preventDefault(); $('#content').html(''); var query1 = urlencode($('input[name=user_a]').val()); //userA var query2 = urlencode($('input[name=user_b]').val()); //userB $.ajax('http://twitter.com/friendships/exists.json?user_a='+query1+'user_b='+query2, function(data, textdata) { if(data == 'false') { var newDiv = '<p>true</p>'; } $('#content').append(newDiv); }, 'text'); }); }); function urlencode(str) { return escape(str).replace(/\+/g,'%2B').replace(/%20/g,'+').replace(/\*/g,'%2A').replace(/\//g,'%2F').replace(/@/g,'%40'); } </script> <!-- javascript ends here --> Doesn't seem to work... any thoughts?
[twitter-dev] Re: OAuth and 3rd party apps
Hi Peter. I've examined the same issue and as far as I can see, there is no way to do this. Further complicating the matter in your specific scenario is the fact that Twitpic does not use OAuth itself. :( If somebody knows of a way this could work, please correct me. -jonathan http://twitcaps.com
On Jun 6, 2:00 pm, Petermdenton petermden...@gmail.com wrote: Hello, wondering if the following scenario is possible with OAuth: - I have mytwittersite.com - A user authenticates with Twitter through OAuth on my site. - The user posts a pic to Twitpic from my site using the OAuth tokens. With basic auth this is very easy to do, but it doesn't seem like I can do a single sign-on that can execute commands against other third-party apps.
[twitter-dev] Change the link of the source [My App]
Hi, I don't know what I'm doing wrong. I have added an application with an OAuth interface to Twitter, and everything is working fine. But now I'm trying to change the link of the "from [My App]" source attribution, without success. I've tried changing both fields on the application edit page (Application Website, and the Website itself) but nothing changed on newly posted tweets. Can anybody help me with this? Thanks, Jony
[twitter-dev] Re: Change the link of the source [My App]
Jony, Is this for an OAuth application or a basic auth source parameter? Thanks, Doug
On Mon, Jun 8, 2009 at 8:45 AM, jonylt jon...@gmail.com wrote: Hi, I don't know what I'm doing wrong. I have added an application with an OAuth interface to Twitter, and everything is working fine. But now I'm trying to change the link of the "from [My App]" source attribution, without success. I've tried changing both fields on the application edit page (Application Website, and the Website itself) but nothing changed on newly posted tweets. Can anybody help me with this? Thanks, Jony
[twitter-dev] Strange API behavior as of this morning
We noticed a direct message being published on the main timeline, and when deleting the post another post was deleted as well. Have there been any other reports of this kind of behavior today?
[twitter-dev] Re: Strange API behavior as of this morning
Ben, This is the first report of such behavior. Can you elaborate on what you are seeing? Can you reproduce this? Thanks, Doug On Mon, Jun 8, 2009 at 12:30 PM, benjackson bhjack...@gmail.com wrote: We noticed a direct message being published on the main timeline, and when deleting the post another post was deleted as well. Have there been any other reports of this kind of behavior today?
[twitter-dev] Streaming API + PHP and Python
Hi All, I am stumped. For several days I have tried to write a simple PHP script that can interact with the Streaming API by just pulling in the stream and echoing out the contents. This is the script I have: http://pastie.org/private/9owdxerouwhitz5nfacrw Right now it just pulls in the feed and echoes it. I am not parsing anything at the moment. This works great for a while, then the fread will start timing out every 60 seconds (I have set the stream_timeout to 60). It will do this after a nondeterministic number of updates or number of bytes received. netstat shows I am still connected to stream.twitter.com, but Wireshark shows that no new data is arriving. I have tried this on 3 different machines (2 behind the same NAT/firewall, and 1 remote server), all with the same results. I even scraped together a simple Python script which should do the same thing, here: http://pastie.org/private/k0p5286ljlhdyurlagnq Same results: works for a while, then it stops. Strangely, if I use curl or telnet to open a raw socket to /spritzer or /gardenhose it stays up forever. I had a telnet socket open on /spritzer all weekend with no disconnects... In the PHP script, if I add code to detect the time-outs and immediately disconnect the socket and reconnect, the updates start flowing in again... This is nice for error checking, but I'd really like to figure out a more robust solution. 1) Can anyone find anything wrong with the scripts I've posted? 2) Does anyone have an example PHP script they are using to connect to the Streaming API which stays up indefinitely? I would like to thank John K at Twitter for helping me debug thus far. Thanks, -Chad
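A reconnect-with-backoff loop is the usual defensive answer while the root cause is being chased down. A rough Python 3 sketch (the 60-second read timeout mirrors the thread; the endpoint URL, credentials, and everything else are assumptions, and the network call is illustrative only):

```python
import base64
import time
import urllib.request

def backoff_seconds(attempt, base=1.0, cap=60.0):
    # Exponential backoff between reconnects: 1s, 2s, 4s, ... capped.
    return min(cap, base * (2 ** attempt))

def read_stream(url, user, password, handle_line):
    # Read line-delimited statuses until a timeout or EOF, then
    # return so the caller can reconnect with a fresh socket.
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req, timeout=60) as resp:
        for raw in resp:
            handle_line(raw.decode("utf-8", "replace"))

def run_forever(url, user, password, handle_line):
    attempt = 0
    while True:
        try:
            read_stream(url, user, password, handle_line)
            attempt = 0          # clean EOF: reset the backoff
        except OSError:
            pass                 # read timeout or connection reset
        time.sleep(backoff_seconds(attempt))
        attempt += 1
```

This doesn't explain the stall Chad sees, but it keeps the stream flowing the same way his error-catching loop does, without hammering the server on repeated failures.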
[twitter-dev] Re: Streaming API + PHP and Python
A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the number of bytes waiting in the local TCP buffer. Baseless speculation: There's a limitation in the chunked transfer coding in the PHP client wherein it cannot support endless streams. Some resource is exhausted or administratively limited (php.ini), and the stream stops. -John Kalucki Services, Twitter Inc. On Jun 8, 1:16 pm, Chad Etzel jazzyc...@gmail.com wrote: Hi All, I am stumped. For several days I have tried to write a simple PHP script that can interact with the Streaming API by just pulling in the stream and echoing out the contents. This is the script I have: http://pastie.org/private/9owdxerouwhitz5nfacrw Right now it just pulls in the feed and echos it. I am not parsing anything at the moment. This works great for a while, then the fread will start timing out every 60 seconds (I have set the stream_timeout to 60). It will do this after an undeterministic number of updates or number of bytes received. netstat shows I am still connected to stream.twitter.com but Wireshark shows that no new data is arriving. I have tried this on 3 different machines (2 behind the same NAT/firewall, and 1 remote server) all with the same results. I even scraped together a simple python script which should do the same thing here: http://pastie.org/private/k0p5286ljlhdyurlagnq Same results works for a while, then it stops. Strangely, if I use CURL or telnet to open a raw socket to /spritzer or /gardenhose it stays up forever. I had a telnet socket open on /spritzer all weekend with no disconnects... In the PHP script, if I add code to detect the time-outs and immediately disconnect the socket and reconnect, the updates start flowing in again... This is nice for error checking, but I'd really like to figure out a more robust solution. 
1) Can anyone find anything wrong with the scripts I've posted? 2) Does anyone have an example PHP script they are using to connect to the Streaming API which stays up indefinitely? I would like to thank John K at Twitter for helping me debug thus far. Thanks, -Chad
[twitter-dev] Re: Streaming API + PHP and Python
I thought those things, too... but the following things made me think otherwise: a) The stream stops after a different number of updates/bytes each time, and will happily go on forever if I put an error-catching loop in the script. b) The same thing is happening in the python script. c) Curl/telnet works fine, so it's not a system resource depletion issue ...still confused, -Chad On Mon, Jun 8, 2009 at 4:31 PM, John Kalucki jkalu...@gmail.com wrote: A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the number of bytes waiting in the local TCP buffer. Baseless speculation: There's a limitation in the chunked transfer coding in the PHP client wherein it cannot support endless streams. Some resource is exhausted or administratively limited (php.ini), and the stream stops. -John Kalucki Services, Twitter Inc. On Jun 8, 1:16 pm, Chad Etzel jazzyc...@gmail.com wrote: Hi All, I am stumped. For several days I have tried to write a simple PHP script that can interact with the Streaming API by just pulling in the stream and echoing out the contents. This is the script I have: http://pastie.org/private/9owdxerouwhitz5nfacrw Right now it just pulls in the feed and echos it. I am not parsing anything at the moment. This works great for a while, then the fread will start timing out every 60 seconds (I have set the stream_timeout to 60). It will do this after an undeterministic number of updates or number of bytes received. netstat shows I am still connected to stream.twitter.com but Wireshark shows that no new data is arriving. I have tried this on 3 different machines (2 behind the same NAT/firewall, and 1 remote server) all with the same results. I even scraped together a simple python script which should do the same thing here: http://pastie.org/private/k0p5286ljlhdyurlagnq Same results works for a while, then it stops. 
Strangely, if I use CURL or telnet to open a raw socket to /spritzer or /gardenhose it stays up forever. I had a telnet socket open on /spritzer all weekend with no disconnects... In the PHP script, if I add code to detect the time-outs and immediately disconnect the socket and reconnect, the updates start flowing in again... This is nice for error checking, but I'd really like to figure out a more robust solution. 1) Can anyone find anything wrong with the scripts I've posted? 2) Does anyone have an example PHP script they are using to connect to the Streaming API which stays up indefinitely? I would like to thank John K at Twitter for helping me debug thus far. Thanks, -Chad
[twitter-dev] Re: Change the link of the source [My App]
Don't know about Jony, but I haven't been able to figure out how to update my app's basic auth source param. Is there a way to do it? I'm trying to update twirssi to point to http://twirssi.com. I basically gave up on it; it seems the only thing I can do is register a new source? Dan (@zigdon)
On Mon, Jun 8, 2009 at 09:42, Doug Williams d...@twitter.com wrote: Jony, Is this for an OAuth application or a basic auth source parameter? Thanks, Doug On Mon, Jun 8, 2009 at 8:45 AM, jonylt jon...@gmail.com wrote: Hi, I don't know what I'm doing wrong. I have added an application with an OAuth interface to Twitter, and everything is working fine. But now I'm trying to change the link of the "from [My App]" source attribution, without success. I've tried changing both fields on the application edit page (Application Website, and the Website itself) but nothing changed on newly posted tweets. Can anybody help me with this? Thanks, Jony -- Dan Boger
[twitter-dev] Twitter conference
Hi everybody, Is there any planned Twitter meetup in the near future? Is there any official schedule available online? I heard about #140conf, but I was imagining something more developer-oriented: something to discuss possible improvements, add-ons, or similar things about Twitter and Twitter apps. I was actually imagining a Web or phone conference between Twitter admins, developers, and perhaps some interesting users. In both cases, the conference could be recorded as a podcast and redistributed if necessary. Mixing telephone and Web conferencing for users who are not able to join the Web meeting is possible. For the Web meeting, it would be possible to support 2 or 3 concurrent video streams, but the number of participants would be unlimited. Regarding the telephone conferencing, it could be possible to integrate Skype, VoIP, regular phone lines, and a callback function for users who can't make international phone calls. What do you think? Emrah
[twitter-dev] HTTP basic authentication from AS3/Flex
Has anyone been able to use HTTP basic authentication from AS3/Flex (not AIR)? It seems there's a limitation that the auth headers cannot be added to GET requests, and we then get the browser login pop-up. Direct in-URL authentication isn't supported by IE. Ideas?
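For reference, the Basic credential the header carries is just base64 of "user:password", so if a platform lets you attach an arbitrary header value you can build it by hand. Sketched in Python for clarity (the encoding is identical in AS3):

```python
import base64

def basic_auth_header(user, password):
    # RFC 2617 Basic credentials: "Basic " + base64("user:password").
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return "Authorization: Basic " + token
```

With RFC 2617's own example credentials (Aladdin / open sesame) this produces the well-known "QWxhZGRpbjpvcGVuIHNlc2FtZQ==" token.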
[twitter-dev] Re: Streaming API + PHP and Python
Hi Chad, We too have noticed the same behavior in PHP. Initially I wrote something very similar to your example, and noticed that I'd get a random time's worth of data before it disconnected. Then I rewrote it, which you can see at the below URL (modified to remove irrelevant code to this discussion), but I am still seeing similar results. Now it goes for 2-3 days, and then stops getting data. I can see that the script is still running via ps on the command line, and I can still see data going through the server, just PHP doesn't process it anymore. http://pastie.org/505012 I'd love to find out what is causing it. I do have a couple of theories specific to my code that I am trying - the only thing that sucks is that it is random, so the tests take a few minutes or days, depending on when it feels like dying. Let me know if this code works or helps you in any way. Feel free to bounce any ideas off of me, maybe we can come up with a stable solution. -Joel On Jun 8, 1:36 pm, Chad Etzel jazzyc...@gmail.com wrote: I thought those things, too... but the following things made me think otherwise: a) The stream stops after a different number of updates/bytes each time, and will happily go on forever if I put an error-catching loop in the script. b) The same thing is happening in the python script. c) Curl/telnet works fine, so it's not a system resource depletion issue ...still confused, -Chad On Mon, Jun 8, 2009 at 4:31 PM, John Kalucki jkalu...@gmail.com wrote: A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the number of bytes waiting in the local TCP buffer. Baseless speculation: There's a limitation in the chunked transfer coding in the PHP client wherein it cannot support endless streams. Some resource is exhausted or administratively limited (php.ini), and the stream stops. -John Kalucki Services, Twitter Inc. 
On Jun 8, 1:16 pm, Chad Etzel jazzyc...@gmail.com wrote: Hi All, I am stumped. For several days I have tried to write a simple PHP script that can interact with the Streaming API by just pulling in the stream and echoing out the contents. This is the script I have: http://pastie.org/private/9owdxerouwhitz5nfacrw Right now it just pulls in the feed and echos it. I am not parsing anything at the moment. This works great for a while, then the fread will start timing out every 60 seconds (I have set the stream_timeout to 60). It will do this after an undeterministic number of updates or number of bytes received. netstat shows I am still connected to stream.twitter.com but Wireshark shows that no new data is arriving. I have tried this on 3 different machines (2 behind the same NAT/firewall, and 1 remote server) all with the same results. I even scraped together a simple python script which should do the same thing here: http://pastie.org/private/k0p5286ljlhdyurlagnq Same results works for a while, then it stops. Strangely, if I use CURL or telnet to open a raw socket to /spritzer or /gardenhose it stays up forever. I had a telnet socket open on /spritzer all weekend with no disconnects... In the PHP script, if I add code to detect the time-outs and immediately disconnect the socket and reconnect, the updates start flowing in again... This is nice for error checking, but I'd really like to figure out a more robust solution. 1) Can anyone find anything wrong with the scripts I've posted? 2) Does anyone have an example PHP script they are using to connect to the Streaming API which stays up indefinitely? I would like to thank John K at Twitter for helping me debug thus far. Thanks, -Chad
[twitter-dev] Re: Streaming API + PHP and Python
Well, glad I'm not the only one :) But still a bummer it's happening... Another strange thing is that this does *not* seem to happen with the /follow streams. I have a PHP script running (same source, just requesting /follow instead of /spritzer) that has been connected for over 2 days. Of course, it may die at any moment, I'm not sure. One big difference is that the throughput for that stream is much, much less than the /hose streams, and I'm wondering if the sheer volume of bytes being pushed has something to do with it? That would be quite sad. I have PHP scripts acting as Jabber/XMPP clients that use similar fsockopen/fread/fgets/fwrite mechanisms and have been up for months at a time, so I know those socket connections *can* stay up a long, long time in theory. -Chad
On Mon, Jun 8, 2009 at 5:00 PM, jstrellner j...@twitturly.com wrote: Hi Chad, We too have noticed the same behavior in PHP. Initially I wrote something very similar to your example, and noticed that I'd get a random time's worth of data before it disconnected. Then I rewrote it, which you can see at the below URL (modified to remove irrelevant code to this discussion), but I am still seeing similar results. Now it goes for 2-3 days, and then stops getting data. I can see that the script is still running via ps on the command line, and I can still see data going through the server, just PHP doesn't process it anymore. http://pastie.org/505012 I'd love to find out what is causing it. I do have a couple of theories specific to my code that I am trying - the only thing that sucks is that it is random, so the tests take a few minutes or days, depending on when it feels like dying. Let me know if this code works or helps you in any way. Feel free to bounce any ideas off of me, maybe we can come up with a stable solution. -Joel On Jun 8, 1:36 pm, Chad Etzel jazzyc...@gmail.com wrote: I thought those things, too... 
but the following things made me think otherwise: a) The stream stops after a different number of updates/bytes each time, and will happily go on forever if I put an error-catching loop in the script. b) The same thing is happening in the python script. c) Curl/telnet works fine, so it's not a system resource depletion issue ...still confused, -Chad On Mon, Jun 8, 2009 at 4:31 PM, John Kalucki jkalu...@gmail.com wrote: A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the number of bytes waiting in the local TCP buffer. Baseless speculation: There's a limitation in the chunked transfer coding in the PHP client wherein it cannot support endless streams. Some resource is exhausted or administratively limited (php.ini), and the stream stops. -John Kalucki Services, Twitter Inc. On Jun 8, 1:16 pm, Chad Etzel jazzyc...@gmail.com wrote: Hi All, I am stumped. For several days I have tried to write a simple PHP script that can interact with the Streaming API by just pulling in the stream and echoing out the contents. This is the script I have: http://pastie.org/private/9owdxerouwhitz5nfacrw Right now it just pulls in the feed and echos it. I am not parsing anything at the moment. This works great for a while, then the fread will start timing out every 60 seconds (I have set the stream_timeout to 60). It will do this after an undeterministic number of updates or number of bytes received. netstat shows I am still connected to stream.twitter.com but Wireshark shows that no new data is arriving. I have tried this on 3 different machines (2 behind the same NAT/firewall, and 1 remote server) all with the same results. I even scraped together a simple python script which should do the same thing here: http://pastie.org/private/k0p5286ljlhdyurlagnq Same results works for a while, then it stops. 
Strangely, if I use CURL or telnet to open a raw socket to /spritzer or /gardenhose it stays up forever. I had a telnet socket open on /spritzer all weekend with no disconnects... In the PHP script, if I add code to detect the time-outs and immediately disconnect the socket and reconnect, the updates start flowing in again... This is nice for error checking, but I'd really like to figure out a more robust solution. 1) Can anyone find anything wrong with the scripts I've posted? 2) Does anyone have an example PHP script they are using to connect to the Streaming API which stays up indefinitely? I would like to thank John K at Twitter for helping me debug thus far. Thanks, -Chad
[twitter-dev] Re: WWDC Twitter developer meetup at Twitter HQ: RSVP!
I'm coming, too. Could someone provide the exact address for the meetup? Thanks, -aj
On Sun, Jun 7, 2009 at 10:35 AM, Mark Paine markpa...@gmail.com wrote: I'm in. -Mark On May 21, 2:18 pm, Alex Payne a...@twitter.com wrote: Hi all, There's great crossover between Twitter API developers and Mac/iPhone developers. Andrew Stone, developer of Twittelator Pro, suggested that we all get together during WWDC and coordinate around the Apple Push Notification Service and other issues of mutual interest. Twitter's offices are just a few blocks from Moscone, so it should be easy for any interested coders to make it over here. Please RSVP with a reply to this thread and let us know what dates and times work for you. Andrew was thinking early one morning, but not being much of a morning person, I'd prefer something later in the day. We'll let group consensus decide. Thanks, and hope to see you in early June. -- Alex Payne - API Lead, Twitter, Inc. http://twitter.com/al3x -- AJ Chen, PhD Co-Chair, Semantic Web SIG, sdforum.org Technical Architect, healthline.com http://web2express.org Palo Alto, CA
[twitter-dev] Re: Change the link of the source [My App]
If I'm reading the list correctly, the source param has been removed from basic auth. If you want to use a source param, you have to switch to OAuth. Might be misreading things though. On Mon, Jun 8, 2009 at 12:14, Dan Boger zig...@gmail.com wrote: Don't know about Jony, but I haven't been able to figure out how to update my app's basic auth source param. Is there a way to do it? Trying to update twirssi to point to http://twirssi.com. Basically gave up on it, seems the only thing I can do is register a new source? Dan (@zigdon) On Mon, Jun 8, 2009 at 09:42, Doug Williams d...@twitter.com wrote: Jony,Is this for an OAuth application or a basic auth source parameter? Thanks, Doug On Mon, Jun 8, 2009 at 8:45 AM, jonylt jon...@gmail.com wrote: Hi, I don't know what i'm making wrong. I have added an application with oauth interface to twitter, everything is working fine. But now i'm trying to change the link of the from [My App] without success. I've tried changing both fields in the application edition (Application Website, and Website itself) but nothing changed on the new posted tweets. Anybody can help me on this? Thanks, Jony -- Dan Boger -- Internets. Serious business.
[twitter-dev] Re: Change the link of the source [My App]
Yep, I've tried updating the app website, and the website itself; nothing happens. I tried it a few days ago (before the source problem at Twitter) and after that too, but it still seems to point to the old URL, and I don't know which one it's referring to. Thanks, Jony
On Mon, Jun 8, 2009 at 3:14 PM, Dan Boger zig...@gmail.com wrote: Don't know about Jony, but I haven't been able to figure out how to update my app's basic auth source param. Is there a way to do it? Trying to update twirssi to point to http://twirssi.com. Basically gave up on it, seems the only thing I can do is register a new source? Dan (@zigdon) On Mon, Jun 8, 2009 at 09:42, Doug Williams d...@twitter.com wrote: Jony, Is this for an OAuth application or a basic auth source parameter? Thanks, Doug On Mon, Jun 8, 2009 at 8:45 AM, jonylt jon...@gmail.com wrote: Hi, I don't know what I'm doing wrong. I have added an application with an OAuth interface to Twitter, and everything is working fine. But now I'm trying to change the link of the "from [My App]" source attribution, without success. I've tried changing both fields on the application edit page (Application Website, and the Website itself) but nothing changed on newly posted tweets. Can anybody help me with this? Thanks, Jony -- Dan Boger
[twitter-dev] Re: Change the link of the source [My App]
It is for an oauth application. On Mon, Jun 8, 2009 at 1:42 PM, Doug Williams d...@twitter.com wrote: Jony,Is this for an OAuth application or a basic auth source parameter? Thanks, Doug On Mon, Jun 8, 2009 at 8:45 AM, jonylt jon...@gmail.com wrote: Hi, I don't know what i'm making wrong. I have added an application with oauth interface to twitter, everything is working fine. But now i'm trying to change the link of the from [My App] without success. I've tried changing both fields in the application edition (Application Website, and Website itself) but nothing changed on the new posted tweets. Anybody can help me on this? Thanks, Jony
[twitter-dev] Re: Change the link of the source [My App]
Mine is with OAuth, and the source param is being displayed okay. My particular issue is that I need to change the link that the source points to, to a new one. But it is not refreshed (of course, in new tweets I post via my app). Thanks, Jony
On Mon, Jun 8, 2009 at 6:16 PM, JDG ghil...@gmail.com wrote: If I'm reading the list correctly, the source param has been removed from basic auth. If you want to use a source param, you have to switch to OAuth. Might be misreading things though. On Mon, Jun 8, 2009 at 12:14, Dan Boger zig...@gmail.com wrote: Don't know about Jony, but I haven't been able to figure out how to update my app's basic auth source param. Is there a way to do it? Trying to update twirssi to point to http://twirssi.com. Basically gave up on it, seems the only thing I can do is register a new source? Dan (@zigdon) On Mon, Jun 8, 2009 at 09:42, Doug Williams d...@twitter.com wrote: Jony, Is this for an OAuth application or a basic auth source parameter? Thanks, Doug On Mon, Jun 8, 2009 at 8:45 AM, jonylt jon...@gmail.com wrote: Hi, I don't know what I'm doing wrong. I have added an application with an OAuth interface to Twitter, and everything is working fine. But now I'm trying to change the link of the "from [My App]" source attribution, without success. I've tried changing both fields on the application edit page (Application Website, and the Website itself) but nothing changed on newly posted tweets. Can anybody help me with this? Thanks, Jony -- Dan Boger -- Internets. Serious business.
[twitter-dev] Re: WWDC Twitter developer meetup at Twitter HQ: RSVP!
The office address is available at: https://admin.twitter.com/about#contact Thanks, Doug On Mon, Jun 8, 2009 at 10:24 AM, AJ Chen cano...@gmail.com wrote: I'm coming, too. could someone provide the exact address for the meetup? thanks, -aj On Sun, Jun 7, 2009 at 10:35 AM, Mark Paine markpa...@gmail.com wrote: I'm in. -Mark On May 21, 2:18 pm, Alex Payne a...@twitter.com wrote: Hi all, There's great crossover between Twitter API developers and Mac/iPhone developers. Andrew Stone, developer of Twittelator Pro, suggested that we all get together during WWDC and coordinate around the Apple Push Notification Service and other issues of mutual interest. Twitter's offices are just a few blocks from Moscone, so it should be easy for any interested coders to make it over here. Please RSVP with a reply to this thread and let us know what dates and times work for you. Andrew was thinking early one morning, but not being much of a morning person, I'd prefer something later in the day. We'll let group consensus decide. Thanks, and hope to see you in early June. -- Alex Payne - API Lead, Twitter, Inc.http://twitter.com/al3x -- AJ Chen, PhD Co-Chair, Semantic Web SIG, sdforum.org Technical Architect, healthline.com http://web2express.org Palo Alto, CA
[twitter-dev] Re: WWDC Twitter developer meetup at Twitter HQ: RSVP!
Sorry for the repost. The correct link to the address: http://twitter.com/about#contact Thanks, Doug
On Mon, Jun 8, 2009 at 10:24 AM, AJ Chen cano...@gmail.com wrote: I'm coming, too. could someone provide the exact address for the meetup? thanks, -aj On Sun, Jun 7, 2009 at 10:35 AM, Mark Paine markpa...@gmail.com wrote: I'm in. -Mark On May 21, 2:18 pm, Alex Payne a...@twitter.com wrote: Hi all, There's great crossover between Twitter API developers and Mac/iPhone developers. Andrew Stone, developer of Twittelator Pro, suggested that we all get together during WWDC and coordinate around the Apple Push Notification Service and other issues of mutual interest. Twitter's offices are just a few blocks from Moscone, so it should be easy for any interested coders to make it over here. Please RSVP with a reply to this thread and let us know what dates and times work for you. Andrew was thinking early one morning, but not being much of a morning person, I'd prefer something later in the day. We'll let group consensus decide. Thanks, and hope to see you in early June. -- Alex Payne - API Lead, Twitter, Inc. http://twitter.com/al3x -- AJ Chen, PhD Co-Chair, Semantic Web SIG, sdforum.org Technical Architect, healthline.com http://web2express.org Palo Alto, CA
[twitter-dev] Re: Twitter conference
They had one the other week: http://twtrcon.com/ -David Fisher @tibbon On Jun 8, 2:26 pm, Emrah e...@ekanet.net wrote: Hi everybody, Is there any planned Twitter Meetup in the near future? Is there any official schedule available online? I heard about #140conf but I was imagining something more developers oriented. Something to discuss possible improvements, add-ons or similar stuffs about Twitter and Twitter apps. I was actually imagining planning a Web or Phone conference between Twitter admins, developers and perhaps some interesting users? In both cases, the conference could be recorded as a podcast to be redistributed if necessary. Mixing both telephone and Web conferencing for users who are not able to join the Web meeting is possible. About the Web meeting, it would be possible to support 2 or 3 concurrent video streams but the number of participants would be unlimited. Regarding the telephone conferencing, it could be possible to integrate Skype, VoIP, regular phone lines and a callback function for users who can't make international phone calls. What do you think? Emrah
[twitter-dev] Re: json parsing code not workin..urgent plzzz!!!!!
Are you running this from a browser? If so, perhaps a same-origin policy error? You'll want to google it. -- Ed Finkler http://funkatron.com Twitter:@funkatron AIM: funka7ron ICQ: 3922133 XMPP:funkat...@gmail.com On Jun 8, 8:01 am, grand_unifier jijodasgu...@gmail.com wrote: $.ajax('http://twitter.com/friendships/exists.json? user_a='+query1+'user_b='+query2, function(data,textdata) { var public_tweets = JSON.parse(data); if(public_tweets.text == 'true') { var newDiv = 'ptrue/p'; } $('#content').append(newDiv); }); },text); even this doesnt seem to work... On Jun 6, 12:33 pm, grand_unifier jijodasgu...@gmail.com wrote: !-- this is the javascript json parser function -- script type=text/javascript src=../jquery-1.2.6.min.js /script script type=text/javascript $(document).ready(function(){ $('form#search').bind(submit, function(e){ e.preventDefault(); $('#content').html(''); var query1 = urlencode($('input[name=user_a]').val ()); //userA var query2 = urlencode($('input [name=user_b]').val()); //userB var rpp = 20; //number of tweets to retrieve out $.getJSON('http://twitter.com/friendships/exists.json? user_a='+query1+'user_b='+query2, function(data){ if(data == 'true'){ var newDiv = 'ptrue/p';} $('#content').append(newDiv); }); }); }) function urlencode(str) { return escape(str).replace(/\+/g,'%2B').replace(/%20/g, '+').replace(/\*/g, '%2A').replace(/\//g, '%2F').replace(/@/g, '%40'); } }) /script !-- javascript ends here -- this is a very basic code to check if two twitter users are friends or not the json will return true or false in'it but it does seem to work...its frustratin. anyone got any idea why???
[twitter-dev] Re: Twitter conference
Hi Emrah, I run the Twitter Developer Nest. It's a meet up for Twitter Developers. We usually meet in London but this month we're meeting in New York too. Details here: http://newyorkdevnest.eventbrite.com/ So far we have folks from TweetDeck, TwitterFeed, ABC News's Nightline Twitter Web Show, AudioBoo and Animoto signed up. Would be great to see a few more of you along. We have an open bar sponsored by Sun and there are 8 Show Tweet speaking slots in which conversations about just about anything Twitter API related can be discussed. Hope you can make it, Jon. On Mon, Jun 8, 2009 at 7:26 PM, Emrahe...@ekanet.net wrote: Hi everybody, Is there any planned Twitter Meetup in the near future? Is there any official schedule available online? I heard about #140conf but I was imagining something more developers oriented. Something to discuss possible improvements, add-ons or similar stuffs about Twitter and Twitter apps. I was actually imagining planning a Web or Phone conference between Twitter admins, developers and perhaps some interesting users? In both cases, the conference could be recorded as a podcast to be redistributed if necessary. Mixing both telephone and Web conferencing for users who are not able to join the Web meeting is possible. About the Web meeting, it would be possible to support 2 or 3 concurrent video streams but the number of participants would be unlimited. Regarding the telephone conferencing, it could be possible to integrate Skype, VoIP, regular phone lines and a callback function for users who can't make international phone calls. What do you think? 
Emrah -- Jonathan Markwell Engineer | Founder | Connector Inuda Innovations Ltd, Brighton, UK Web application development support Twitter Facebook integration specialists http://inuda.com Organising the world's first events for the Twitter developer Community http://TwitterDeveloperNest.com Providing a nice little place to work in the middle of Brighton - http://theskiff.org Measuring your brand's visibility on the social web - http://HowSociable.com mob: 07766 021 485 | tel: 01273 704 549 | fax: 01273 376 953 skype: jlmarkwell | twitter: http://twitter.com/JonMarkwell
[twitter-dev] Re: Revoke/Destroy Access Method?
Sorry about the late reply... I mean if the user revokes access from my site, not knowing that he/ she should go through yours? Or should I just direct users to the connections page to revoke access directly from your site? On Jun 5, 8:22 am, Abraham Williams 4bra...@gmail.com wrote: Why would you need to destroy the access keys? They stop working once the user revokes access. I guess you could delete them from your database. On Fri, Jun 5, 2009 at 06:44, fastest963 fastest...@gmail.com wrote: So user deletes his/her twitter access from my site, as in she says I don't want to give this app access anymore so on my site she clicks remove twitter access/account. Is there a method I can call to destroy the access keys I received and automatically revoke access to my application? I know for Facebook, it is a rule, you MUST destroy the keys and disconnect the user if they cancel access, but I don't see this anywhere on Twitters documentation. Thanks, @fastest963 -- Abraham Williams |http://the.hackerconundrum.com Hacker |http://abrah.am|http://twitter.com/abraham Project |http://fireeagle.labs.poseurtech.com This email is: [ ] blogable [x] ask first [ ] private.
[twitter-dev] Re: Twitter conference
There is the Twitter Developer Nest in New York and London in the next couple of weeks: http://twitterdevelopernest.com/2009/06/ncy-ldn-twitter-developer-nests/ I went to the first two London ones and they were fun and interesting, they both included a QA session with Doug Williams over Skype too. That might be what you're looking for if you're in either area. Phil -- Phil Nash Twitter: http://twitter.com/philnash Find some music: http://yournextfavband.com Web development: http://www.unintentionallyblank.co.uk On Mon, Jun 8, 2009 at 7:26 PM, Emrah e...@ekanet.net wrote: Hi everybody, Is there any planned Twitter Meetup in the near future? Is there any official schedule available online? I heard about #140conf but I was imagining something more developers oriented. Something to discuss possible improvements, add-ons or similar stuffs about Twitter and Twitter apps. I was actually imagining planning a Web or Phone conference between Twitter admins, developers and perhaps some interesting users? In both cases, the conference could be recorded as a podcast to be redistributed if necessary. Mixing both telephone and Web conferencing for users who are not able to join the Web meeting is possible. About the Web meeting, it would be possible to support 2 or 3 concurrent video streams but the number of participants would be unlimited. Regarding the telephone conferencing, it could be possible to integrate Skype, VoIP, regular phone lines and a callback function for users who can't make international phone calls. What do you think? Emrah
[twitter-dev] Re: json parsing code not workin..urgent plzzz!!!!!
On Mon, Jun 8, 2009 at 6:01 PM, grand_unifier jijodasgu...@gmail.com wrote: $.ajax('http://twitter.com/friendships/exists.json? user_a='+query1+'user_b='+query2, function(data,textdata) { var public_tweets = JSON.parse(data); if(public_tweets.text == 'true') { var newDiv = 'ptrue/p'; } $('#content').append(newDiv); }); },text); This code will throw a cross-domain security exception. The URL http://twitter.com/friendships/exists.json can only be called via Ajax from twitter.com, not from any other domain. But you can use it this way: <script type="text/javascript" src="http://twitter.com/friendships/exists.json"></script> If this URL supports a callback parameter, specify a callback. It'll work almost like Ajax. -- A K M Mokaddim http://talk.cmyweb.net http://twitter.com/shiplu Stop Top Posting !! Writing in Bangla is much better than writing in Banglish. Sent from Dhaka, Bangladesh
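Beyond the same-origin restriction, note that the quoted snippet builds the query string with no '&' separator between user_a and user_b (the plain-text archive may have stripped it, but as written the request would be malformed even server-side). A minimal sketch of building that URL safely, e.g. from a server-side proxy script that sidesteps the browser's same-origin policy entirely (the helper name is illustrative, not part of the API):

```python
from urllib.parse import urlencode

# Endpoint discussed in the thread; it returns "true"/"false" for a pair of users.
EXISTS_ENDPOINT = "http://twitter.com/friendships/exists.json"

def build_exists_url(user_a, user_b):
    """Build the friendships/exists request URL with a proper '&'
    separator and percent-encoded parameter values."""
    return EXISTS_ENDPOINT + "?" + urlencode({"user_a": user_a, "user_b": user_b})
```

A small server-side script on your own domain can fetch this URL and relay the result to the page, so the browser's Ajax call never leaves your domain.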
[twitter-dev] How's SuperChirp managing to work around DM limits
The post limits make it nearly impossible to send messages to a large follower group. SuperChirp recently launched and I was wondering if they have any special arrangements with Twitter to do so. Just looking for some transparency on this. I'm spending a lot of time building out an application and something as trivial as special privileges could render all my time/effort useless.
[twitter-dev] Re: How's SuperChirp managing to work around DM limits
There is nothing in our system beyond normal whitelisting which increases direct messaging limits. Looking at the @SocialChirp account, they are still a relatively new application and have likely not run into the messaging limits you are thinking of. Thanks, Doug On Mon, Jun 8, 2009 at 4:37 PM, jmathai jmat...@gmail.com wrote: The post limits make it nearly impossible to send messages to a large follower group. SuperChirp recently launched and I was wondering if they have any special arrangements with Twitter to do so. Just looking for some transparency on this. I'm spending a lot of time building out an application and something as trivial as special privileges could render all my time/effort useless.
[twitter-dev] Re: How's SuperChirp managing to work around DM limits
Thanks Doug. However, your response did confuse my understanding of whitelisting (seems to be shared among others). My understanding was that whitelisting only applies to GET calls which does not include direct messaging. I thought POSTs were not affected by whitelisting. You replied to me earlier saying: Update limits are applied on a per-user basis regardless of whitelisting status. Context: http://groups.google.com/group/twitter-development-talk/msg/f1b4be8203f1?hl=en Not sure why I (and others) are having trouble understanding the limits. A similar question from someone else (not yet replied to): http://groups.google.com/group/twitter-development-talk/browse_thread/thread/b21a7a637006227a Confused :) On Jun 8, 4:49 pm, Doug Williams d...@twitter.com wrote: There is nothing in our system beyond normal whitelisting which increase direct messaging limits. Looking at the @SocialChirp account they are still a relatively new application and have likely not run into the messaging limits you are thinking of. Thanks, Doug On Mon, Jun 8, 2009 at 4:37 PM, jmathai jmat...@gmail.com wrote: The post limits make it nearly impossible to send messages to a large follower group. SuperChirp recently launched and I was wondering if they have any special arrangements with Twitter to do so. Just looking for some transparency on this. I'm spending a lot of time building out an application and something as trivial as special privileges could render all my time/effort useless.
[twitter-dev] Re: Revoke/Destroy Access Method?
I would just say delete the access tokens from your database and call it good. If they care that much, they can find the connections page on their own. On Mon, Jun 8, 2009 at 18:21, fastest963 fastest...@gmail.com wrote: Sorry about the late reply... I mean if the user revokes access from my site, not knowing that he/she should go through yours? Or should I just direct users to the connections page to revoke access directly from your site? On Jun 5, 8:22 am, Abraham Williams 4bra...@gmail.com wrote: Why would you need to destroy the access keys? They stop working once the user revokes access. I guess you could delete them from your database. On Fri, Jun 5, 2009 at 06:44, fastest963 fastest...@gmail.com wrote: So user deletes his/her twitter access from my site, as in she says I don't want to give this app access anymore so on my site she clicks remove twitter access/account. Is there a method I can call to destroy the access keys I received and automatically revoke access to my application? I know for Facebook, it is a rule, you MUST destroy the keys and disconnect the user if they cancel access, but I don't see this anywhere on Twitter's documentation. Thanks, @fastest963 -- Abraham Williams |http://the.hackerconundrum.com Hacker |http://abrah.am|http://twitter.com/abraham Project |http://fireeagle.labs.poseurtech.com This email is: [ ] blogable [x] ask first [ ] private. -- Abraham Williams | Community | http://web608.org Hacker | http://abrah.am | http://twitter.com/abraham Project | http://fireeagle.labs.poseurtech.com This email is: [ ] blogable [x] ask first [ ] private.
[twitter-dev] Re: Streaming API + PHP and Python
Here is some rough python code that I quickly wrote last weekend to handle the json spritzer feed: http://gist.github.com/126173 During the 3 or so days that I ran it, I didn't notice it die at any time... Jason Emerick The information transmitted (including attachments) is covered by the Electronic Communications Privacy Act, 18 U.S.C. 2510-2521, is intended only for the person(s) or entity/entities to which it is addressed and may contain confidential and/or privileged material. Any review, retransmission, dissemination or other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient(s) is prohibited. If you received this in error, please contact the sender and delete the material from any computer. On Mon, Jun 8, 2009 at 5:25 PM, Chad Etzel jazzyc...@gmail.com wrote: Well, glad I'm not the only one :) But still a bummer it's happening... Another strange thing is that his does *not* seem to happen with the /follow streams. I have a PHP script running (same source, just requesting /follow instead of /spritzer) that has been connected for over 2 days. Of course, it may die at any moment, I'm not sure.. One big difference is that the throughput for that stream is much much less than the /hose streams, and I'm wondering if the sheer volume of bytes being pushed has something to do with it? That would be quite sad. I have PHP scripts acting as Jabber/XMPP clients that use the similar fsockopen/fread/fgets/fwrite mechanisms that have been up for months at a time, so I know those socket connections *can* stay up a long long time in theory. -Chad On Mon, Jun 8, 2009 at 5:00 PM, jstrellner j...@twitturly.com wrote: Hi Chad, We too have noticed the same behavior in PHP. Initially I wrote something very similar to your example, and noticed that I'd get a random time's worth of data before it disconnected. 
Then I rewrote it, which you can see at the below URL (modified to remove irrelevant code to this discussion), but I am still seeing similar results. Now it goes for 2-3 days, and then stops getting data. I can see that the script is still running via ps on the command line, and I can still see data going through the server, just PHP doesn't process it anymore. http://pastie.org/505012 I'd love to find out what is causing it. I do have a couple of theories specific to my code that I am trying - the only thing that sucks is that it is random, so the tests take a few minutes or days, depending on when it feels like dying. Let me know if this code works or helps you in any way. Feel free to bounce any ideas off of me, maybe we can come up with a stable solution. -Joel On Jun 8, 1:36 pm, Chad Etzel jazzyc...@gmail.com wrote: I thought those things, too... but the following things made me think otherwise: a) The stream stops after a different number of updates/bytes each time, and will happily go on forever if I put an error-catching loop in the script. b) The same thing is happening in the python script. c) Curl/telnet works fine, so it's not a system resource depletion issue ...still confused, -Chad On Mon, Jun 8, 2009 at 4:31 PM, John Kalucki jkalu...@gmail.com wrote: A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the number of bytes waiting in the local TCP buffer. Baseless speculation: There's a limitation in the chunked transfer coding in the PHP client wherein it cannot support endless streams. Some resource is exhausted or administratively limited (php.ini), and the stream stops. -John Kalucki Services, Twitter Inc. On Jun 8, 1:16 pm, Chad Etzel jazzyc...@gmail.com wrote: Hi All, I am stumped. 
For several days I have tried to write a simple PHP script that can interact with the Streaming API by just pulling in the stream and echoing out the contents. This is the script I have: http://pastie.org/private/9owdxerouwhitz5nfacrw Right now it just pulls in the feed and echoes it. I am not parsing anything at the moment. This works great for a while, then the fread will start timing out every 60 seconds (I have set the stream_timeout to 60). It will do this after a nondeterministic number of updates or number of bytes received. netstat shows I am still connected to stream.twitter.com but Wireshark shows that no new data is arriving. I have tried this on 3 different machines (2 behind the same NAT/firewall, and 1 remote server) all with the same results. I even scraped together a simple python script which should do the same thing here: http://pastie.org/private/k0p5286ljlhdyurlagnq Same results: works for a while, then it stops. Strangely, if I use CURL or
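One way to make such a client easier to debug (a sketch under assumptions, not the code from the pasties above) is to separate the per-line JSON decoding from the socket handling, so the connection logic can be wrapped in a reconnect-with-backoff loop without touching the parser. The stream delivers one JSON object per line, with blank keep-alive newlines in between:

```python
import json

def parse_stream(lines):
    """Yield decoded status objects from an iterable of raw stream lines.

    Blank lines are the stream's keep-alive heartbeats and are skipped;
    everything else is assumed to be one JSON-encoded status per line.
    """
    for raw in lines:
        line = raw.strip()
        if not line:  # keep-alive newline
            continue
        yield json.loads(line)
```

Feeding this generator from a socket wrapped in a loop that reopens the connection (with an increasing sleep) when a read times out keeps the "stream went quiet" failure mode contained in one place.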
[twitter-dev] Re: How's SuperChirp managing to work around DM limits
Sorry for the confusion. It's completely understandable. I was referring to public tweets when I said update limits so I should have been more clear. Regarding direct messages, we recently augmented whitelisting to increase DM limits because the number of requests we were receiving. We're mum on the exact number for the same reason we do not give an explicit number for search rate limits as it encourages judicious use. If you have a specific use case where it would be beneficial to better understand these limits, please feel free to contact me off list. Also, if you have a great and innovative idea for using DMs where the limits may be a problem, please get in touch. I'd be interested to learn about some of the use-cases we may not think of as we discuss these policies internally. Thanks, Doug On Mon, Jun 8, 2009 at 5:14 PM, jmathai jmat...@gmail.com wrote: Thanks Doug. However, your response did confuse my understanding of whitelisting (seems to be shared among others). My understanding was that whitelisting only applies to GET calls which does not include direct messaging. I thought POSTs were not affected by whitelisting. You replied to me earlier saying: Update limits are applied on a per-user basis regardless of whitelisting status. Context: http://groups.google.com/group/twitter-development-talk/msg/f1b4be8203f1?hl=en Not sure why I (and others) are having trouble understanding the limits. A similar question from someone else (not yet replied to): http://groups.google.com/group/twitter-development-talk/browse_thread/thread/b21a7a637006227a Confused :) On Jun 8, 4:49 pm, Doug Williams d...@twitter.com wrote: There is nothing in our system beyond normal whitelisting which increase direct messaging limits. Looking at the @SocialChirp account they are still a relatively new application and have likely not run into the messaging limits you are thinking of. 
Thanks, Doug On Mon, Jun 8, 2009 at 4:37 PM, jmathai jmat...@gmail.com wrote: The post limits make it nearly impossible to send messages to a large follower group. SuperChirp recently launched and I was wondering if they have any special arrangements with Twitter to do so. Just looking for some transparency on this. I'm spending a lot of time building out an application and something as trivial as special privileges could render all my time/effort useless.
[twitter-dev] Re: Streaming API + PHP and Python
Hi Jason, Thanks! I've tried it out, and it seems that it doesn't like unicode characters? Here's the traceback I get: Exception in thread Thread-2: Traceback (most recent call last): File /usr/lib/python2.5/threading.py, line 486, in __bootstrap_inner self.run() File spritzer.py, line 31, in run print '%s -- %s' % (t['user']['screen_name'], t['text']) UnicodeEncodeError: 'ascii' codec can't encode characters in position 11-14: ordinal not in range(128) Exception in thread Thread-1: Traceback (most recent call last): File /usr/lib/python2.5/threading.py, line 486, in __bootstrap_inner self.run() File spritzer.py, line 31, in run print '%s -- %s' % (t['user']['screen_name'], t['text']) UnicodeEncodeError: 'ascii' codec can't encode character u'\u2625' in position 9: ordinal not in range(128) I'm not fluent in python, so I'm not sure of the unicode capabilities... but otherwise it looks like it's connecting and receiving data. -Chad On Mon, Jun 8, 2009 at 8:36 PM, Jason Emerick jemer...@gmail.com wrote: Here is some rough python code that I quickly wrote last weekend to handle the json spritzer feed: http://gist.github.com/126173 During the 3 or so days that I ran it, I didn't notice it die at any time... Jason Emerick On Mon, Jun 8, 2009 at 5:25 PM, Chad Etzel jazzyc...@gmail.com wrote: Well, glad I'm not the only one :) But still a bummer it's happening...
Another strange thing is that his does *not* seem to happen with the /follow streams. I have a PHP script running (same source, just requesting /follow instead of /spritzer) that has been connected for over 2 days. Of course, it may die at any moment, I'm not sure.. One big difference is that the throughput for that stream is much much less than the /hose streams, and I'm wondering if the sheer volume of bytes being pushed has something to do with it? That would be quite sad. I have PHP scripts acting as Jabber/XMPP clients that use the similar fsockopen/fread/fgets/fwrite mechanisms that have been up for months at a time, so I know those socket connections *can* stay up a long long time in theory. -Chad On Mon, Jun 8, 2009 at 5:00 PM, jstrellner j...@twitturly.com wrote: Hi Chad, We too have noticed the same behavior in PHP. Initially I wrote something very similar to your example, and noticed that I'd get a random time's worth of data before it disconnected. Then I rewrote it, which you can see at the below URL (modified to remove irrelevant code to this discussion), but I am still seeing similar results. Now it goes for 2-3 days, and then stops getting data. I can see that the script is still running via ps on the command line, and I can still see data going through the server, just PHP doesn't process it anymore. http://pastie.org/505012 I'd love to find out what is causing it. I do have a couple of theories specific to my code that I am trying - the only thing that sucks is that it is random, so the tests take a few minutes or days, depending on when it feels like dying. Let me know if this code works or helps you in any way. Feel free to bounce any ideas off of me, maybe we can come up with a stable solution. -Joel On Jun 8, 1:36 pm, Chad Etzel jazzyc...@gmail.com wrote: I thought those things, too... 
but the following things made me think otherwise: a) The stream stops after a different number of updates/bytes each time, and will happily go on forever if I put an error-catching loop in the script. b) The same thing is happening in the python script. c) Curl/telnet works fine, so it's not a system resource depletion issue ...still confused, -Chad On Mon, Jun 8, 2009 at 4:31 PM, John Kalucki jkalu...@gmail.com wrote: A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the number of bytes waiting in the local TCP buffer. Baseless speculation: There's a limitation in the chunked transfer coding in the PHP client wherein it cannot support endless streams. Some resource is exhausted or administratively limited (php.ini), and the stream stops. -John Kalucki Services, Twitter Inc. On Jun 8, 1:16 pm, Chad Etzel jazzyc...@gmail.com wrote: Hi All,
[twitter-dev] Re: Larger Users Not Returning Follower Data
Doug, et al., here's the problem(s) I'm running into. By forcing me to use paging for followers/ids and friends/ids, for someone like BritneySpears I now have to make over 350 requests to get through all her followers. Now I'm having huge rate limit issues because of that, not to mention how long it takes to get through the entire list. Would it be possible to set a number to specify how many user ids are returned per page so I don't have to make so many requests? In addition, I'm finding the pages aren't returning consistent data. Some are returning less than 5,000 results, and some aren't even returning data that they should. So even with paging I'm still unable to get through all of @britneyspears' followers. Any suggestions? @Jesse On Thu, Jun 4, 2009 at 12:26 AM, Doug Williams d...@twitter.com wrote: I've heard that list sizes greater than 150K-200K start to return timeouts at higher rates. Although I'd enjoy hearing first-hand experiences and recommendations. Thanks, Doug On Wed, Jun 3, 2009 at 9:19 PM, Jesse Stay jesses...@gmail.com wrote: In my case specifically it's the Social Graph methods. I didn't realize you had paging available now. Is there some logic as to when I should expect to page and when I can just rely on the full result? Jesse On Wed, Jun 3, 2009 at 9:56 PM, Doug Williams d...@twitter.com wrote: What methods in particular are you referring to? The social graph methods now support paging, so retrieving all of that data is now possible, where it used to throw 502s. It does however require a bit of application logic to assume when paging is necessary (e.g. large follower counts). Additionally, we are making changes to the databases which cause latency that results in periodic 502s. We are not able to give definitive ETAs on these fixes due to priorities that change as unforeseeable critical needs arise. More specificity would be beneficial. Do you have a reproducible bug, problem, or suggestion that you would like to discuss?
Thanks, Doug On Wed, Jun 3, 2009 at 7:22 PM, Jesse Stay jesses...@gmail.com wrote: I was discussing this with Iain, and have also talked about it with Damon, so I know I'm not alone in this. I am having huge issues retrieving follower and friend data for the larger users (1 million+ followers), most of the time returning 502 Bad Gateway errors. I know there are a few of these users getting really frustrated about our apps not being able to retrieve data for them. Is there a plan to fix this? Is the API team aware of this? Any ETA by chance? Thanks, @Jesse
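The thread above implies the social graph methods returned up to 5,000 IDs per page, which is why a multi-million-follower account costs hundreds of sequential requests. A minimal sketch of the accumulation loop, with the actual HTTP call abstracted behind a caller-supplied fetch_page function (a hypothetical stand-in; the real request is a GET on followers/ids with a page parameter, and each page counts against the rate limit):

```python
def fetch_all_ids(fetch_page):
    """Collect IDs across pages until an empty page signals the end.

    fetch_page(page) should return a list of up to 5,000 IDs for the
    given 1-based page number, or an empty list past the last page.
    """
    ids, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        ids.extend(batch)
        page += 1
    return ids
```

Note that nothing in this loop can detect the inconsistent-page problem Jesse describes; if a page silently returns fewer IDs than it should, the only recourse is re-fetching and comparing counts against followers_count.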
[twitter-dev] Re: Streaming API + PHP and Python
Try calling encode("utf-8") on the strings before you do anything else with them but when you do, you may find that you have to add Python components. In other words, if the string is foo, do this: foo = foo.encode("utf-8") Nick On Mon, Jun 8, 2009 at 4:52 PM, Chad Etzel jazzyc...@gmail.com wrote: Hi Jason, Thanks! I've tried it out, and it seems that it doesn't like unicode characters? Here's the traceback I get: Exception in thread Thread-2: Traceback (most recent call last): File /usr/lib/python2.5/threading.py, line 486, in __bootstrap_inner self.run() File spritzer.py, line 31, in run print '%s -- %s' % (t['user']['screen_name'], t['text']) UnicodeEncodeError: 'ascii' codec can't encode characters in position 11-14: ordinal not in range(128) Exception in thread Thread-1: Traceback (most recent call last): File /usr/lib/python2.5/threading.py, line 486, in __bootstrap_inner self.run() File spritzer.py, line 31, in run print '%s -- %s' % (t['user']['screen_name'], t['text']) UnicodeEncodeError: 'ascii' codec can't encode character u'\u2625' in position 9: ordinal not in range(128) I'm not fluent in python, so I'm not sure of the unicode capabilities... but otherwise it looks like it's connecting and receiving data. -Chad On Mon, Jun 8, 2009 at 8:36 PM, Jason Emerick jemer...@gmail.com wrote: Here is some rough python code that I quickly wrote last weekend to handle the json spritzer feed: http://gist.github.com/126173 During the 3 or so days that I ran it, I didn't notice it die at any time... Jason Emerick
On Mon, Jun 8, 2009 at 5:25 PM, Chad Etzel jazzyc...@gmail.com wrote: Well, glad I'm not the only one :) But still a bummer it's happening... Another strange thing is that this does *not* seem to happen with the /follow streams. I have a PHP script running (same source, just requesting /follow instead of /spritzer) that has been connected for over 2 days. Of course, it may die at any moment, I'm not sure.. One big difference is that the throughput for that stream is much much less than the /hose streams, and I'm wondering if the sheer volume of bytes being pushed has something to do with it? That would be quite sad. I have PHP scripts acting as Jabber/XMPP clients that use the similar fsockopen/fread/fgets/fwrite mechanisms that have been up for months at a time, so I know those socket connections *can* stay up a long long time in theory. -Chad On Mon, Jun 8, 2009 at 5:00 PM, jstrellner j...@twitturly.com wrote: Hi Chad, We too have noticed the same behavior in PHP. Initially I wrote something very similar to your example, and noticed that I'd get a random time's worth of data before it disconnected. Then I rewrote it, which you can see at the below URL (modified to remove irrelevant code to this discussion), but I am still seeing similar results. Now it goes for 2-3 days, and then stops getting data. I can see that the script is still running via ps on the command line, and I can still see data going through the server, just PHP doesn't process it anymore. http://pastie.org/505012 I'd love to find out what is causing it. I do have a couple of theories specific to my code that I am trying - the only thing that sucks is that it is random, so the tests take a few minutes or days, depending on when it feels like dying. Let me know if this code works or helps you in any way. Feel free to bounce any ideas off of me, maybe we can come up with a stable solution.
-Joel On Jun 8, 1:36 pm, Chad Etzel jazzyc...@gmail.com wrote: I thought those things, too... but the following things made me think otherwise: a) The stream stops after a different number of updates/bytes each time, and will happily go on forever if I put an error-catching loop in the script. b) The same thing is happening in the python script. c) Curl/telnet works fine, so it's not a system resource depletion issue ...still confused, -Chad On Mon, Jun 8, 2009 at 4:31 PM, John Kalucki jkalu...@gmail.com wrote: A theory: The PHP client has stopped reading data, for whatever reason. The TCP buffers fill on the client host, the TCP window closes, and wireshark shows no data flowing. netstat(1) will show the
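The traceback quoted above comes from spritzer.py's print of '%s -- %s', which Python 2 encodes with the default ASCII codec when writing to a non-UTF-8 terminal. A sketch of Nick's suggestion applied to that line (hypothetical helper name; encoding the formatted line to UTF-8 bytes before printing avoids the ASCII encoder entirely):

```python
def format_status(screen_name, text):
    """Build the 'screen_name -- text' line and return it as UTF-8
    bytes, so printing the result can never hit an ASCII-only encoder."""
    line = u"%s -- %s" % (screen_name, text)
    return line.encode("utf-8")
```

In Python 2 you would print the returned byte string directly; in Python 3 the usual fix is instead to leave the text as str and configure the output stream's encoding.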
[twitter-dev] oAuth access_token fails on second try
Hello, I am very new to using OAuth and the Twitter API and need help with this error: 'The remote server returned an error: (401) Unauthorized' when I try to use the access_token endpoint with the token that I have just received. The code works fine in connecting to Twitter, allowing the user to log in, granting access to my app, and creating the token. Since I am still in the development phase I don't have a callback URL, so once I get the token I copy and paste it manually into the code to test. I then run my application and am able to successfully access the token the very first time, and update my status as well, but all subsequent calls to access the token result in the error I mentioned previously. Any help would be greatly appreciated. I am posting the web request that is being made: http://twitter.com/oauth/access_token?oauth_consumer_key=key&oauth_nonce=2013769&oauth_signature_method=HMAC-SHA1&oauth_timestamp=1244518302&oauth_token=pNRjCKkrMUmSGFA5zKgE4BkuhxVlWXvnPN3Gpo&oauth_version=1.0&oauth_signature=HPDE7R1%2f68OdGtTuJsMkVS2wVEg%3d Thanks,
[twitter-dev] How to get all messages between a start and end date
I am trying to get messages between a start and end date. I see that there are since and until parameters that handle part of it, but there is a small issue: the Search API limits the number of messages returned to 1500. So if a search query is very popular, I will get 1500 messages that may span just 2-3 hours (if lucky :-)). Then there is since_id; again, if I use that, it returns results starting from the latest. And as far as I can see, since and until do not take the time portion into consideration. Is there some trick or some API that I can use to get this done? Thanks
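One common workaround is to walk backwards through the window by repeatedly asking for results older than the lowest status ID seen so far (the Search API's max_id parameter). Here is a hedged Python sketch; `fetch_page` stands in for the real Search API call, and the exact paging behavior is an assumption based on the limits described above:

```python
def next_max_id(statuses):
    """Lowest id seen minus one, so the next request returns only older tweets."""
    return min(s["id"] for s in statuses) - 1

def collect_window(fetch_page, cutoff_id):
    """Page backwards until a batch comes back empty or we pass cutoff_id.

    fetch_page(max_id) -> list of status dicts, newest first;
    max_id=None means "start from the most recent results".
    cutoff_id marks the start of the desired time window.
    """
    collected, max_id = [], None
    while True:
        batch = fetch_page(max_id)
        if not batch:
            break
        fresh = [s for s in batch if s["id"] > cutoff_id]
        collected.extend(fresh)
        if len(fresh) < len(batch):  # crossed the start of the window
            break
        max_id = next_max_id(batch)
    return collected
```

Each sweep is still subject to the 1500-result cap, so for very popular queries you would run this repeatedly, saving the lowest ID reached and resuming from it, rather than relying on since/until alone.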
[twitter-dev] Re: How does the trends method in search API work?
See if these examples help. http://www.byteblocks.com/post/2009/05/21/Twitter-Daily-Trends-Using-Tweetersharp-API.aspx http://www.byteblocks.com/post/2009/05/22/Get-Weekly-Twitter-Trends.aspx On Jun 8, 1:18 am, zvn zvn750...@gmail.com wrote: Hi, I'm a master's student in Taiwan. I want to do some work to improve the trends API in http://apiwiki.twitter.com/Twitter-Search-API-Method%3A-trends. But I don't know how this API works, so it's hard for me to make a comparison between my method and the API. Would anybody please tell me how to solve the problem?
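For reference, the trends method is just an unauthenticated GET that returns JSON. A minimal Python sketch follows; the response shape used in the parser (a "trends" array of name/url objects plus an "as_of" timestamp) is an assumption based on the documented format, so verify it against the API wiki page linked above:

```python
import json
import urllib.request

TRENDS_URL = "http://search.twitter.com/trends.json"

def parse_trends(raw_json):
    """Extract the trend names and the as_of timestamp from a trends response."""
    payload = json.loads(raw_json)
    names = [t["name"] for t in payload.get("trends", [])]
    return names, payload.get("as_of")

def fetch_trends(url=TRENDS_URL):
    """Fetch the current top trends (network call; no authentication needed)."""
    with urllib.request.urlopen(url) as resp:
        return parse_trends(resp.read().decode("utf-8"))
```

To compare your own method against the API's output, you would snapshot fetch_trends() at intervals and diff the name lists over time.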
[twitter-dev] Re: Saved Searches API and Geo info
See if this helps.. http://www.backyardtweets.com/ On Jun 5, 1:32 pm, Sean P. seantpa...@gmail.com wrote: Thank you for providing the saved searches API, but I do have a bit of a question. I haven't yet played with it, but I am wondering about the use of location-based searching in the saved search APIs. If this can still be performed, is it like the query performed on search.twitter.com's advanced search page (with the near:Los Angeles, CA 15mi query parameter), or can I continue to use the geo param as in the search API? Thanks!
[twitter-dev] Re: Larger Users Not Returning Follower Data
I'll share a few other wrinkles I've observed in the paged friends / followers API responses:

- There are often more (sometimes many more) pages than you'd expect based on the follower/friend counts listed in the profile.
- Pages in the middle of a long series of output pages usually, but not always, return 5000 elements.
- The only reliable way to determine the end of available output pages is to keep asking until an empty JSON response [] is returned.
- Friends / followers pages occasionally contain duplicate entries from one page to another.

On the positive side, the paged output methods are *far* more reliable for high friend/follower counts than the old ones when the list is long. The old methods were more convenient but only worked a small fraction of the time when the list was very long. I have test data for britneyspears from a few days ago; at that time the profile said 1,644,227, which would imply around 328 pages. There were actually 356 pages, containing 1,772,771 entries. I just looked at the profile page now and it says 1,745,417, which would still suggest 349 pages of data, less than what is actually there. And several of the pages return fewer than 5000 entries, so you can't assume that an unfilled page represents the end of the data.

On Jun 8, 6:19 pm, Jesse Stay jesses...@gmail.com wrote: Doug, et al., here are the problems I'm running into. By forcing me to use paging for followers/ids and friends/ids, for someone like BritneySpears I now have to make over 350 requests to get through all her followers. Now I'm having huge rate limit issues because of that, not to mention how long it takes to get through the entire list. Would it be possible to set a number to specify how many user ids are returned per page so I don't have to make so many requests? In addition, I'm finding the pages aren't returning consistent data. Some are returning less than 5,000 results, and some aren't even returning data that should. 
So even with paging I'm still unable to get through all of @britneyspears' followers. Any suggestions? @Jesse

On Thu, Jun 4, 2009 at 12:26 AM, Doug Williams d...@twitter.com wrote: I've heard that list sizes greater than 150K-200K start to return timeouts at higher rates, although I'd enjoy hearing first-hand experiences and recommendations. Thanks, Doug

On Wed, Jun 3, 2009 at 9:19 PM, Jesse Stay jesses...@gmail.com wrote: In my case specifically it's the Social Graph methods. I didn't realize you had paging available now. Is there some logic as to when I should expect to page and when I can just rely on the full result? Jesse

On Wed, Jun 3, 2009 at 9:56 PM, Doug Williams d...@twitter.com wrote: What methods in particular are you referring to? The social graph methods now support paging, so retrieving all of that data is now possible where it used to throw 502s. It does, however, require a bit of application logic to decide when paging is necessary (e.g. large follower counts). Additionally, we are making changes to the databases which cause latency that results in periodic 502s. We are not able to give definitive ETAs on these fixes due to priorities that change as unforeseeable critical needs arise. More specificity would be beneficial. Do you have a reproducible bug, problem, or suggestion that you would like to discuss? Thanks, Doug

On Wed, Jun 3, 2009 at 7:22 PM, Jesse Stay jesses...@gmail.com wrote: I was discussing this with Iain, and have also talked about it with Damon, so I know I'm not alone in this. I am having huge issues retrieving follower and friend data for the larger users (1 million+ followers), with most requests returning 502 Bad Gateway errors. I know there are a few of these users getting really frustrated about our apps not being able to retrieve data for them. Is there a plan to fix this? Is the API team aware of this? Any ETA by chance? Thanks, @Jesse
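Given the quirks reported at the top of this thread (page counts that don't match profile totals, short pages mid-series, occasional duplicates), the only safe loop is the one described there: keep requesting pages until an empty [] comes back, deduplicating as you go. A hedged Python sketch, with `fetch_ids_page` standing in for the real followers/ids or friends/ids request:

```python
def fetch_all_ids(fetch_ids_page):
    """Collect every follower/friend id by paging until an empty list.

    fetch_ids_page(page) -> list of user ids for that 1-based page number.
    Don't use the profile's follower count to predict the page count, and
    don't treat a short page as the end -- per the observations in this
    thread, only an actually empty response terminates the loop.
    """
    seen, ordered = set(), []
    page = 1
    while True:
        ids = fetch_ids_page(page)
        if not ids:  # empty JSON response [] marks the true end
            break
        for uid in ids:
            if uid not in seen:  # pages occasionally repeat entries
                seen.add(uid)
                ordered.append(uid)
        page += 1
    return ordered
```

For a 1.7M-follower account this is still 350+ requests, so it doesn't solve Jesse's rate-limit problem; it only guarantees you eventually get a complete, duplicate-free list.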
[twitter-dev] Re: Larger Users Not Returning Follower Data
Jesse, Please submit an issue if you feel that this would contribute to the community. There are issues for paging bugs with the social graph methods, so star them appropriately.

I have some questions for the community at large using the social graph methods, so please feel free to chime in: What is your caching scheme? How dependent is your data on being real time, and why? What type of value are you generating from this data? What is your use case? There is interest in popular users' social graphs, but from what I've seen they are rather an edge case in terms of the value they contribute back to the community. A valuable use case outside of a very specific need would help in prioritizing requests like this. I'm trying to understand where you (the community, not just Jesse) are generating value from these large follower lists, so please feel free to chime in if you are doing projects on top of popular users. Thanks, Doug

On Mon, Jun 8, 2009 at 6:19 PM, Jesse Stay jesses...@gmail.com wrote: Doug, et al., here are the problems I'm running into. By forcing me to use paging for followers/ids and friends/ids, for someone like BritneySpears I now have to make over 350 requests to get through all her followers. Now I'm having huge rate limit issues because of that, not to mention how long it takes to get through the entire list. Would it be possible to set a number to specify how many user ids are returned per page so I don't have to make so many requests? In addition, I'm finding the pages aren't returning consistent data. Some are returning less than 5,000 results, and some aren't even returning data that should. So even with paging I'm still unable to get through all of @britneyspears' followers. Any suggestions? @Jesse

On Thu, Jun 4, 2009 at 12:26 AM, Doug Williams d...@twitter.com wrote: I've heard that list sizes greater than 150K-200K start to return timeouts at higher rates, although I'd enjoy hearing first-hand experiences and recommendations. Thanks, Doug

On Wed, Jun 3, 2009 at 9:19 PM, Jesse Stay jesses...@gmail.com wrote: In my case specifically it's the Social Graph methods. I didn't realize you had paging available now. Is there some logic as to when I should expect to page and when I can just rely on the full result? Jesse

On Wed, Jun 3, 2009 at 9:56 PM, Doug Williams d...@twitter.com wrote: What methods in particular are you referring to? The social graph methods now support paging, so retrieving all of that data is now possible where it used to throw 502s. It does, however, require a bit of application logic to decide when paging is necessary (e.g. large follower counts). Additionally, we are making changes to the databases which cause latency that results in periodic 502s. We are not able to give definitive ETAs on these fixes due to priorities that change as unforeseeable critical needs arise. More specificity would be beneficial. Do you have a reproducible bug, problem, or suggestion that you would like to discuss? Thanks, Doug

On Wed, Jun 3, 2009 at 7:22 PM, Jesse Stay jesses...@gmail.com wrote: I was discussing this with Iain, and have also talked about it with Damon, so I know I'm not alone in this. I am having huge issues retrieving follower and friend data for the larger users (1 million+ followers), with most requests returning 502 Bad Gateway errors. I know there are a few of these users getting really frustrated about our apps not being able to retrieve data for them. Is there a plan to fix this? Is the API team aware of this? Any ETA by chance? Thanks, @Jesse
[twitter-dev] Re: oAuth access_token fails on second try
For each user, make only one OAuth call to oauth/access_token. The returned token should be saved by you and used directly on all future normal API calls.

On Mon, Jun 8, 2009 at 22:54, Nizar niza...@gmail.com wrote: Hello, I am very new to using OAuth and the Twitter API and need help with this error: 'The remote server returned an error: (401) Unauthorized' when I try to use the access_token endpoint with the token that I have just received. The code works fine in connecting to Twitter, allowing the user to log in, granting access to my app, and creating the token. Since I am still in the development phase I don't have a callback URL, so once I get the token I copy and paste it manually into the code to test. I then run my application and am able to successfully access the token the very first time, and update my status as well, but all subsequent calls to access the token result in the error I mentioned previously. Any help would be greatly appreciated. I am posting the web request that is being made: http://twitter.com/oauth/access_token?oauth_consumer_key=key&oauth_nonce=2013769&oauth_signature_method=HMAC-SHA1&oauth_timestamp=1244518302&oauth_token=pNRjCKkrMUmSGFA5zKgE4BkuhxVlWXvnPN3Gpo&oauth_version=1.0&oauth_signature=HPDE7R1%2f68OdGtTuJsMkVS2wVEg%3d Thanks,

-- Abraham Williams | Community | http://web608.org Hacker | http://abrah.am | http://twitter.com/abraham Project | http://fireeagle.labs.poseurtech.com This email is: [ ] blogable [x] ask first [ ] private. Sent from Madison, WI, United States
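Abraham's point is the usual fix for this 401: a request token is single-use, so oauth/access_token can only be called once per authorization; every later 401 comes from replaying that exchange. A hedged Python sketch of the store-once pattern (`exchange_request_token` stands in for the real signed HTTP exchange, which is not shown; in production the store would be a database, not a dict):

```python
_token_store = {}  # user_id -> (access_token, access_secret)

def get_access_token(user_id, exchange_request_token):
    """Return the saved access token for a user, performing the
    oauth/access_token exchange only on the very first call.
    Calling the exchange again with an already-used request token
    is exactly what produces the 401 Unauthorized in this thread."""
    if user_id not in _token_store:
        _token_store[user_id] = exchange_request_token()
    return _token_store[user_id]
```

All subsequent status updates and other API calls then sign with the stored access token directly, never touching oauth/access_token again.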