[twitter-dev] Re: Developer Relations and the Twitter Platform

2011-04-26 Thread Joel Strellner
Welcome aboard Jason.  I'm looking forward to hearing about the event
and to working with you in the future.

-Joel

On Apr 26, 10:20 am, Jason Costa jasonco...@twitter.com wrote:
 Hi all,

 My name is Jason Costa, and I joined Twitter last week to take on
 the role of Developer Relations Manager. I'll be 100% focused on
 ensuring the best possible developer experience for those looking
 to build on the Twitter Platform.

 Fostering the developer ecosystem is extremely important to
 Twitter, and we're committed to seeing that ecosystem grow and
 thrive. My role will be less involved with day to day support, and
 more on ensuring a healthy developer community and improved
 communication. I'll be working closely with Ryan, Taylor, Matt,
 and Arnaud to make that a reality.

 If you have feedback or questions, please feel free to send me an
 email anytime: jasonco...@twitter.com, or you can find me on
 Twitter as @jasoncosta. I'd love to hear from you.

 We'll be holding a developer-focused event at the Twitter
 headquarters in a few weeks, so keep an eye out for a registration
 announcement here later this week.

 Thanks,

 --Jason

-- 
Twitter developer documentation and resources: http://dev.twitter.com/doc
API updates via Twitter: http://twitter.com/twitterapi
Issues/Enhancements Tracker: http://code.google.com/p/twitter-api/issues/list
Change your membership to this group: 
http://groups.google.com/group/twitter-development-talk


[twitter-dev] Re: Streaming Api - Keywords matched

2009-11-05 Thread Joel Strellner
A little late to this conversation, but I disagree with the need for this feature.
It adds extra complexity to twitter that really should be at the application
level, and since the streaming API only returns each tweet once, even if it
matched two or more keywords that you are watching, it'd add extra load on
their end to determine all of the matches.

It'd be nice for them to do this, but honestly, I feel this is an
application-specific need, and thus the application should implement it.

-1

-Joel


On Tue, Nov 3, 2009 at 3:01 PM, John Kalucki jkalu...@gmail.com wrote:


 The Streaming API and the Search indexer both tee off the same point
 in the new status event pipeline. New statuses are born in the web
 containers and queued for a cluster of processes that begin the
 offline processing pipeline. This first process does many things,
 including routing statuses to various subsystems (timelines, SMS,
 various backing stores, etc.). This process also determines if
 the updating user is public or protected and if they are filtered
 from search. If the user is both public and unfiltered, the status is
 enqueued to both Search and Streaming.

 See also:
 http://apiwiki.twitter.com/Streaming-API-Documentation#ResultQuality

 The Streaming API then syndicates all these statuses. The Search
 system may or may not sort, filter, and otherwise rank statuses for
 relevance based on various heuristics, including, but not limited to:
 phase of the moon, state of tides, the DJIA, etc.

 Roughly:

 Complete corpus search: Streaming
 Low-latency results: Streaming
 Accurate keyword counts: Streaming (tally both statuses and limit
 messages)
 Complex queries: Search
 Historical queries: Search

 -John Kalucki
 http://twitter.com/jkalucki
 Services, Twitter Inc.
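
 For what it's worth, the tallying John describes can be done with a couple of
 counters (my own sketch, not Twitter's code; it assumes $fp is an open streaming
 connection delivering one JSON object per line, and it treats the limit notice's
 track value as a running total, per the wiki excerpt quoted later in this archive):

 <?php
 // Tally delivered statuses and limited (withheld) statuses from a track stream.
 $status_count  = 0;
 $limited_count = 0;

 while (!feof($fp)) {                       // $fp: open streaming connection
     $line = trim(fgets($fp, 4096));
     if ($line == '') continue;

     $obj = json_decode($line);
     if (isset($obj->limit->track)) {
         // limitation notice: statuses that matched but were not delivered
         $limited_count = max($limited_count, (int) $obj->limit->track);
     } elseif (isset($obj->text)) {
         $status_count++;                   // a normal status
     }
 }

 echo "delivered: $status_count, limited: $limited_count\n";
 ?>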



 On Nov 3, 11:02 am, Jeffrey Greenberg jeffreygreenb...@gmail.com
 wrote:
  It would help if John Kalucki (hello) would clarify the difference
  between what is visible via streaming as opposed to what is visible
  via search.
 
  I've been operating under the assumption that streaming is warranted
  when an app needs a different or more powerful search than the current
  one (e.g. nested boolean expressions), or is interested in seeing
  tweets before they are filtered out by twitter's spam detection
   (dealing with the tweet removal protocol, etc.).  As a developer it
  would help us if you could paint out what the twitter data pipeline
  looks like, and where the various apis plug in, so that we know what
  we get when we plug in there.  I assume, for instance, that search is
  farther downstream than the various firehose/stream apis, but I've
   little idea (or documentation) of what steps the data goes through as it
   moves down the pipe.
 
   Would Twitter be open to shedding some light?
  jeffrey greenberg
  tweettronics.com
 
  On Nov 3, 9:59 am, Fabien Penso fabienpe...@gmail.com wrote:
 
    I agree; however, it would help a lot because instead of doing:
 
    for keyword in all_keywords
      if tweet.match(keyword)
        // matched, notify users
      end
    end

    we could do

    for keyword in keywords_matched
      // same as above
    end
 
    For matching 5,000 keywords, it would bring the first loop from 5,000
    iterations down to probably 1 or 2.
    You know what you matched, so it's quite easy for you to just include
    the raw data of matched keywords; I don't need anything fancy. Just space-
    separated keywords would help _so much_.
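
    To make the application-side matching concrete, here is a rough PHP sketch of
    what that loop amounts to - purely illustrative, with $watchers and $line
    standing in for your own keyword map and a raw line read from the stream:

    <?php
    // $watchers maps each tracked keyword to the internal users watching it (placeholder data).
    $watchers = array(
        'keyword1' => array('alice', 'bob'),
        'keyword2' => array('carol'),
    );

    function matched_keywords($text, array $watchers) {
        $matches = array();
        foreach ($watchers as $keyword => $users) {
            if (stripos($text, $keyword) !== false) {
                $matches[$keyword] = $users;
            }
        }
        return $matches;
    }

    $tweet_obj = json_decode($line);   // $line: one raw line read from the stream
    if ($tweet_obj && isset($tweet_obj->text)) {
        foreach (matched_keywords($tweet_obj->text, $watchers) as $keyword => $users) {
            foreach ($users as $user) {
                // notify $user that $keyword matched this tweet
            }
        }
    }
    ?>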
 
   On Tue, Nov 3, 2009 at 3:15 PM, John Kalucki jkalu...@gmail.com
 wrote:
 
The assumption is that client services will, in any case, have to
parse and route statuses to potentially multiple end-users. Providing
this sort of hint wouldn't eliminate the need to parse the status and
would likely result in duplicate effort. We're aware that we are, in
 some use cases, externalizing development effort, but the use cases
for the Streaming API are so many, that it's hard to define exactly
how much this feature would help and therefore how much we're
externalizing.
 
-John Kalucki
   http://twitter.com/jkalucki
Services, Twitter Inc.
 
On Nov 3, 1:53 am, Fabien Penso fabienpe...@gmail.com wrote:
Hi.
 
Would it be possible to include the matched keywords in another
 field
within the result from the streaming/keyword API?
 
 It would save me from matching those myself when matching for multiple
 internal users, to spread the tweets to the right users, which
 can be time-consuming and tough to do with lots of users/keywords.
 
Thanks.



[twitter-dev] Re: Streaming API Help

2009-09-19 Thread Joel Strellner
Greg,

Please see the following code for a working example of how to do the
Streaming API in PHP.  Not sure if it is the best way to do it, but it works
for us, and we use the Streaming API for a lot of data.

http://groups.google.com/group/twitter-development-talk/msg/761eb0b23872d430?hl=en

If you have any questions, please let me know.

-Joel

On Sat, Sep 19, 2009 at 11:48 AM, Chad Etzel jazzyc...@gmail.com wrote:


 Hi Greg,

 A couple of questions:

 Are you running this in a browser? It will be much easier to test if
 you can run the script from a console instead. There are all sorts of
 output buffering issues to deal with in a browser.

 Even if you are receiving data from the stream, the way your code
 is set up you won't ever see it:

   $new_data = json_decode($data);

   foreach($status as $new_data)
   {
   var_dump($status);
    echo "<hr>";
   }

 Your foreach syntax is reversed, perhaps? The $new_data variable will
 be blank, but even worse since $status isn't set, the foreach loop
 won't run.

 Instead you should try something like:
 $new_data = json_decode($data);
 echo $data;
 var_dump($new_data);
 echo "<hr>";

 That way at least you can determine if you are getting any data from
 the stream at all.

 -Chad
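
 For the archive, here is roughly what the corrected loop looks like when run
 from a console (a sketch only - the credentials and track terms are placeholders,
 and each line of filter.json is assumed to be one JSON status):

 <?php
 // Placeholder credentials and track terms; each line of the stream is one JSON status.
 $fp = fopen("http://username:password@stream.twitter.com/1/statuses/filter.json?track=ttl,wine", "r");
 if (!$fp) die("could not connect\n");

 while ($data = fgets($fp)) {
     $data = trim($data);
     if ($data == '') continue;            // skip keep-alive newlines

     $status = json_decode($data);
     if ($status && isset($status->text)) {
         echo $status->text . "\n";        // or var_dump($status) to inspect everything
     }
 }
 fclose($fp);
 ?>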

 On Sat, Sep 19, 2009 at 10:31 AM, Greg gregory.av...@gmail.com wrote:
 
   I'm trying to implement the Streaming API on my Twitter application to
   enhance its stability instead of using the Search API.
 
   Basically - I'm trying to use the track parameter to track a keyword.
   However, when I run this code - it just times out - nothing occurs.
   Perhaps I'm not using the track parameter correctly?
 
  I know I'm just dumping the code right now - but it was merely for
   testing.
 
  My ad-hoc code is below:
 
  // twitter_stream.php
  <?php
  $fp = fopen("http://username:passw...@stream.twitter.com/1/statuses/filter.json?track=ttl,wine", "r");
  while($data = fgets($fp))
  {
      $new_data = json_decode($data);

      foreach($status as $new_data)
      {
          var_dump($status);
          echo "<hr>";
      }
  }
  ?>
 
  Thanks for the help!
 
  Greg



[twitter-dev] Re: Search by KW then RT

2009-09-09 Thread Joel Strellner
Something like this is going to require a server-side language, like PHP,
Perl, Ruby, etc.  JavaScript won't cut it.

I'm not sure how you'd do this without it being spammy though. IMO it should
be manual, or at least semi-manual using a queue of some sort that you need
to approve prior to it going out.

-Joel
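
As an illustration of the semi-manual approach (my sketch, not a finished tool):
poll the Search API for the keyword and append candidate retweets to a queue that
a human approves before anything is posted. Field names are as the Search API
returned them at the time.

<?php
// Queue candidate retweets for manual approval instead of auto-posting them.
$keyword  = 'Wimbledon';
$json     = file_get_contents('http://search.twitter.com/search.json?q=' . urlencode($keyword) . '&rpp=20');
$response = json_decode($json);

$queue = fopen('retweet_queue.txt', 'a');
if ($response && isset($response->results)) {
    foreach ($response->results as $tweet) {
        // candidate text in the "via" style; a person reviews this file before posting
        fwrite($queue, $tweet->text . ' (via @' . $tweet->from_user . ')' . "\n");
    }
}
fclose($queue);
?>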


On Wed, Sep 9, 2009 at 5:17 PM, 00__00 mrjchris...@gmail.com wrote:


 Can you help?

 I want to search Streaming OR Search (not fussed which) for Keyword
 (say Wimbledon) then ReTweet those individual results.

 1. Search
 2. ReTweet each search item including (via @xyz)

  Now I have seen this done in a spammy fashion, which is poor, so to
  reassure the group: I don't wish to do this. I want to create an uber
  feed for my football team, hometown, and sports leagues in one... all
  automatically.

  The preferred language is JavaScript, but PHP could also work?

  Does anyone know of any libraries for this, or have any samples?

  Please bear in mind I am a lightweight programmer, and a bit
  embarrassed to even ask on here!!

 Cheers



[twitter-dev] Re: tracking URLs using Streaming API

2009-09-08 Thread Joel Strellner
In order to accomplish this, you need to follow each and every link that
gets passed on Twitter to see if it matches the domain you are looking for.
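
A bare-bones way to do that link-following in PHP (illustration only; a real
crawler needs caching, rate limiting, and error handling) is to let cURL chase
the redirect chain and compare the final host:

<?php
// Follow a (possibly shortened) URL and report the final host.
function final_host($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // chase 301/302 chains
    curl_setopt($ch, CURLOPT_NOBODY, true);           // HEAD-style request, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $final = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
    curl_close($ch);
    return strtolower(parse_url($final, PHP_URL_HOST));
}

if (final_host('http://bit.ly/something') == 'mysite.com') {   // placeholder URL/domain
    // this tweet's link points at the domain we care about
}
?>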

We do this on inView for our clients, and it may be what you are looking
for: http://myinview.com (Note: This is our service, there may be others
doing this, but if so, I am unaware of them)

-Joel


On Tue, Sep 8, 2009 at 4:37 AM, Robert Chatley rob...@metabroadcast.comwrote:


 Hi,

 I want to use the Streaming API to track all the statuses that include
 URLs from a particular domain. e.g.

  * "I really like http://mysite.com",
  * "http://mysite.com/goodstuff is great"
 etc.

 Given the current track api this doesn't seem to be possible? If I
  specify "http://mysite.com" as a keyword, it doesn't match the above
 cases. Is there a way to do that? A specific keyword syntax perhaps?
 We happen to be using the Twitter4J java client, but I don't think
 this is a client issue.

 Using the search api, specifying mysite.com (but not http://
  mysite.com) finds the statuses we want to match, but we would rather
  use the streaming api than poll.

 thanks,
 Robert



[twitter-dev] Re: Is twitter a fad or worth development efforts?

2009-09-03 Thread Joel Strellner
Yeah, it's a fad, just like vowels and capitalization are.

On Thu, Sep 3, 2009 at 1:38 PM, Dale Merritt mogul...@gmail.com wrote:

 yea, like fb and ytube and ggle

 On Thu, Sep 3, 2009 at 9:41 AM, ka...@sbcglobal.net kakm...@gmail.comwrote:


 Is twitter a fad or worth development efforts?




 --
 Dale Merritt
 Fol.la MeDia, LLC



[twitter-dev] Re: Stream API Count Parameter

2009-09-01 Thread Joel Strellner
That would be very nice, but at this time it looks like count is the only
way to go.

Twitter: +1 for a since_id

-Joel
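
If it helps, asking for a backfill is just one more field in the POST body you
already send for /track; the exact limits on count are on the Streaming API wiki,
so treat this as a sketch only:

<?php
// Sketch: ask for a backfill of missed statuses when reconnecting to /track.
// Check the Streaming API wiki for the limits Twitter places on count.
$keywords = array('keyword1', 'keyword2');
$backfill = 10000;   // rough estimate of statuses missed while disconnected

$poststr = 'count=' . $backfill
         . '&track=' . implode(',', array_map('urlencode', $keywords));
// ...then POST $poststr exactly as in the /track examples further down this archive.
?>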

On Tue, Sep 1, 2009 at 2:47 PM, Sameer sameer.kha...@gmail.com wrote:


 Hello, my question is in regards to the Stream API and how to deal
 with getting statuses when the connection was lost.  I was wondering
  if there was another way to retrieve older status messages other than
 using the count parameter? It seems using the count parameter may be
 inaccurate since estimating an average number of statuses per second
 can be flawed due to spikes or other abnormal circumstances.  Is it
 possible to pass a status_id of the last status and get all new
 statuses from that point on?  Or perhaps pass a time stamp?

 If you could help it would be greatly appreciated, it is important
 that I maintain a complete stream of status messages.



[twitter-dev] Re: track syntax

2009-08-31 Thread Joel Strellner
Hi Polymatheus,

1) How can the above script be amended to show both follow and track?
 Is this possible?

John's suggestion is the only way - you must use the new paths (see the
sketch below for a request body that sets both).

2) If I opened a stream to follow 10 users and then a further 5 users
 joined my site, would I have to close the first stream then open a new
 stream for the 15 follows? Or just a second stream with the additional
 5 users? The second approach reduce the chance of tweets being lost
 between closing and opening a new stream.

You must restart the stream; unfortunately, there is no way around this at
this point.

3) I can't seem to close a stream that I opened using the sample code

Abraham gives a potential solution to this, and I believe it's along the
right path.  I normally only see this if we are lagging behind for whatever
reason.

Let me know if you have any further questions, since I am the person that
wrote the code you're using.

-Joel
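
As mentioned in the answer to question 1, combining follow and track is just a
matter of putting both parameters in one request body against /1/statuses/filter
(a sketch; the ids and keywords are placeholders):

<?php
// Sketch: one /1/statuses/filter request body carrying both predicates.
$follow_ids = array(12345, 67890);
$keywords   = array('keyword1', 'keyword2');

$poststr = 'follow=' . implode(',', $follow_ids)
         . '&track=' . implode(',', array_map('urlencode', $keywords));

// POST $poststr to /1/statuses/filter.json with the same headers
// (Host, Authorization, Content-length, ...) as the /track examples below.
?>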


On Mon, Aug 31, 2009 at 12:32 PM, Polymatheus world.mo...@gmail.com wrote:


 I've used the code above to start streaming and then dumping the
 output to a text file every hour to process later. There are a few
 things I want to clarify,

 1) How can the above script be amended to show both follow and track?
 Is this possible?
 2) If I opened a stream to follow 10 users and then a further 5 users
 joined my site, would I have to close the first stream then open a new
 stream for the 15 follows? Or just a second stream with the additional
 5 users? The second approach reduce the chance of tweets being lost
 between closing and opening a new stream.
 3) I can't seem to close a stream that I opened using the sample code
 above, I have tried changing the file to simply say:


  $fp = fsockopen("stream.twitter.com", 80, $errno, $errstr, 10);
 fclose($fp);

 But it doesn't appear to work :/

 Thanks in advance

 On 10 Aug, 08:09, Joel Strellner j...@twitturly.com wrote:
  Tom,
 
  Yes, that code works perfectly for me exactly as is.  You might want to
  change the connect timeout from 10 to 30 seconds.
 
  How long are you waiting before calling it quits?  It does take a few
  seconds for /track to start sending your results.
 
  -Joel
 
 
 
  On Sun, Aug 9, 2009 at 8:17 PM, Tom Fitzgerald ccexpe...@gmail.com
 wrote:
 
   I'm sorry Joel I keep getting a PHP timeout on the code you sent. I'll
   troubleshoot more and see if I can give you any more details (increase
   the maximum time, etc). Who knows, maybe its Twitter. Any other
   thoughts? I'll get back to you more with some detailed info. Are you
   able to get that exact code working on your server?
 
   On Aug 3, 2:12 pm, Joel Strellner j...@twitturly.com wrote:
Hi Tom,
 
I am not sure about XML, since I use JSON - it has a much lower
over-the-wire data size, and its easier to parse.
 
Let me know if the code works for you.
 
-Joel
 
On Mon, Aug 3, 2009 at 10:55 AM, Tom Fitzgerald ccexpe...@gmail.com
 
   wrote:
 
 I appreciate the reply Joel. I'll give it a try. I also tried just
 downloading from the stream api with the curl command line. However
 I
 kept getting 'malformed xml' errors. It was weird, each tweet would
 have the ?xml ... tag before it. That ring any bells with you?
 Same
 thing with JSON format but it was a different error, still
 malformed.
 All I'm doing is
 
 curlhttp://stream.twitter.com/spritzer.xml-uuser:pass
 
 ?xml version=1.0 encoding=UTF-8? is the exact line I get
 before
 every status. If I manually clean up the XML (or JSON) it works
 great.
 
 On Aug 2, 7:10 pm, Joel Strellner j...@twitturly.com wrote:
  Other than my username and password, this is an example that I
 know
   is
  working:
 
  ?php
  $count = 1;
  $startparsing = false;
 
  $keyword_needles[] = 'twitter';
  $keyword_needles[] = 'keyword2';
  $keyword_needles[] = 'keyword3';
  $keyword_needles[] = 'keyword4';
 
  // if your keywords have spaces, they must be urlencoded (twitter
   does
  not support phrases, only the first keyword will be used, the
 space
  character and after will be ignored)
  foreach ($keyword_needles AS $i=$needle) {
  $keyword_needles[$i] = urlencode($needle);
 
  }
 
  $poststr = 'track=' . implode(',', $keyword_needles);
  $fp = fsockopen(stream.twitter.com, 80, $errno, $errstr, 10);
  if (!$fp) {
  echo $errstr ($errno)\n;
 
  } else {
 
  $out = POST /track.json HTTP/1.1\r\n;
  $out .= Host: stream.twitter.com\r\n;
  $out .= User-Agent: YourUserAgent\r\n;
  $out .= Referer:http://yourdomain.com\r\n;;
  $out .= Content-Type:
 application/x-www-form-urlencoded\r\n;
  $out .= Authorization: Basic  . base64_encode
  (username:password).\r\n;
  $out .= Content-length:  . strlen($poststr) . \r\n;
  $out .= Connection: Close\r\n\r\n;
  $out .= $poststr . \r\n\r\n;
 
  fwrite($fp, $out);
  while

[twitter-dev] Re: track syntax

2009-08-31 Thread Joel Strellner
If he really is using only those two lines, then yes, it explains some of
his error.  The header lines are what tell the server what page you're
looking for on stream.twitter.com.

-Joel

On Mon, Aug 31, 2009 at 1:33 PM, Joseph Cheek jos...@cheek.com wrote:


 are you really just opening stream.twitter.org?  Normally you would want
 to open http://stream.twitter.org/path/to/url.xml...

 Joseph Cheek
 jos...@cheek.com, www.cheek.com
 twitter: http://twitter.com/cheekdotcom



 John Kalucki wrote:
  You can set both the track and follow parameters when using the /1/
  statuses/filter URL.
 
  Best practices around changing your predicate:
 
 http://apiwiki.twitter.com/Streaming-API-Documentation#UpdatingFilterPredicates
 
  I can't answer PHP questions, sorry.
 
  -John Kalucki
  http://twitter.com/jkalucki
  Services, Twitter Inc.
 
 
  On Aug 31, 12:32 pm, Polymatheus world.mo...@gmail.com wrote:
 
  I've used the code above to start streaming and then dumping the
  output to a text file every hour to process later. There are a few
  things I want to clarify,
 
  1) How can the above script be amended to show both follow and track?
  Is this possible?
  2) If I opened a stream to follow 10 users and then a further 5 users
  joined my site, would I have to close the first stream then open a new
  stream for the 15 follows? Or just a second stream with the additional
  5 users? The second approach reduce the chance of tweets being lost
  between closing and opening a new stream.
  3) I can't seem to close a stream that I opened using the sample code
  above, I have tried changing the file to simply say:
 
   $fp = fsockopen("stream.twitter.com", 80, $errno, $errstr, 10);
  fclose($fp);
 
  But it doesn't appear to work :/
 
  Thanks in advance
 
 




[twitter-dev] Re: Streaming API -- CHANGE REQUIRED -- URL rationalization

2009-08-26 Thread Joel Strellner
When does this change go into effect?

-Joel

On Wed, Aug 26, 2009 at 12:06 PM, John Kalucki jkalu...@gmail.com wrote:


 The resources in the Streaming API have been rationalized. You'll need
 to update the URLs that streaming clients are using over the next two
 weeks. The old URLs will be deprecated on or after September 9, 2009.
 The change is documented in the Wiki:
 http://apiwiki.twitter.com/Streaming-API-Documentation,
 specifically in
 http://apiwiki.twitter.com/Streaming-API-Documentation#Methods.

  The new scheme allows for API versioning and streams that contain objects
  other than statuses, separates access level control from URLs, and
  allows multiple filter predicates to be specified on a single
  connection. The cute resource names have, sadly, been dropped as we move
  towards pushing the service out of Alpha. Also, /track and friends
  have been merged with /follow and friends into a single resource. When
  you connect to a given resource, you will automatically be given the
  highest access level possible.

 The following is a mapping from the old URLs to the new URLs.
 Otherwise, you should notice only subtle changes to the Streaming API
 error handling behavior. All other functionality should continue to
 work as in the past.

  /firehose -> /1/statuses/firehose
  /spritzer, /gardenhose -> /1/statuses/sample
  /birddog, /shadow, /follow -> /1/statuses/filter
  /partner/track, /restricted/track, /track -> /1/statuses/filter

 For example, if you have been connecting to /gardenhose.json, connect
 to /1/statuses/sample.json.
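
  One trivial way to handle the cutover in client code is a lookup table built
  from the mapping above (a sketch; adjust the format suffix to whatever you
  actually request):

  <?php
  // Old streaming resources mapped to their new, versioned equivalents.
  $stream_paths = array(
      '/firehose.json'   => '/1/statuses/firehose.json',
      '/spritzer.json'   => '/1/statuses/sample.json',
      '/gardenhose.json' => '/1/statuses/sample.json',
      '/birddog.json'    => '/1/statuses/filter.json',
      '/shadow.json'     => '/1/statuses/filter.json',
      '/follow.json'     => '/1/statuses/filter.json',
      '/track.json'      => '/1/statuses/filter.json',
  );

  $old = '/gardenhose.json';
  $new = isset($stream_paths[$old]) ? $stream_paths[$old] : $old;
  ?>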

 Note that the Streaming API is still in Alpha test.

 -John Kalucki
 http://twitter.com/jkalucki
 Services, Twitter Inc.





[twitter-dev] Re: efficient application that follows keywords

2009-08-23 Thread Joel Strellner
Look into the track streaming API method.  That should be the least
resource intensive way to track keywords.

If you can't use the streaming API, make sure you use the since_id
parameter.

-Joel
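
If you do end up polling the Search API, a since_id loop looks roughly like this
(illustrative only; tune rpp and the sleep per keyword, and note the field names
are as the Search API returned them at the time):

<?php
// Poll the Search API for one keyword, only asking for tweets newer than the last seen id.
$keyword  = 'keyword1';
$since_id = 0;

while (true) {
    $url = 'http://search.twitter.com/search.json?q=' . urlencode($keyword)
         . '&rpp=100&since_id=' . $since_id;
    $response = json_decode(file_get_contents($url));

    if ($response && !empty($response->results)) {
        foreach ($response->results as $tweet) {
            // cache $tweet->id, $tweet->from_user and $tweet->text locally
        }
        $since_id = $response->max_id;   // highest id covered by this query
    }
    sleep(60);   // poll popular keywords more often, unpopular ones less
}
?>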

On Sun, Aug 23, 2009 at 4:29 AM, Qcho victor.p...@gmail.com wrote:


 Hello

 I am a new twitter developer and I am developing an application that
  follows several keywords. Each keyword will have its URL (.com/keyw1)
  and the page will show the tweets
 containing that keyword.

 I am planning to use the Search API.

 Because the API returns only the matches for the latest x days (30?),
 I will have to do a cache of the posts in my local database to include
 tweets older than x days in my results.
 And with this method I can use the since_id parameter and not overflow
 twitter's servers.

 The number of keywords can grow very big, so I will update more
 frequently the popular keywords than the less popular ones. Is there
 any other API restriction/limit I should be aware of, or any other
  thing I can do to make the queries efficient and not waste mine or
 twitter's resources?

 Thank you

 Victor



[twitter-dev] Re: Stop playing around with Source parameters

2009-08-22 Thread Joel Strellner
Ummm... strip_tags()?
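
In other words, don't depend on the attribute order at all - strip the markup,
or pull the href out with a pattern that tolerates extra attributes. A quick
sketch:

<?php
// The source field may gain or reorder attributes, so don't pattern-match the exact markup.
$source = '<a href="http://fun140.com/" rel="nofollow">Fun140</a>';

$name = strip_tags($source);                        // "Fun140"

$href = null;
if (preg_match('/href="([^"]+)"/i', $source, $m)) { // works wherever rel="nofollow" sits
    $href = $m[1];                                  // "http://fun140.com/"
}
?>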

On Fri, Aug 21, 2009 at 9:17 PM, TCI ticoconid...@gmail.com wrote:


 Recently you added nofollow's, and now you moved the nofollow after
 the href. Some of us filter these out and you changing them is only
 making it more complicated. Please make up your mind and stop changing
 these...

  <a href="http://fun140.com/">Fun140</a>

  <a rel="nofollow" href="http://fun140.com/">Fun140</a>

  <a href="http://fun140.com/" rel="nofollow">Fun140</a>



[twitter-dev] Re: Track vs. Search

2009-08-20 Thread Joel Strellner
No, and yes.  Track does not allow for getting that high a number of
results; you will be throttled, and thus you will miss links.

-Joel

On Thu, Aug 20, 2009 at 7:21 AM, Mark Nutter marknut...@gmail.com wrote:


 I'm writing an application that needs a realtime stream of the links
 being posted on Twitter.  I've been using the search api with a search
 for the term 'http' but I stumbled across the track firehose method
 which appears to offer the same functionality but with the added bonus
 of including the user profile information inline.  Does track provide
 the same real-time stream that the search api does?  Would I be
 missing links at all by using track?  If not, I'll switch over to that
 immediately.  This will greatly reduce the number of API calls I need
 to make to update user information.  Thanks!



[twitter-dev] Re: Developer Preview: Geolocation API

2009-08-20 Thread Joel Strellner
Hi Ryan,

Will this data be available in the streaming API too?

-Joel

On Thu, Aug 20, 2009 at 2:11 PM, @epc epcoste...@gmail.com wrote:


 Will twitter validate the coordinates (ie, what will the API do when I
  pass lat=777&long=-666)?

 If the coordinates are invalid, will the status get posted or will the
 entire request get rejected with a 4xx code?

  If a user has not enabled geolocating (<geo_enabled>false</geo_enabled>),
  what happens if I pass in coordinates for that user?
 Silently ignored?

 Geo data will be attached to individual tweets and not users, right?
 This will have no effect on the location field in a user profile?
 --
 -ed costello



[twitter-dev] Re: track syntax

2009-08-10 Thread Joel Strellner
Tom,

Yes, that code works perfectly for me exactly as is.  You might want to
change the connect timeout from 10 to 30 seconds.

How long are you waiting before calling it quits?  It does take a few
seconds for /track to start sending your results.

-Joel


On Sun, Aug 9, 2009 at 8:17 PM, Tom Fitzgerald ccexpe...@gmail.com wrote:


 I'm sorry Joel I keep getting a PHP timeout on the code you sent. I'll
 troubleshoot more and see if I can give you any more details (increase
 the maximum time, etc). Who knows, maybe its Twitter. Any other
 thoughts? I'll get back to you more with some detailed info. Are you
 able to get that exact code working on your server?

 On Aug 3, 2:12 pm, Joel Strellner j...@twitturly.com wrote:
  Hi Tom,
 
  I am not sure about XML, since I use JSON - it has a much lower
  over-the-wire data size, and its easier to parse.
 
  Let me know if the code works for you.
 
  -Joel
 
 
 
  On Mon, Aug 3, 2009 at 10:55 AM, Tom Fitzgerald ccexpe...@gmail.com
 wrote:
 
   I appreciate the reply Joel. I'll give it a try. I also tried just
   downloading from the stream api with the curl command line. However I
   kept getting 'malformed xml' errors. It was weird, each tweet would
   have the ?xml ... tag before it. That ring any bells with you? Same
   thing with JSON format but it was a different error, still malformed.
   All I'm doing is
 
   curlhttp://stream.twitter.com/spritzer.xml-uuser:pass
 
   ?xml version=1.0 encoding=UTF-8? is the exact line I get before
   every status. If I manually clean up the XML (or JSON) it works great.
 
   On Aug 2, 7:10 pm, Joel Strellner j...@twitturly.com wrote:
Other than my username and password, this is an example that I know
 is
working:
 
?php
$count = 1;
$startparsing = false;
 
$keyword_needles[] = 'twitter';
$keyword_needles[] = 'keyword2';
$keyword_needles[] = 'keyword3';
$keyword_needles[] = 'keyword4';
 
// if your keywords have spaces, they must be urlencoded (twitter
 does
not support phrases, only the first keyword will be used, the space
character and after will be ignored)
foreach ($keyword_needles AS $i=$needle) {
$keyword_needles[$i] = urlencode($needle);
 
}
 
$poststr = 'track=' . implode(',', $keyword_needles);
$fp = fsockopen(stream.twitter.com, 80, $errno, $errstr, 10);
if (!$fp) {
echo $errstr ($errno)\n;
 
} else {
 
$out = POST /track.json HTTP/1.1\r\n;
$out .= Host: stream.twitter.com\r\n;
$out .= User-Agent: YourUserAgent\r\n;
$out .= Referer:http://yourdomain.com\r\n;;
$out .= Content-Type: application/x-www-form-urlencoded\r\n;
$out .= Authorization: Basic  . base64_encode
(username:password).\r\n;
$out .= Content-length:  . strlen($poststr) . \r\n;
$out .= Connection: Close\r\n\r\n;
$out .= $poststr . \r\n\r\n;
 
fwrite($fp, $out);
while (!feof($fp)) {
$line = fgets($fp, 4096);
if ($startparsing) {
if (trim($line) != '') {
echo trim($line) . \n;
$tweet_obj = json_decode(trim($line));
// do your stuff here
}
}
else {
// view the header lines: uncomment the below line
echo trim($line) . \n;
 
$header_arr[] = $line;
$headercount = count($header_arr)-1;
 
if (trim($header_arr[$headercount]) == '') {
$startparsing = true;
$count = 1;
unset($header_arr, $headercount);
}
}
if (trim($line) != '') $count++;
}
fclose($fp);}
 
?
 
The only changes I made was to echo things out and I added a keyword
that I was sure would have volume - twitter.  It did take a little
 bit
of time to connect.  I am assuming that that is because of their
current load though, and not the script.
 
On Aug 2, 2:25 am, Tom Fitzgerald ccexpe...@gmail.com wrote:
 
Joel,
 
 For some reason when I try your code I get a timeout error. Any
 suggestions? What you have is exactly what I'm looking for. It
 could
 really help me out a jam, thanks!
 
 On Jul 27, 4:02 pm,JoelStrellner j...@twitturly.com wrote:
 
  Here is a working example of how to do /track:
 
  $count = 1;
  $startparsing = false;
 
  $keyword_needles[] = 'keyword1';
  $keyword_needles[] = 'keyword2';
  $keyword_needles[] = 'keyword3';
  $keyword_needles[] = 'keyword4';
 
  // if your keywords have spaces, they must be urlencoded (twitter
   does not
  support phrases, only the first keyword will be used, the space
   character
  and after will be ignored)
  foreach ($keyword_needles AS $i=$needle) {
  $keyword_needles[$i] = urlencode($needle);
 
  }
 
  $poststr = 'track=' . implode

[twitter-dev] Re: track syntax

2009-08-03 Thread Joel Strellner
Hi Tom,

I am not sure about XML, since I use JSON - it has a much lower
over-the-wire data size, and it's easier to parse.

Let me know if the code works for you.

-Joel

On Mon, Aug 3, 2009 at 10:55 AM, Tom Fitzgerald ccexpe...@gmail.com wrote:


 I appreciate the reply Joel. I'll give it a try. I also tried just
 downloading from the stream api with the curl command line. However I
 kept getting 'malformed xml' errors. It was weird, each tweet would
  have the <?xml ... ?> tag before it. Does that ring any bells with you? Same
 thing with JSON format but it was a different error, still malformed.
 All I'm doing is

 curl http://stream.twitter.com/spritzer.xml -uuser:pass

  <?xml version="1.0" encoding="UTF-8"?> is the exact line I get before
 every status. If I manually clean up the XML (or JSON) it works great.

 On Aug 2, 7:10 pm, Joel Strellner j...@twitturly.com wrote:
  Other than my username and password, this is an example that I know is
  working:
 
  ?php
  $count = 1;
  $startparsing = false;
 
  $keyword_needles[] = 'twitter';
  $keyword_needles[] = 'keyword2';
  $keyword_needles[] = 'keyword3';
  $keyword_needles[] = 'keyword4';
 
  // if your keywords have spaces, they must be urlencoded (twitter does
  not support phrases, only the first keyword will be used, the space
  character and after will be ignored)
  foreach ($keyword_needles AS $i=$needle) {
  $keyword_needles[$i] = urlencode($needle);
 
  }
 
  $poststr = 'track=' . implode(',', $keyword_needles);
  $fp = fsockopen(stream.twitter.com, 80, $errno, $errstr, 10);
  if (!$fp) {
  echo $errstr ($errno)\n;
 
  } else {
 
  $out = POST /track.json HTTP/1.1\r\n;
  $out .= Host: stream.twitter.com\r\n;
  $out .= User-Agent: YourUserAgent\r\n;
  $out .= Referer:http://yourdomain.com\r\n;;
  $out .= Content-Type: application/x-www-form-urlencoded\r\n;
  $out .= Authorization: Basic  . base64_encode
  (username:password).\r\n;
  $out .= Content-length:  . strlen($poststr) . \r\n;
  $out .= Connection: Close\r\n\r\n;
  $out .= $poststr . \r\n\r\n;
 
  fwrite($fp, $out);
  while (!feof($fp)) {
  $line = fgets($fp, 4096);
  if ($startparsing) {
  if (trim($line) != '') {
  echo trim($line) . \n;
  $tweet_obj = json_decode(trim($line));
  // do your stuff here
  }
  }
  else {
  // view the header lines: uncomment the below line
  echo trim($line) . \n;
 
  $header_arr[] = $line;
  $headercount = count($header_arr)-1;
 
  if (trim($header_arr[$headercount]) == '') {
  $startparsing = true;
  $count = 1;
  unset($header_arr, $headercount);
  }
  }
  if (trim($line) != '') $count++;
  }
  fclose($fp);}
 
  ?
 
  The only changes I made was to echo things out and I added a keyword
  that I was sure would have volume - twitter.  It did take a little bit
  of time to connect.  I am assuming that that is because of their
  current load though, and not the script.
 
  On Aug 2, 2:25 am, Tom Fitzgerald ccexpe...@gmail.com wrote:
 
 
 
  Joel,
 
   For some reason when I try your code I get a timeout error. Any
   suggestions? What you have is exactly what I'm looking for. It could
   really help me out a jam, thanks!
 
   On Jul 27, 4:02 pm,JoelStrellner j...@twitturly.com wrote:
 
Here is a working example of how to do /track:
 
$count = 1;
$startparsing = false;
 
$keyword_needles[] = 'keyword1';
$keyword_needles[] = 'keyword2';
$keyword_needles[] = 'keyword3';
$keyword_needles[] = 'keyword4';
 
// if your keywords have spaces, they must be urlencoded (twitter
 does not
support phrases, only the first keyword will be used, the space
 character
and after will be ignored)
foreach ($keyword_needles AS $i=$needle) {
$keyword_needles[$i] = urlencode($needle);
 
}
 
$poststr = 'track=' . implode(',', $keyword_needles);
$fp = fsockopen(stream.twitter.com, 80, $errno, $errstr, 30);
if (!$fp) {
echo $errstr ($errno)\n;} else {
 
$out = POST /track.json HTTP/1.1\r\n;
$out .= Host: stream.twitter.com\r\n;
$out .= User-Agent: YourUserAgent\r\n;
$out .= Referer:http://yourdomain.com\r\n;;
$out .= Content-Type: application/x-www-form-urlencoded\r\n;
$out .= Authorization: Basic  .
base64_encode(username:password).\r\n;
$out .= Content-length:  . strlen($poststr) . \r\n;
$out .= Connection: Close\r\n\r\n;
$out .= $poststr . \r\n\r\n;
 
fwrite($fp, $out);
while (!feof($fp)) {
$line = fgets($fp, 4096);
if ($startparsing) {
if (trim($line) != '') {
//echo trim($line) . \n;
$tweet_obj = json_decode(trim($line));
// do your

[twitter-dev] Re: track syntax

2009-08-02 Thread Joel Strellner

Other than my username and password, this is an example that I know is
working:

<?php
$count = 1;
$startparsing = false;

$keyword_needles[] = 'twitter';
$keyword_needles[] = 'keyword2';
$keyword_needles[] = 'keyword3';
$keyword_needles[] = 'keyword4';

// if your keywords have spaces, they must be urlencoded (twitter does
// not support phrases, only the first keyword will be used, the space
// character and after will be ignored)
foreach ($keyword_needles AS $i=>$needle) {
    $keyword_needles[$i] = urlencode($needle);
}

$poststr = 'track=' . implode(',', $keyword_needles);
$fp = fsockopen("stream.twitter.com", 80, $errno, $errstr, 10);
if (!$fp) {
    echo "$errstr ($errno)\n";
} else {

    $out = "POST /track.json HTTP/1.1\r\n";
    $out .= "Host: stream.twitter.com\r\n";
    $out .= "User-Agent: YourUserAgent\r\n";
    $out .= "Referer: http://yourdomain.com\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Authorization: Basic " . base64_encode("username:password") . "\r\n";
    $out .= "Content-length: " . strlen($poststr) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $poststr . "\r\n\r\n";

    fwrite($fp, $out);
    while (!feof($fp)) {
        $line = fgets($fp, 4096);
        if ($startparsing) {
            if (trim($line) != '') {
                echo trim($line) . "\n";
                $tweet_obj = json_decode(trim($line));
                // do your stuff here
            }
        }
        else {
            // view the header lines: uncomment the below line
            echo trim($line) . "\n";

            $header_arr[] = $line;
            $headercount = count($header_arr)-1;

            if (trim($header_arr[$headercount]) == '') {
                $startparsing = true;
                $count = 1;
                unset($header_arr, $headercount);
            }
        }
        if (trim($line) != '') $count++;
    }
    fclose($fp);
}
?>

The only changes I made were to echo things out and to add a keyword
that I was sure would have volume - twitter.  It did take a little bit
of time to connect.  I am assuming that is because of their
current load, though, and not the script.



On Aug 2, 2:25 am, Tom Fitzgerald ccexpe...@gmail.com wrote:
 Joel,

 For some reason when I try your code I get a timeout error. Any
 suggestions? What you have is exactly what I'm looking for. It could
 really help me out a jam, thanks!

 On Jul 27, 4:02 pm, Joel Strellner j...@twitturly.com wrote:

  Here is a working example of how to do /track:

  $count = 1;
  $startparsing = false;

  $keyword_needles[] = 'keyword1';
  $keyword_needles[] = 'keyword2';
  $keyword_needles[] = 'keyword3';
  $keyword_needles[] = 'keyword4';

  // if your keywords have spaces, they must be urlencoded (twitter does not
  support phrases, only the first keyword will be used, the space character
  and after will be ignored)
  foreach ($keyword_needles AS $i=$needle) {
      $keyword_needles[$i] = urlencode($needle);

  }

  $poststr = 'track=' . implode(',', $keyword_needles);
  $fp = fsockopen(stream.twitter.com, 80, $errno, $errstr, 30);
  if (!$fp) {
      echo $errstr ($errno)\n;} else {

      $out = POST /track.json HTTP/1.1\r\n;
      $out .= Host: stream.twitter.com\r\n;
      $out .= User-Agent: YourUserAgent\r\n;
      $out .= Referer:http://yourdomain.com\r\n;;
      $out .= Content-Type: application/x-www-form-urlencoded\r\n;
      $out .= Authorization: Basic  .
  base64_encode(username:password).\r\n;
      $out .= Content-length:  . strlen($poststr) . \r\n;
      $out .= Connection: Close\r\n\r\n;
      $out .= $poststr . \r\n\r\n;

      fwrite($fp, $out);
      while (!feof($fp)) {
          $line = fgets($fp, 4096);
          if ($startparsing) {
              if (trim($line) != '') {
                  //echo trim($line) . \n;
                  $tweet_obj = json_decode(trim($line));
                  // do your stuff here
              }
          }
          else {
              // view the header lines: uncomment the below line
              //echo trim($line) . \n;

              $header_arr[] = $line;
              $headercount = count($header_arr)-1;

              if (trim($header_arr[$headercount]) == '') {
                  $startparsing = true;
                  $count = 1;
                  unset($header_arr, $headercount);
              }
          }
          if (trim($line) != '') $count++;
      }
      fclose($fp);

  }
  On Mon, Jul 27, 2009 at 11:18 AM, Joseph northwest...@gmail.com wrote:

   I am trying to use the track streaming API, but I'm having trouble
   with the syntax. The example shows how to use a tracking list stored
   in a file in the format: track = word1, word2, etc..

   I tried the following (following a successful fsockopen call to
   stream.twitter.api:

   POST /track.json track = Palin, #fubar HTTP/1.1 Host:
   stream.twitter.com User-Agent: UserAgent Authorization: Basic
   cGXybmFzc3TzZGV2OlBhcm5hMzT1Mzl4MDMz Connection: Close

   and I am getting

[twitter-dev] Re: track syntax

2009-07-27 Thread Joel Strellner
Here is a working example of how to do /track:

$count = 1;
$startparsing = false;

$keyword_needles[] = 'keyword1';
$keyword_needles[] = 'keyword2';
$keyword_needles[] = 'keyword3';
$keyword_needles[] = 'keyword4';

// if your keywords have spaces, they must be urlencoded (twitter does not
// support phrases, only the first keyword will be used, the space character
// and after will be ignored)
foreach ($keyword_needles AS $i=>$needle) {
    $keyword_needles[$i] = urlencode($needle);
}

$poststr = 'track=' . implode(',', $keyword_needles);
$fp = fsockopen("stream.twitter.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)\n";
} else {
    $out = "POST /track.json HTTP/1.1\r\n";
    $out .= "Host: stream.twitter.com\r\n";
    $out .= "User-Agent: YourUserAgent\r\n";
    $out .= "Referer: http://yourdomain.com\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Authorization: Basic " . base64_encode("username:password") . "\r\n";
    $out .= "Content-length: " . strlen($poststr) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $poststr . "\r\n\r\n";

    fwrite($fp, $out);
    while (!feof($fp)) {
        $line = fgets($fp, 4096);
        if ($startparsing) {
            if (trim($line) != '') {
                //echo trim($line) . "\n";
                $tweet_obj = json_decode(trim($line));
                // do your stuff here
            }
        }
        else {
            // view the header lines: uncomment the below line
            //echo trim($line) . "\n";

            $header_arr[] = $line;
            $headercount = count($header_arr)-1;

            if (trim($header_arr[$headercount]) == '') {
                $startparsing = true;
                $count = 1;
                unset($header_arr, $headercount);
            }
        }
        if (trim($line) != '') $count++;
    }
    fclose($fp);
}


On Mon, Jul 27, 2009 at 11:18 AM, Joseph northwest...@gmail.com wrote:


 I am trying to use the track streaming API, but I'm having trouble
 with the syntax. The example shows how to use a tracking list stored
 in a file in the format: track = word1, word2, etc..

 I tried the following (following a successful fsockopen call to
  stream.twitter.com):

 POST /track.json track = Palin, #fubar HTTP/1.1 Host:
 stream.twitter.com User-Agent: UserAgent Authorization: Basic
 cGXybmFzc3TzZGV2OlBhcm5hMzT1Mzl4MDMz Connection: Close

 and I am getting the following error code: HTTP/1.1 400 Bad Request

 The actual relevant test PHP code is:

 $fp = fsockopen($url, 80, $errno, $errstr, 30);
 if (!$fp) {
     echo "$errstr ($errno)\n";
 } else {
     echo "file handle: " . $fp . "<br/>";
     $header = "POST /track.json track = $trackTerms HTTP/1.1\r\n";
     $header .= "Host: stream.twitter.com\r\n";
     $header .= "User-Agent: UserAgent\r\n";
     $header .= "Authorization: Basic " . base64_encode($twitUsername .
         ':' . $twitPwd) . "\r\n";
     $header .= "Connection: Close\r\n\r\n";
     echo $header . "<br/>";
     fwrite($fp, $header);
     $line = fgets($fp, 4096);
     echo $line;
     fclose($fp);
 }



[twitter-dev] Re: Search / track term

2009-07-24 Thread Joel Strellner
No.  If you are tracking 3 things, for example, the only way to determine
which of those 3 terms matched would be for you to search within the tweet
for your terms and determine it yourself.

On Fri, Jul 24, 2009 at 5:30 PM, Joseph northwest...@gmail.com wrote:


 If I'm tracking a hash tag (using the streaming API), will that hash
 tag (or search term), be returned as part of the JSON stream I'm
 receiving?



[twitter-dev] Re: The Gardenhose Cooperative

2009-07-22 Thread Joel Strellner
I wonder if there is a way that Twitter could do the verification.
Self-verification is always vulnerable.  It'd be nice if Twitter had some sort
of way to be involved and tell the provider of the backed-up data what level
of access a user has.

On Wed, Jul 22, 2009 at 3:41 PM, braver delivera...@gmail.com wrote:


 After we lost a few days of gardenhose, I'm wondering whether it would
 be OK for us gardenhosers to back up each other.  In case we do
 research, for instance -- as we do at Dartmouth.

  I suggest the following: say you lost a day or a few within the range
 since you were authorized, and are a member of our garden variety
 cooperative.  You ask me to fill you in, and tell me the day you
 started gathering the hose.  I pick a day for which you have data, and
 ask you to verify a few tweets somehow -- e.g. tell me which tweet ids
 there are for a certain user id.

 Would it be OK to self-organize like that, and who'd be our buddy?
 Cheers,
 Alexy



[twitter-dev] Re: Ideas for URL autodetection (like tweetmeme RT-Button)?

2009-07-20 Thread Joel Strellner
Most blogging platforms have a way to insert the post's permanent URL.  Maybe
look into that?

On Mon, Jul 20, 2009 at 4:06 AM, Bjoern bjoer...@googlemail.com wrote:


  I was interested to read about the way Tweetmeme sets up its retweet
 button: http://help.tweetmeme.com/2009/04/06/tweetmeme-button/

 I also want to create a pure javascript Twitter service that people
 could integrate into their web site. It also needs to know the URL of
 the site it is sitting on, so that people can tweet through it, and it
 can find the respective tweets.

 The problem is that for example with a typical Wordpress blog, an
 article (containing the script) would appear on multiple sites, so URL
 autodetection doesn't seem like a good option. Even specifying the URL
 beforehand seems less than ideal, because blog writers won't
 necessarily know the URL of an article before they submit it. So they
 would have to submit the article and then go back and

 I suppose Tweetmeme gets around this by providing a Wordpress plugin,
 but that would be overkill for me atm (though conceivable for the
 future).

 Just wondering if there are any other ideas out there to approach the
 problem?

 Björn



[twitter-dev] Re: Keep getting suspended

2009-07-20 Thread Joel Strellner
Can you say what your account is doing?  Sounds like you are getting
suspended for a good reason if it has happened twice already.

On Mon, Jul 20, 2009 at 5:00 AM, sjespers se...@webkitchen.be wrote:


 That could work but if I got suspended twice in two days, I'm sure a
 test account won't last long. I need to know what is going wrong and
 why my whitelisted account and service get suspended.



 On Jul 20, 1:48 pm, Andrew Badera and...@badera.us wrote:
  On Mon, Jul 20, 2009 at 7:18 AM, sjespers se...@webkitchen.be wrote:
 
    Any news on this? My account is still suspended. I can't continue
   work on my app.
 
  Register another test account in the meantime?



[twitter-dev] Re: There's a problem with the profile.

2009-07-16 Thread Joel Strellner

Twitter caches the profile pics that you circled.  It will catch up, as long
as the other 4 that aren't shown aren't suspended, spammers, etc.

-Joel

-Original Message-
From: twitter-development-talk@googlegroups.com
[mailto:twitter-development-t...@googlegroups.com] On Behalf Of DJXpander
Sent: Wednesday, July 15, 2009 9:04 PM
To: Twitter Development Talk
Subject: [twitter-dev] There's a problem with the profile.


Hi, I don't know where to put this so I wanted to post it here. I've been
having some problems with my profile. I took a screenshot of it to
show my problem:

http://i30.tinypic.com/drclya.jpg

As you can see the following list is not right. Is there a way to fix
this?



[twitter-dev] Re: API Developers Alliance

2009-07-16 Thread Joel Strellner
Not sure that there needs to be a formal alliance, but a working group that
has the ear of twitter and can make sure needs are being met from both the
developers' and Twitter's perspectives would be good.

 

On that same note though, I feel that twitter has done a pretty good job
with this balance so far, and I do not feel that they'd do anything to
hinder developers.  As much as twitter is about being a communications tool,
it is also a platform.  I think they realize this, and hindering the
developers kills the platform.

 

So, while an alliance might be helpful, personally, I also do not see it
changing anything much. 

 

-Joel

 

 

From: twitter-development-talk@googlegroups.com
[mailto:twitter-development-t...@googlegroups.com] On Behalf Of Peter Denton
Sent: Thursday, July 16, 2009 2:35 PM
To: twitter-development-talk@googlegroups.com
Subject: [twitter-dev] API Developers Alliance

 

There is a lot of ambiguity up in the air, about api devs (third party) and
the future of the api and twitter. Apps are a huge growth vehicle and a very
significant piece of the future, getting the Twitter medium a global
behavior. 

I believe there should be a formal alliance of third party developers to
ensure that Apps have rights. The ambiguity around down-the-road, "if and
when" scenarios leaves many investors wary, teams unformed, and products
unbuilt, because at the end of the day people have to consider a massive
acquisition or change where suddenly apps are crushed by the parent. Twitter
is not facebook, and the potential for sustainable/profitable products and
services around this medium is real. If you don't agree, that's fine. I am
seeking those who believe this and want to address this.

This is not meant to be a big, serious thing. This is meant to be an action
item to those who want a developer bill of rights to happen, with
input/voice from an organized approach, and want to create some level of
insurance, to go out to investors/partners with an approach.

If anyone would like to discuss this, please let me know off the list. I am
not trying to irritate people, just gauge people's interest.

Regards
Peter 


 



[twitter-dev] Re: Searching for tweets that refer to an URL still impossible with bit.ly (and others)

2009-07-15 Thread Joel Strellner

There are 3 APIs that I know of that you can use:

Twitturly (Ours - Private beta only at the moment)
Tweetmeme
BackTweet

Between the 3 of us, I am sure you can accomplish whatever your end-goal is.

I do not think BackTweet processes all URLs, so they may not have a given URL,
but I do know that we do, and Tweetmeme does.

-Joel


-Original Message-
From: twitter-development-talk@googlegroups.com
[mailto:twitter-development-t...@googlegroups.com] On Behalf Of Bjoern
Sent: Wednesday, July 15, 2009 9:16 AM
To: Twitter Development Talk
Subject: [twitter-dev] Re: Searching for tweets that refer to an URL still
impossible with bit.ly (and others)


On Jul 15, 5:57 pm, Matt Sanford m...@twitter.com wrote:

 Have you thought about using one of the APIs built for this,  
 like backtweets [1]?

I thought about them, but only as a last resort. Did not know about
backtweets - they look good, but they also have a limit of 1000 calls/
day. I had also looked into FriendFeed, but they seem to only return
people who are also on FriendFeed.

Björn




[twitter-dev] Re: New app: twivert.com

2009-07-14 Thread Joel Strellner

I think it's fine as long as it is done only once for the app and the
user interacts here for more than just the purpose of telling people about
the new app.

On Jul 14, 10:19 am, Chad Etzel jazzyc...@gmail.com wrote:
 On Tue, Jul 14, 2009 at 12:16 PM, Andrew Baderaand...@badera.us wrote:
  When did this dev list become a self-promotion list? Can we knock this
  garbage off already? I get enough spam ON Twitter these days, I don't need
  it coming into my inbox via the dev list.

 I thought one of the purposes of the list was to promote/announce your
 new apps when they go live so that other devs are aware of new stuff
 coming out.

 Whether you like the app or not is up to you...

 -Chad


[twitter-dev] Re: Is it okay to close a connection by opening a new one?

2009-07-14 Thread Joel Strellner
Why can't you do this entirely in your code?  Why do you need to close the
connection and reconnect?

 

Closing a file, moving it, and then creating a new file can be done
extremely fast, so you shouldn't need to close your connection to
Twitter.

 

Also, if at all possible, JSON is a much better format to use.  It's smaller
over the wire, and it'll create smaller files.

 

-Joel
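
Something like the following rotates the local file hourly without ever touching
the Twitter connection (a sketch of that suggestion, assuming $fp is the
already-open streaming connection):

<?php
// Rotate the local output file every hour while the streaming connection stays open.
$out = fopen('stream-current.txt', 'a');
$next_rotation = time() + 3600;

while (!feof($fp)) {                      // $fp: the already-open streaming connection
    $line = fgets($fp, 4096);
    if (trim($line) != '') fwrite($out, $line);

    if (time() >= $next_rotation) {
        fclose($out);                     // close and move the file; the stream is untouched
        rename('stream-current.txt', 'stream-' . date('YmdHis') . '.txt');
        $out = fopen('stream-current.txt', 'a');
        $next_rotation = time() + 3600;
    }
}
?>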

 

 

From: twitter-development-talk@googlegroups.com
[mailto:twitter-development-t...@googlegroups.com] On Behalf Of Alex Payne
Sent: Tuesday, July 14, 2009 4:07 PM
To: twitter-development-talk@googlegroups.com
Subject: [twitter-dev] Re: Is it okay to close a connection by opening a new
one?

 

If you're only doing this every hour, that's fine by us.

On Tue, Jul 14, 2009 at 15:58, owkaye owk...@gmail.com wrote:


The Streaming API docs say we should avoid opening new
connections with the same user:pass when that user already
has a connection open.  But I'm hoping it is okay to do this
every hour or so, here's why:

My plan is to write the streaming XML data to a text file
during each connection -- but I don't want this file to get
so big that I have trouble processing it on the back end.
Therefore I want to rotate these files every hour ...

This means I have to stop writing to the file, close it, move
it somewhere else, and create a new file so I can use the new
file to continue storing new streaming XML data.

The obvious way for me to close these files is to close the
connection -- by opening a new connection -- because from
what I've read it seems that opening a new connection forces
the previous connection to close.

Can I do this without running into any black listing or
denial of service issues?  I mean, is this an acceptable way
to close a connection ... by opening a new one in order to
force the old connection to close?

Any info you can provide that will clarify this issue is
greatly appreciated, thanks!

Owkaye









-- 
Alex Payne - Platform Lead, Twitter, Inc.
http://twitter.com/al3x



[twitter-dev] Re: Streaming API -- Additional markup added -- Deletion notifications on track streams.

2009-07-13 Thread Joel Strellner


Wow John, that was quick. Thank you.

-Joel

On Jul 13, 2009, at 7:53 PM, John Kalucki jkalu...@gmail.com wrote:



In addition to deletion notices, limitation notices will be added to
track streams. These notices will be enabled on or after Tuesday July
14th.

Deletions will be enabled on or after Thursday July 16th, as
previously scheduled.


From the wiki,  http://apiwiki.twitter.com/Streaming-API-Documentation:


Streams may also contain status deletion notices. Clients are urged
to honor deletion requests and discard deleted statuses immediately.
   * XML:  <delete><status><id>1234</id><user_id>3</user_id></status></delete>
   * JSON: { "delete": { "status": { "id": 1234, "user_id": 3 } } }

Track streams may also contain limitation notices, where the integer
track is an enumeration of statuses that matched the track predicate
but were administratively limited. These notices will be sent each
time a limited stream becomes unlimited.
   * XML:  <limit><track>1234</track></limit>
   * JSON: { "limit": { "track": 1234 } }
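
Client-side, handling these alongside normal statuses is just a couple of extra
property checks on each decoded line (a sketch; object shapes as in the excerpt
above):

<?php
// $line holds one JSON object read from the stream.
$obj = json_decode(trim($line));

if (isset($obj->delete->status->id)) {
    // deletion notice: drop this status from local storage right away
    $deleted_id = $obj->delete->status->id;
} elseif (isset($obj->limit->track)) {
    // limitation notice: count of matching statuses withheld from this stream
    $limited = $obj->limit->track;
} elseif (isset($obj->text)) {
    // an ordinary status
}
?>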


-John Kalucki
twitter.com/jkalucki

Services, Twitter Inc.


[twitter-dev] Re: Streaming API -- New objects in markup stream. Test your code

2009-07-11 Thread Joel Strellner


John,

Any chance you could allow us to send an additional variable when we
connect and have you guys send it in the new format? That would allow for
overlap and testing.


-Joel

On Jul 11, 2009, at 7:04 AM, John Kalucki jkalu...@gmail.com wrote:



Laurent,

There are examples of the new objects on the Streaming API wiki. The
XML and JSON formats are, sadly, not orthogonal. The objects aren't
flowing to give developers time to adjust. We'll probably enable this
in the middle of next week.

-John



On Jul 11, 12:34 am, Laurent Eschenauer laurent.eschena...@gmail.com
wrote:

Hi John,

I can't find any such thing as a status or delete object in the JSON feed.
There is indeed a status envelope in the XML, but the corresponding
JSON object seems to be already one level deeper, only encapsulating
the data from the status itself.

Could you please clarify what we should be expecting to see in JSON?
And maybe also provide some sample objects in the Wiki, both for XML
and JSON?

Thanks !

-Laurent

On Jul 10, 8:00 pm, John Kalucki jkalu...@gmail.com wrote:


Note: The Streaming API is currently under a limited alpha test,
details below.



Please test that your Streaming API clients can handle unexpected
objects in the markup stream. Status deletion notice support is being
added, but will be disabled until at least Thursday July 16th to allow
developers a chance to check their code. From the wiki,
http://apiwiki.twitter.com/Streaming-API-Documentation:


Streams may also contain status deletion notices. Clients are urged to
honor deletion requests and discard deleted statuses immediately.

* XML:  <delete><status><id>1234</id><user_id>3</user_id></status></delete>
* JSON: { "delete": { "status": { "id": 1234, "user_id": 3 } } }



Objects other than status and delete may be introduced into the
markup stream in a future release. Please ensure that your parser is
tolerant of unexpected objects.



Important Alpha Test Note:
The Streaming API (aka Hosebird) is currently under an alpha test. All
developers using the Streaming API must tolerate possible unannounced
and extended periods of unavailability, especially during off-hours,
Pacific Time. New features, resources and policies are being deployed
on very little, if any, notice. Any developer may experiment with the
unrestricted resources and provide feedback via this list. Access to
restricted resources is extremely limited and is only granted on a
case-by-case basis after acceptance of an additional terms of service
document.



-John Kalucki
twitter.com/jkalucki
Services, Twitter Inc.


[twitter-dev] Re: stopping bit.ly automatic shortening of urls

2009-07-08 Thread Joel Strellner


It goes back to the root of twitter originally being an SMS
application. I recall hearing or reading someone on the Twitter team
saying that.


-Joel

On Jul 8, 2009, at 1:31 PM, whoiskb whoi...@gmail.com wrote:



I am curious if there has ever been an official response from twitter
on why some simple HTML has not been allowed in a tweet?  If we were
able to use an anchor tag, and the HTML did not count against the 140
character limit, then a URL shortener service would not
be needed.


On Jul 8, 10:27 am, sull sullele...@gmail.com wrote:

ironically, my example urls are shortened here ;)

On Jul 8, 12:20 pm, sull sullele...@gmail.com wrote:




this is a topic of interest to me for a long while.
been meaning to start a thread.



i'm often bothered by the automatic shortening of urls when in fact
the url does not need to be shortened.  in these cases, i of  
course do
not want to hide the real url by using a forced 3rd party service  
like

bit.ly.
i have use cases where all that is posted is a url.  and the url
includes a long detailed description of the link.  this, in my
opinion, is smart as the only object to maintain is the url itself
which provides a hyperlink and a short message combined.  sometimes,
these use cases are using natural language vanity urls to form short
sentences.


ie.http://john.tot.al.ly/wiped-out-on-this-huge-wave-in-hawaii-at- 
the-Su...


the other annoying thing that is related to the twitter UI is how  
long

urls are cut-off//trimmed even if they dont need to be.  the above
example would be destroyed because it would result in something
like:



http://john.tot.al.ly/wip


actually, i'm not certain if that is still the case as it seems to  
me

that every url is shortened with bit.ly now.  i grok the value in
tracking urls and bit.ly may be bought by twitter at some point and
this notion of url tracking will be fully integrated but the
debate about url shorteners in general - how they can break the
natural web, are vulnerable to massive broken links, and the simply
cryptic format itself that hides the true location - are all to be
considered and continued to be debated.



at the very least, 3rd party developers should get an override
toggle.
that is something i think we all need to start demanding.



and yes, an official doc explaining the current and future
implementations of url shortening on twitter is definitely needed now.


http://plea.se/twitter-dont-shorten-this-url-with-bitly-since-it-does...



http://twitter.com/sull/status/2534470050



@sull



On Jul 8, 4:50 am, Swaroop rh.swar...@gmail.com wrote:



However, if you paste in a link that is less than 30 characters,
we'll post it in its entirety.  If it's longer than 30 characters,
we'll convert it to a shorter URL.



Source: http://help.twitter.com/portal
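
Purely to illustrate the rule quoted above (this is not Twitter's code,
just a restatement of the published threshold), the decision reduces to
a simple length check in Python:

    def needs_shortening(url, limit=30):
        """Links at or under the limit are posted as-is; longer ones get shortened."""
        return len(url) > limit

    print(needs_shortening("http://twitter.com"))  # False: posted in its entirety
    print(needs_shortening("http://example.com/a-very-long-descriptive-path"))  # True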


[twitter-dev] Re: @username matches and hash tags

2009-06-30 Thread Joel Strellner
I think usernames are a max of 20 characters, up from 15 a few months back.

 

-Joel

 

From: twitter-development-talk@googlegroups.com 
[mailto:twitter-development-t...@googlegroups.com] On Behalf Of JDG
Sent: Tuesday, June 30, 2009 4:43 PM
To: twitter-development-talk@googlegroups.com
Subject: [twitter-dev] Re: @username matches and hash tags

 

i would imagine that for hashtags it's 139 characters. Can't speak to usernames 
though.

On Tue, Jun 30, 2009 at 17:39, Scott Haneda talkli...@newgeo.com wrote:


Awesome, thanks.  I found I had to use a \B, so this works for me:
\B@([A-Za-z0-9_]+)

Word boundary is indeed very handy, works perfectly, and the hash tag
one works close enough.

I know there are limits on length, what are they for both hash tags and
@usernames?



On Jun 30, 2009, at 10:57 AM, Chad Etzel wrote:


For usernames, you could add the word boundary special character
(sometimes dependent on your programming language)

\b@([A-Za-z0-9_]+)

which should avoid email addresses.

For hashtags I use a similar one:

\b#([A-Za-z0-9_-]+)

I don't think #foo/bar is a valid hashtag, so I don't account for those.
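
As a small worked example (not official matching rules, just the
patterns from this thread applied in Python), the \B form that Scott
arrived at matches mentions at the start of a line or after whitespace
while still skipping e-mail addresses, and the same substitution works
for hashtags:

    import re

    # \B (not a word boundary) before the marker character prevents
    # matching the "@" inside addresses like joe@example.com.
    MENTION_RE = re.compile(r"\B@([A-Za-z0-9_]+)")
    HASHTAG_RE = re.compile(r"\B#([A-Za-z0-9_-]+)")

    def extract_entities(text):
        return MENTION_RE.findall(text), HASHTAG_RE.findall(text)

    mentions, tags = extract_entities("@jack ping joe@example.com about #api-changes")
    print(mentions)  # ['jack']
    print(tags)      # ['api-changes']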

 

-- 
Scott * If you contact me off list replace talklists@ with scott@ *




-- 
Internets. Serious business.



[twitter-dev] Re: Search API since_id doesn't work with filter:links

2009-06-24 Thread Joel Strellner

We are seeing this as well.

On Jun 24, 4:57 am, Mojosaurus ish...@gmail.com wrote:
 Hi,

 My script polls Twitter APIs once every 15 seconds with a query
 like http://search.twitter.com/search.atom?q=video%20filter:links&rpp=100&...

 Starting 2009-06-23, this API returns http 403, with the following
 error message.
 <hash>
 <error>since date or since_id is too old</error>
 </hash>

 Did anything change in the last 24 hours? Is this a known issue, and
 when is it expected to get fixed? Any leads would be much appreciated.

 --
 thanks,
 Ishwar.
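
For context, a minimal polling sketch (not Ishwar's actual script; the
retry-without-since_id fallback is just one possible workaround for the
403 described above):

    import time
    import urllib2  # Python 2.x, contemporary with this thread

    BASE = "http://search.twitter.com/search.atom"

    def search(query, since_id=None):
        """One poll against the Search API; returns the raw response body."""
        url = "%s?q=%s&rpp=100" % (BASE, query)
        if since_id is not None:
            url += "&since_id=%d" % since_id
        try:
            return urllib2.urlopen(url).read()
        except urllib2.HTTPError as e:
            if e.code == 403:
                # "since date or since_id is too old": retry once without
                # since_id and de-duplicate results client-side instead.
                return urllib2.urlopen("%s?q=%s&rpp=100" % (BASE, query)).read()
            raise

    if __name__ == "__main__":
        newest_id = None
        while True:
            body = search("video%20filter:links", since_id=newest_id)
            # parse the Atom body here and update newest_id from the returned ids
            time.sleep(15)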


[twitter-dev] Re: Clarification on how @ messages and PM's are handled - PLEASE HELP

2009-06-24 Thread Joel Strellner

Hi Nicholas,

I don't work for twitter, but I believe that they associate IDs with
messages (tweets or DMs), and then associate the ID with each of the
users that it needs to go to.  The message is only stored once, unlike
email, which is copied to each user.

If a user deletes their message, all the users that would ordinarily
see it will no longer be able to see it.

If you send a message to @noname, and they do not exist, I believe it
is silently dropped.  It will still be in the sender's sent history,
but since @noname doesn't exist, no one would see it unless they
looked at the sender's tweets.  Since there is no @noname available,
Twitter is unable to map it to a user ID, and therefore it won't appear
(even if someone creates @noname later in the future).

-Joel
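
To make the single-copy model Joel describes concrete (purely
illustrative; these structures are guesses, not Twitter's actual
schema), the idea in Python terms:

    # One stored copy of each message, keyed by id; recipients hold
    # references to the id rather than their own copies.
    messages = {}    # message_id -> {"author": ..., "text": ...}
    timelines = {}   # user_id -> list of message_ids

    def post(message_id, author_id, text, recipient_ids):
        messages[message_id] = {"author": author_id, "text": text}
        for uid in recipient_ids:
            timelines.setdefault(uid, []).append(message_id)

    def delete(message_id):
        # Removing the single stored copy hides it from every timeline
        # that references it, matching the behavior described above.
        messages.pop(message_id, None)

    def read_timeline(user_id):
        return [messages[mid] for mid in timelines.get(user_id, [])
                if mid in messages]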

On Jun 24, 2:12 pm, Nicholas S nst...@gmail.com wrote:
 Hi Everyone,

 I would very much appreciate the group's help on this. I work for a
 local city government in Florida and we are looking into using
 Twitter to help make government more transparent and engaged with our
 community. BUT I need very technical information on how messaging is
 handled in Twitter, to understand what needs to be retained under
 Public Records laws.

 1. Scenario set-up: User2 is replying to User1. When User2 replies to
 User1 using the message {...@user1 that's a great idea}, is there any
 message sent (similar to how email is sent) where the message actually
 travels to User1?
 -or-
 Does User1's twitter program search User2's twitter postings looking
 for @User1 and then, if found, direct User1 to User2's content? Similar
 to a search engine, where it (twitter) would run a search finding the
 content @User1 and then saying "Here is a reply, go here or get it from
 here."

 --- To get into it further: if I reply to @noname and noname isn't a
 valid account / it doesn't exist, does that message get transmitted
 anywhere?

 Let's start with that, as it's confusing enough. Basically, if a message
 is transmitted to a government on Twitter, it is public record and needs
 to be retained.


[twitter-dev] Re: Spinn3r Twitter Social Media Rank

2009-06-15 Thread Joel Strellner

Kevin,

We'd be very interested in an API that could distinguish spammers from
regular users.  If you go this route, please get in touch; I'd LOVE to
test it out on our service, Twitturly.

-Joel

-Original Message-
From: twitter-development-talk@googlegroups.com
[mailto:twitter-development-t...@googlegroups.com] On Behalf Of burton
Sent: Monday, June 15, 2009 3:43 PM
To: Twitter Development Talk
Subject: [twitter-dev] Re: Spinn3r Twitter Social Media Rank


That's what we're thinking of experimenting with... perhaps an API
where you can give us a handle and we can tell you if it is spam or
ham.

Also ranking on certain topics (tech, politics, etc) would be pretty
hot.

If you have any ideas, we're all ears.

On Jun 15, 2:45 pm, Justyn Howard justyn.how...@gmail.com wrote:
 Well done. Considering an API so we could integrate rank data with other
 apps?

 On 6/15/09 3:43 PM, burton burtona...@gmail.com wrote:



  Hey guys.

  We just pushed this today:

 http://spinn3r.com/rank/twitter.php

  as part of our Spinn3r 3.1 release:

 http://blog.spinn3r.com/2009/06/spinn3r-31---now-with-twitter-support...
  al-media-ranking.html

  Would love feedback.

  If this is valuable for the community we would be willing to compute
  deeper rankings (on a deeper crawl) and recompute this more regularly
  (once every two weeks or so).

  Kevin



[twitter-dev] Re: Enable ability to block apps via Twitter or the API

2009-05-31 Thread Joel Strellner

I second this request. Ideally via both web and API: API being
immediate, and web when your UI guys can get to it.

On May 31, 2009, at 3:52 PM, Jesse Stay jesses...@gmail.com wrote:

 Not going to name names, but there are a few really noisy apps out  
 there right now.  It would be really nice if, via either the API (my  
 preference as it would be less work on your part and fits well with  
 my app), or the UI, you enabled users to block receiving Tweets  
 generated from specific apps.  This would then punish the app  
 developers for creating spammy apps and not the users themselves for  
 just using what was put out there, making it much less of a mess to  
 control.  Facebook does this, as does FriendFeed.  Any chance you  
 could enable this (please???) for Twitter?

 Thanks,

 @Jesse