That's a valid concern, and one we share in our retweet.com application.  We
dereference all shortened URLs before indexing tweets.

Anticipating this, rt.nu supplies the API call
/api/stats/[short]/original (for example, http://rt.nu/api/stats/8kw/original) to
grab the original URL for archiving or displaying to end users.
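
A minimal sketch of calling it, assuming the endpoint simply returns the
original URL in the response body (the exact response format may differ):

    import urllib.request

    def resolve_short_url(short_code):
        # Hit the stats endpoint described above for a given short code.
        # Assumption: the response body is the original URL as plain text.
        url = "http://rt.nu/api/stats/%s/original" % short_code
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8").strip()

    # e.g. the short code from the example link above
    print(resolve_short_url("8kw"))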

Dale:

All links are dereferenced by rt.nu so they can be qualified before
shortening.  Currently in beta, we've set the qualification rules a bit
tight: URLs that redirect using certain schemes will be rejected, and some
bad HTTP status headers will also cause rejection.  This will be cleaned up
a bit before full public deployment.  At present, all short URLs use rt.nu
as the root domain and are typically between 7 and 10 characters.
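
Conceptually, the qualification pass works something like the sketch below
(illustrative only, using Python's standard urllib; rt.nu's actual rules are
stricter than this):

    import urllib.error
    import urllib.request
    from urllib.parse import urlparse

    def qualify_url(url):
        # Reject anything that isn't plain http/https up front.
        if urlparse(url).scheme not in ("http", "https"):
            return None
        try:
            # urlopen follows HTTP redirects automatically and raises
            # HTTPError for bad (4xx/5xx) status headers.
            req = urllib.request.Request(url, method="HEAD")
            resp = urllib.request.urlopen(req)
        except (urllib.error.HTTPError, urllib.error.URLError):
            return None  # bad status or unreachable -> rejected
        # The fully dereferenced URL is what gets shortened.
        return resp.geturl()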

Screenshots are gathered via http://www.thumbshots.com/, which works like
this (a rough sketch follows the list):

1.) If the full URL exists in the cache, its image is returned and the URL
is queued for a new shot.

2.) If the full URL does not exist in the cache as a screenshot, the root
domain is looked up.  If the root domain is in the cache, that shot is
returned and the full URL is queued for a new shot.
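
Put another way, the lookup is roughly the sketch below (the cache and
queue_screenshot names are hypothetical stand-ins, not Thumbshots' actual
API):

    from urllib.parse import urlparse

    def get_thumbnail(full_url, cache, queue_screenshot):
        # cache is assumed to map URLs to image data; queue_screenshot
        # is assumed to schedule a fresh capture of a URL.

        # 1.) Exact match: return the cached shot and queue a refresh.
        if full_url in cache:
            queue_screenshot(full_url)
            return cache[full_url]

        # 2.) No exact match: queue the full URL and fall back to the
        # root domain's shot if one is cached.
        queue_screenshot(full_url)
        parts = urlparse(full_url)
        root = "%s://%s/" % (parts.scheme, parts.netloc)
        return cache.get(root)  # None if nothing is cached yet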



On Wed, Jul 15, 2009 at 12:34 PM, owkaye <owk...@gmail.com> wrote:

>
> > Just wanted to let you guys know about a free service
> > we're prototyping for shortening URL's that overcomes a
> > few of the limitations of other shorteners.
>
> Only one problem with all these URL shorteners: when the
> companies creating them disappear, all their shortened URLs
> become orphans and therefore useless.
>
> Not a major problem on Twitter because of the typical
> transience of data, but when you run a company like mine
> that needs to reference historic data it will definitely
> create future problems when these companies fail.
>
> Just something for folks to consider ...
>
> Owkaye
>


-- 
Kevin Mesiab
CEO, Mesiab Labs L.L.C.
http://twitter.com/kmesiab
http://mesiablabs.com
http://retweet.com
