That's a valid concern, and one we share in our application.  We
dereference all shortened URLs before indexing tweets.

In anticipation, we supply the API call
/api/stats/[short]/original to
grab the original URL for archiving or for display to end users.
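As a rough sketch of how a consumer might call that endpoint: the path pattern comes from the email, but the host name and the response format (plain text) are assumptions, since neither is stated here.

```python
# Sketch: resolving a short code back to its original URL via the
# /api/stats/[short]/original endpoint mentioned above.  The host
# "example-shortener.com" is a placeholder, and the assumption that the
# endpoint returns the URL as plain text is ours, not the service's.
import urllib.request

def stats_original_endpoint(host: str, short: str) -> str:
    """Build the lookup URL for a given short code."""
    return f"https://{host}/api/stats/{short}/original"

def fetch_original(host: str, short: str, timeout: float = 5.0) -> str:
    """Dereference a short code to its original URL (assumed text body)."""
    with urllib.request.urlopen(stats_original_endpoint(host, short),
                                timeout=timeout) as resp:
        return resp.read().decode("utf-8").strip()
```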


All links are dereferenced and must qualify before shortening.
 While we're in beta, the qualification rules are set a bit tight: URLs
that redirect using certain schemes will be rejected, and some bad HTTP
status headers will also cause rejection.  This will be cleaned up a bit
before full public deployment.  At present, all URLs use as the root
domain and are typically between 7 and 10 characters.
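A minimal sketch of what such a qualification pass could look like. The exact rules the service applies aren't published; the scheme allow-list and the 2xx status check below are illustrative assumptions.

```python
# Illustrative "qualification" check: reject non-http(s) schemes up front,
# then dereference the URL (urlopen follows redirects) and require a 2xx
# final status.  These particular rules are assumptions, not the service's
# actual policy.
from urllib.parse import urlparse
import urllib.request
import urllib.error

ALLOWED_SCHEMES = {"http", "https"}

def qualify(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL dereferences cleanly to a 2xx response."""
    if urlparse(url).scheme not in ALLOWED_SCHEMES:
        return False
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # resp.url is the final location after any redirects.
            final_scheme = urlparse(resp.url).scheme
            return final_scheme in ALLOWED_SCHEMES and 200 <= resp.status < 300
    except (urllib.error.URLError, ValueError):
        return False
```

Scheme rejections never touch the network, so malformed or exotic URLs fail fast.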

Screenshots are gathered as follows:

1.) If the full URL exists in the cache, its image is returned and the URL
is queued for a new shot.

2.) If the full URL does not exist in the cache as a screenshot, the root
domain is looked up.  If the root domain is in the cache, that shot is
returned and the full URL is queued for a new shot.
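The two steps above can be sketched as a simple cache lookup with a refresh queue. Everything here (the in-memory dict, the class and method names) is illustrative; the email doesn't describe the actual implementation.

```python
# In-memory sketch of the screenshot cache logic above: serve the cached
# shot for the exact URL if present, otherwise fall back to the root
# domain's shot, and queue the full URL for a fresh capture in both cases.
from collections import deque
from urllib.parse import urlparse

class ScreenshotCache:
    def __init__(self):
        self.shots = {}        # full URL or root domain -> image bytes
        self.queue = deque()   # URLs waiting for a new shot

    def lookup(self, url: str):
        root = urlparse(url).netloc
        self.queue.append(url)            # always refresh in the background
        if url in self.shots:             # step 1: exact-match hit
            return self.shots[url]
        if root in self.shots:            # step 2: root-domain fallback
            return self.shots[root]
        return None                       # miss: nothing to serve yet
```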

On Wed, Jul 15, 2009 at 12:34 PM, owkaye <> wrote:

> > Just wanted to let you guys know about a free service
> > we're prototyping for shortening URL's that overcomes a
> > few of the limitations of other shorteners.
> Only one problem with all these URL shorteners: when the
> companies creating them disappear, all their shortened URLs
> become orphans and therefore useless.
> Not a major problem on Twitter because of the typical
> transience of data, but when you run a company like mine
> that needs to reference historic data it will definitely
> create future problems when these companies fail.
> Just something for folks to consider ...
> Owkaye

Kevin Mesiab
CEO, Mesiab Labs L.L.C.
