On 29 October 2010 10:53, Julian Todd <[email protected]> wrote:
> I am of the view that there has to be a platform continuum between
> writing code to do a proper scraper and parse, and simply rekeying the
> data.  What this means is if you happen to have a small data-set that
> you think would take three hours to write a parser for and which is
> updated once a month, it may be more reasonable to cut-and-paste the
> data (or type it in), and then code up some simple monitoring
> mechanism that sends out an email when the next page is uploaded.

Agreed.  Somewhere on that continuum would be software (like the
oldweather.org stuff) that assists with hand rekeying: allowing very
easy zooming in, jumping to the next entry in the table, and so on.
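The monitoring mechanism Julian describes could be sketched in a few
lines of Python.  The URL, state file, and email addresses below are
all placeholders, and the SMTP setup assumes a local mail server:

```python
# A minimal sketch of "email me when the page is next uploaded":
# hash the page body, compare against the hash from the last run,
# and send a one-line alert if it differs.  All names are placeholders.
import hashlib
import smtplib
import urllib.request
from email.message import EmailMessage

URL = "http://example.com/data.html"   # hypothetical page to watch
STATE_FILE = "last_hash.txt"           # where the previous hash is kept


def page_hash(content: bytes) -> str:
    """Hash the page body so changes can be detected cheaply."""
    return hashlib.sha256(content).hexdigest()


def check_for_update() -> bool:
    """Return True (and send an alert) if the page changed since last run."""
    content = urllib.request.urlopen(URL).read()
    new_hash = page_hash(content)
    try:
        old_hash = open(STATE_FILE).read().strip()
    except FileNotFoundError:
        old_hash = ""                  # first run: no previous hash
    if new_hash == old_hash:
        return False
    with open(STATE_FILE, "w") as f:
        f.write(new_hash)
    notify(new_hash)
    return True


def notify(new_hash: str) -> None:
    """Send a one-line alert; SMTP host and addresses are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = "Watched page has changed"
    msg["From"] = "[email protected]"
    msg["To"] = "[email protected]"
    msg.set_content(f"New content hash: {new_hash}")
    with smtplib.SMTP("localhost") as s:
        s.send_message(msg)
```

Run from cron once a day and it covers the "updated once a month"
case without writing a full scraper.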

This paper http://www.springerlink.com/content/p58741881444nu11/
contains a long record (since 1819) for Minnesota, but I can't find it
in digital form anywhere.

drj

-- 
To post: [email protected]
To unsubscribe: [email protected]
Feeds: http://groups.google.com/group/python-north-west/feeds
More options: http://groups.google.com/group/python-north-west