On 7 November 2010 18:06, [email protected] <[email protected]> wrote:
>
> So it looks like there's considerable interest in doing some group coding
> around climate data. Perhaps it's time to crystallise our thoughts and give
> some direction to the evening's event. It'll also give the eager amongst us
> a chance to brush up beforehand. Here's a first bash as a climate data
> uninitiate:
> What to code:
> We've talked about parsing and transforming freely available climate data
> into a format usable by ccc-gistemp. Can we have a pointer to the required
> data format?

I need to write more on this in a while, but basically, anything vaguely
row-like (one row per day, month, or year, depending) is better than being
stuck in a spreadsheet or a PDF. I've done some Canada data recently, and I
have a part in ScraperWiki that loads the data into their slightly curious
named-tuple store, and another part (that I run on my laptop) that sucks it
from ScraperWiki and formats it exactly as required for ccc-gistemp. I was
imagining we'd work more on the former. There's a rough sketch of both
halves at the end of this message.

> Data sources:
> We've had suggestions as to the source of climate data to parse and
> transform, including www.whatdotheyknow.com, www.oldweather.org and a paper
> from www.springerlink.com. Between David and Julian, I'm guessing we'll
> have more than enough data to chew on.

I was mostly thinking of whatdotheyknow.com and/or national met agencies.
The springerlink thing was a scanned PDF, so not much possibility for
scraping by a program.

> What to code in:
> Python of course! And ScraperWiki (http://scraperwiki.com/) specifically?

Python, yes (though I'm not going to stop anyone using anything else).

> Do we envisage using ScraperWiki exclusively? Do we have ScraperWiki
> experts out there who plan to attend?

ScraperWiki is fun in its own right. It provides some stuff that for me is a
definite win: modules already installed (I hate installing Python modules); a
sort of cron-like scheduler; and automagically updating views. When I've done
ScraperWiki development before, bits get done on ScraperWiki, and bits get
done on my laptop. More experimental stuff gets done on my laptop, as often
the ScraperWiki development process is "it mysteriously didn't work".

> Tools:
> We need people to bring their laptops! On the basis that we'll be pair
> programming, the magic ratio will be one laptop for every two attendees.
> I'm hoping either or both of David or Julian would be happy to give
> direction at the event itself. You might like to get the show on the road
> by giving a quick intro at the start on climate data (and possibly
> ScraperWiki) ... :)

I'm no ScraperWiki expert, but I can certainly introduce it.

drj
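
PS: to make the pipeline above a bit more concrete, here's a rough sketch of
the two halves. It's illustrative only: the source URL, CSV layout, column
names and file paths are made up, and the output format is my reading of the
GHCN v2.mean convention that ccc-gistemp reads (12-character station id,
4-digit year, then twelve 5-character monthly values in tenths of a degree C,
with -9999 for a missing month); check the widths and units against the
ccc-gistemp docs before trusting it.

The ScraperWiki half (scraperwiki.scrape() and scraperwiki.sqlite.save() are
the calls I remember; check the current ScraperWiki docs if they've moved):

    import csv
    import scraperwiki

    SOURCE_URL = "http://example.org/canada-monthly-means.csv"  # placeholder

    # One row per station-year: station_id, year, then twelve monthly means.
    text = scraperwiki.scrape(SOURCE_URL)
    reader = csv.reader(text.splitlines())
    next(reader)  # skip the header row

    for fields in reader:
        row = {"station_id": fields[0], "year": int(fields[1])}
        for month, value in zip(range(1, 13), fields[2:14]):
            # Keep missing months as None so they stay NULL in the datastore.
            row["m%02d" % month] = float(value) if value.strip() else None
        scraperwiki.sqlite.save(unique_keys=["station_id", "year"], data=row)

And the laptop half, run against a CSV dump of those rows:

    import csv

    def v2_line(station_id, year, monthly_means):
        """Format one station-year as a GHCN v2.mean-style record.

        monthly_means is a list of 12 values in degrees C, None for missing.
        """
        cells = []
        for mean in monthly_means:
            if mean is None:
                cells.append("%5d" % -9999)
            else:
                cells.append("%5d" % int(round(mean * 10)))
        return "%-12s%04d%s" % (station_id, year, "".join(cells))

    with open("canada_dump.csv") as inp, open("canada.v2.mean", "w") as out:
        for row in csv.DictReader(inp):
            means = [float(row["m%02d" % m]) if row["m%02d" % m] else None
                     for m in range(1, 13)]
            out.write(v2_line(row["station_id"], int(row["year"]), means) + "\n")

The first half lives on ScraperWiki and gets rerun by its scheduler; the
second is just a plain script I'd run locally whenever I want a fresh
v2.mean-style file to feed to ccc-gistemp.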
