> Date: Wed, 13 Jul 2011 00:19:16 -0700
> From: Alex Mandel <[email protected]>
> Subject: Re: [Qgis-user] Batch query web database
> Cc: qgis-user <[email protected]>
> Message-ID: <[email protected]>
> Content-Type: text/plain; charset=ISO-8859-1
>
> You could write a python script which hits the webpage with urllib (I
> think that's the name), takes the returned page, and extracts the data
> you want, saving it down as a CSV that can be joined to the table --
> or, if it's a QGIS plugin, put directly into the table.
>
> This is known as screen scraping, and it's a time-honored way of getting
> around the fact that a website doesn't give you what you need. If you're
> worried about them noticing, you can add a sleep timer so you only make
> x requests per second (assuming you want to do it all in batch).
>
> I have done this before with the Google/Yahoo geocoders, which just
> limit how many requests you can make per day.
>
> Enjoy,
> Alex
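A minimal sketch of what Alex describes, using only the standard library. The URL template, the CSV columns, and the regex that pulls the value out of the page are all made-up placeholders -- the real site's HTML would dictate the actual extraction pattern:

```python
# Hypothetical screen-scraping sketch: fetch one page per site ID,
# extract a value, write a CSV that can be joined to the QGIS table.
# SITE_URL and the <td class="value"> pattern are assumptions, not a
# real API of the site being discussed.
import csv
import re
import time
import urllib.request

SITE_URL = "http://example.com/site/{}"  # hypothetical endpoint

def extract_value(html):
    """Pull the wanted field out of the returned page.

    The regex is a stand-in; real scraping would match the actual
    markup of the target site (or use html.parser for robustness).
    """
    m = re.search(r'<td class="value">([^<]+)</td>', html)
    return m.group(1).strip() if m else ""

def scrape(site_ids, out_path="sites.csv", delay=1.0):
    """Fetch each site's page and save results as a joinable CSV.

    time.sleep() between requests is the "sleep timer" Alex mentions,
    so the batch doesn't hammer the server.
    """
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["site_id", "value"])  # join key + scraped field
        for sid in site_ids:
            with urllib.request.urlopen(SITE_URL.format(sid)) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            writer.writerow([sid, extract_value(html)])
            time.sleep(delay)  # throttle: at most one request per `delay` s
```

The resulting CSV can be joined to the layer's attribute table in QGIS via Layer Properties -> Joins.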
If you don't in fact need to get all the data in a table (i.e. you only need to be able to look up the information for a particular site when you need it), you could create an "action" to download and display the information. This way you would always be dealing with up-to-date information, so you wouldn't need to redo the screen scraping periodically to update the table.

Alister

_______________________________________________
Qgis-user mailing list
[email protected]
http://lists.osgeo.org/mailman/listinfo/qgis-user
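As a sketch of Alister's suggestion: in Layer Properties -> Actions you can define an "Open URL" action whose URL embeds an attribute value using QGIS's `[% ... %]` expression syntax, so clicking a feature fetches the live page for that site. The host, path, and `"site_id"` field name below are made up for illustration:

```
http://example.com/site/[% "site_id" %]
```

Because the page is fetched on each click, the information is always current and no periodic re-scrape is needed.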
