Works for me :)

On Thu, Feb 7, 2013 at 8:57 AM, David Percy <[email protected]> wrote:

> The site seems to be down; did you crash it by running that script?
> :-)
>
> On Wed, Feb 6, 2013 at 3:17 PM, Ragi Burhum <[email protected]> wrote:
> > Thank you Eric W., Eric R., Colin, and Nicolas.
> >
> > All your links have been extremely useful, each in its own way.
> >
> > I ended up tweaking Eric R's script to fit my needs. It's great to know that
> > at least the 1/3 arcsecond files are available in a web-accessible folder.
> > Nevertheless, it would be nice if
> > http://tdds.cr.usgs.gov/ned/13arcsec/grid/grid_zips/ just listed all the
> > files instead of making us guess the file-naming convention.
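[Editorial aside: since the folder publishes no listing, one workaround is to enumerate every plausible tile name from the convention itself (n<latitude>w<zero-padded longitude>.zip, keyed to the tile's northwest corner). A minimal Python 3 sketch, assuming that convention and the CONUS grid range used in Eric R's script later in this thread:]

```python
def candidate_tile_names(lat_lo=25, lat_hi=48, lon_lo=67, lon_hi=99):
    """Enumerate plausible 1/3 arcsecond NED tile names for CONUS.

    Tiles are named by the latitude of their north edge and the
    three-digit, zero-padded longitude of their west edge.
    """
    names = []
    for lat in range(lat_hi, lat_lo - 1, -1):        # 48 down to 25
        for lon in range(lon_hi, lon_lo - 1, -1):    # 99 down to 67
            names.append("n%02dw%03d.zip" % (lat, lon))
    return names
```

Any candidate that does not exist on the server simply 404s at download time and can be skipped, so over-generating names is harmless.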
> >
> > For those interested, according to this announcement, the 1/9 arcsecond
> > datasets will be available from there soon.
> >
> > Thank you all again,
> >
> > - Ragi
> >
> >
> > On Wed, Feb 6, 2013 at 7:22 AM, Eric Wolf <[email protected]> wrote:
> >>
> >> USGS EROS has a service where you can ship hard drives to them to get bulk
> >> data, including NED:
> >>
> >> http://cumulus.cr.usgs.gov/bulk.php
> >>
> >> As a friend once said, "Never underestimate the bandwidth of a station
> >> wagon filled with backup tapes!"
> >>
> >> -Eric
> >>
> >> -=--=---=----=----=---=--=-=--=---=----=---=--=-=-
> >> Eric B. Wolf                           720-334-7734
> >>
> >> On Wed, Feb 6, 2013 at 7:31 AM, Eric Russell <[email protected]>
> >> wrote:
> >>>
> >>> Hi Ragi,
> >>>
> >>> Here's my Python code for downloading and unzipping the 1/3 arcsecond NED
> >>> from tdds.cr.usgs.gov:
> >>>
> >>> import urllib
> >>>
> >>> def download_zips ():
> >>>     for x in xrange(99, 66, -1):
> >>>         for y in xrange(48, 24, -1):
> >>>             try:
> >>>                 url = "http://tdds.cr.usgs.gov/ned/13arcsec/grid/grid_zips/n%sw0%s.zip" % (y, x)
> >>>                 dest = "D:\\ned\\zip\\n%sw%s.zip" % (y, x)
> >>>                 opener = urllib.URLopener()
> >>>                 opener.retrieve(url, dest)
> >>>                 print "downloaded /n%sw%s.zip" % (y, x)
> >>>             except IOError:
> >>>                 print "skipped /n%sw%s.zip" % (y, x)
> >>>
> >>> import glob, os, zipfile
> >>>
> >>> def unzip_files ():
> >>>     for zip_file in glob.iglob("D:\\ned\\zip\\*.zip"):
> >>>         print zip_file
> >>>         with zipfile.ZipFile(zip_file, 'r') as zip:
> >>>             for f in zip.infolist():
> >>>                 name_index = f.filename.find('/grd')
> >>>                 if name_index < 0:
> >>>                     continue
> >>>                 dest = os.path.join('E:', f.filename[name_index:])
> >>>                 if dest.endswith('/'):
> >>>                     os.makedirs(dest)
> >>>                 else:
> >>>                     with open(dest, 'wb') as fout:
> >>>                         fout.write(zip.read(f))
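[Editorial aside: for readers adapting the download loop above to Python 3, note that `urllib.URLopener`, `xrange`, and the `print` statement are all gone. A rough Python 3 equivalent, assuming the same URL pattern (zero-padded longitudes) and normalizing the local filenames to the padded form as well; the download loop itself is untested against the live server:]

```python
import urllib.request

BASE = "http://tdds.cr.usgs.gov/ned/13arcsec/grid/grid_zips/"

def tile_url(lat, lon):
    # Zero-pad the west longitude to three digits, matching the
    # "w0%s" formatting in the original script for longitudes 67-99.
    return "%sn%02dw%03d.zip" % (BASE, lat, lon)

def download_zips(dest_dir="."):
    for lon in range(99, 66, -1):
        for lat in range(48, 24, -1):
            name = "n%02dw%03d.zip" % (lat, lon)
            try:
                # urllib.request.urlretrieve replaces URLopener().retrieve
                urllib.request.urlretrieve(tile_url(lat, lon),
                                           "%s/%s" % (dest_dir, name))
                print("downloaded", name)
            except OSError:  # urllib.error.URLError subclasses OSError
                print("skipped", name)
```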
> >>>
> >>> Eric
> >>>
> >>>
> >>> On 2/5/13 11:33 PM, Ragi Burhum wrote:
> >>>
> >>> Hello all,
> >>>
> >>> I am trying to bulk download the entire 1/9 arc and 1/3 arc NED datasets
> >>> (I have the rest). I have a server with a gigabit connection and several
> >>> terabytes of space, so space/network speed is not an issue. Any pointers on
> >>> how to get hold of that dataset? I could write a scraping script, but I
> >>> would rather not go there.
> >>>
> >>> Thanks!
> >>>
> >>> - Ragi
> >>>
> >>>
> >>> _______________________________________________
> >>> Geowanking mailing list
> >>> [email protected]
> >>> http://geowanking.org/mailman/listinfo/geowanking_geowanking.org
> >>>
> >>>
> >>>
> >>
> >
> >
>
>
>
> --
> David Percy ("Percy")
> -Geospatial Data Manager
> -Web Map Wrangler
> -GIS Instructor
> Portland State University
> -gisgeek.pdx.edu
> -geology.pdx.edu
> -portlandpulse.org
>
