You can also generate your URLs:
repeat n 50 [
    ;?? n
    url: join http://online.new....&page= [n "&scale=40"]
    ?? url
    request-download/to url join %file- [n %.png]  ; <- requires View
]
The download can also fail (e.g. due to a 60-second timeout), so if the
images are not too big, you can catch the error and try again:
if error? set/any 'err try [request-download ...] [
    ; an error occurred!
    ; need to try again
]
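Putting that together, a minimal retry loop might look like the sketch
below. Note this is just an illustration: `max-tries`, `attempt-no`,
`url`, and `file` are names I made up for the example, and
`request-download` still requires REBOL/View.

```rebol
; Sketch: retry a download up to max-tries times.
; max-tries, attempt-no, url, and file are illustrative names,
; not from the original post.
max-tries: 3
attempt-no: 0
until [
    attempt-no: attempt-no + 1
    any [
        not error? set/any 'err try [
            request-download/to url file  ; may time out and throw an error
        ]
        attempt-no >= max-tries  ; give up after max-tries attempts
    ]
]
```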
Anton.
> Thanks Sunanda.
>
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of
> [EMAIL PROTECTED]
> Sent: Friday, January 09, 2004 12:21 PM
> To: [EMAIL PROTECTED]
> Subject: [REBOL] Re: Beginner Help
>
>
> Hi Chris,
>
> > I would appreciate your assistance in creating a script. I would like to
> > download a list of binary files from a site and save the list
> to my hard
> > drive
>
> Play around with something like this (and watch the line wraps on the url
> when you cut'n'paste):
>
>
> file-list: [
>
> http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=3&scale=40
>
> ; ...other URLs here
> ]
>
> for nn 1 length? file-list 1 [
>     print ["reading file " nn]
>     temp: read/binary file-list/:nn
>     write/binary join %file- [nn ".png"] temp
> ]
> print "Done!!"
>
>
> And watch out for any copyright violations on those images too.
> Sunanda